That's actually a really good dilemma if you think about it. If everyone doubles it, nobody gets killed. But you always risk that there's some psycho down the line who likes killing, and then far more people die. And if these choices continue endlessly, you will eventually run into someone like that. So killing immediately should be the right thing to do.
Now, what if you're not the first person on the chain? What if you're the second one, or the nth one? What now? Would you kill two, or 2^(n-1), knowing that the person before you spared them?
Eventually there might also be a track with no people on it, so postponing the dilemma becomes much better than guaranteeing at least one death. But there is no way of knowing what the future dilemmas might be.
That leads to another interesting split path. Maybe it's best to just kill the one right away. Assuming this goes on forever, it's basically inevitable that someone, somehow, will end up killing an obscene number of people eventually. But maybe it'd be like nukes, and it would eventually reach a point where flipping the lever is just mutually assured destruction, and no one would ever actually do that.
Assuming, of course, that it goes on forever. Which admittedly seems like what one is intended to think, but the graphic doesn't actually show or state that. And realistically, if actually given this scenario, it shouldn't, because eventually some limit will be encountered that makes it impossible for the problem to physically exist (running out of people to tie to the tracks, running out of space for them, having so much mass in one place that it undergoes gravitational collapse, the finite size of the observable universe making an infinite dilemma impossible to fit, etc.).
People always miss the bigger picture with these things. Why do these trolleys' brakes keep failing? Is it a design flaw in the braking system? Is the maintenance crew severely underfunded? Is it a slippage problem due to improper rail maintenance? It's a shame we can't even organize a work stoppage to sort this out, since Congress blocked the trolley union from striking...
This is not a purely theoretical question. In practice, autonomous vehicles face exactly this dilemma. Or rather, the manufacturers of the vehicles do, since they have to set the specifications.
I forget where it was from, but years ago I found an online survey from a university on autonomous cars and their decision making. It was all about deciding to swerve or not in a collision. All kinds of difficult encounters, like: do you hit the barrier and kill the passenger, or swerve and kill the old lady? Do you hit the thin person, or swerve and hit the heavier person?
I've never seen a survey drill down into biases quite so deeply.
From what I've seen of real-world examples (not "what if the car had 5 cats in it and the person on the crosswalk had a stroller full of 6 cats, do you swerve into a barricade?"), Tesla cars just release autonomous control to the person behind the wheel a few seconds before impact, so the driver is fully liable.
But what happens when you get to, say, the 34th person, and there are 2^33 people tied up, more than there are living humans in the world? Pass the buck, break the simulation, save the world.
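The arithmetic behind that claim can be checked in a few lines, assuming the nth person faces 2^(n-1) people on the track and taking the world population as roughly 8.1 billion (an assumption, not a figure from the thread):

```python
# Quick check of the doubling arithmetic: if the nth person in the chain
# faces 2^(n-1) people tied to the track, when does that count first
# exceed the world population?
WORLD_POPULATION = 8_100_000_000  # assumed, roughly the 2024 figure

n = 1
while 2 ** (n - 1) <= WORLD_POPULATION:
    n += 1

# Person 34 faces 2^33 = 8,589,934,592 people, more than are alive.
print(n, 2 ** (n - 1))
```

So the 34th choice is indeed the first one where the track holds more people than exist.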
I think with this scenario it's indirectly caused by you. Either you 'press a button,' directly resulting in the death of a specific individual, or another person is given the same scenario but the button directly causes double the number of deaths if they press it.
Switch the track from the bottom to the top as the train is halfway over the switch, causing the train to drift across both rails, hitting all three tied-up people and the second switch operator.
Depends on whether you're happy with someone else killing a lot more people, or whether you want to kill someone yourself.
Assuming this goes to infinity, the reasonable thing to do is to kill one person to prevent someone else killing a lot of people. But that would make you directly responsible for killing that person.
You would need a crazy low probability of a lunatic or a mass murderer being down the line to justify not killing one person.
Edit: Sum(2^n (1-p)^(n-1) p) ≈ Sum(2^n p) for small p. Summing up to n = 32 (where the doubling exhausts the world's population), you'd need p = 1/(2×2^32 − 2) ≈ 1/(8 billion) chance of each person down the line being a psycho for the expected values to be equal. I.e. at most a single person on Earth, tops, would decide to kill everyone.
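That expected-value comparison can be sketched numerically. This assumes the model implied above: each subsequent person independently pulls the kill lever with probability p, the nth person down the line kills 2^n people, and the chain is capped at 32 doublings:

```python
# Sketch of the expected-value calculation above, assuming each of the
# next N people independently chooses to kill with probability p.
def expected_deaths_if_passed(p, N=32):
    """Expected deaths if you pass the buck: the nth person down the line
    kills 2^n people with probability (1-p)^(n-1) * p."""
    return sum(2**n * (1 - p) ** (n - 1) * p for n in range(1, N + 1))

# Small-p approximation: p * Sum(2^n) = p * (2^(N+1) - 2),
# so the break-even point with killing one person now is:
p_threshold = 1 / (2**33 - 2)  # ~1 in 8.6 billion

# At this p, passing costs about one expected death, same as killing now.
print(expected_deaths_if_passed(p_threshold))
```

Anything above that threshold and the expected body count of passing exceeds one, which is the commenter's point: the probability of a psycho would have to be implausibly tiny to justify not pulling the lever.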
You don't even need a lunatic or mass murderer. As you say, the logical choice is to kill one person. For the next person, the logical choice is to kill two people, and so on.
It does create the funny paradox where, up to a certain point, a rational utilitarian would choose to kill and a rational mass murderer trying to maximise deaths would choose to double it.
Attempting to subvert the thought experiment only makes things worse. The trolley is full of child prodigies, all future geniuses that will cure cancer and solve the world's problems. By sticking the lever halfway you kill all of them. The only way to save the child prodigies is to choose, left or right.
You couldn't even bother putting in adult scientists who have already helped the world. It's a hypothetical scenario, you know; you can put in anyone you want. So I'm putting the child prodigies to a test by having them save themselves from the half-lever. Should be relatively easy for them.
I think everyone here is missing the real answer. If you look at the picture, you will notice a third option: there are track switches, two of them. You can bypass the people tied to the track, then kill the monster who is forcing you to kill for no reason.