Extended trolley problem
I just ran across this article, Should a self-driving car kill its passengers in a "greater good" scenario? Though the article does not say so, it is the trolley problem, but with a twist: You are the driver of the trolley, and you have to ask whether you ought to be sacrificed for the greater good. That is, there are now three possibilities, not two: do nothing and kill five people; swerve and kill one; or (the added possibility) swerve and kill yourself. Any thoughts?
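The naive "greater good" calculus the article alludes to can be sketched as a toy utility comparison. This is purely illustrative: the option names and casualty counts are assumptions for the sketch, not anything specified in the article, and a real vehicle would face uncertain estimates rather than known outcomes.

```python
# Toy sketch of the extended trolley problem: three options,
# each with an assumed expected death toll. A naive
# "minimize casualties" policy simply picks the smallest.

def choose(options):
    """Return the option with the fewest expected deaths.

    Ties are broken by insertion order, i.e. arbitrarily from
    an ethical standpoint.
    """
    return min(options, key=options.get)

options = {
    "do_nothing": 5,        # stay the course: five pedestrians die
    "swerve_into_one": 1,   # swerve: one bystander dies
    "swerve_kill_self": 1,  # swerve: the driver (you) dies
}

print(choose(options))  # prints "swerve_into_one" (first of the tied pair)
```

Note that a bare casualty count cannot distinguish the second and third options at all; that tie is exactly where the added twist, and the ethics, begin.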
33 Comments
eric · 28 October 2015
The strictures on the trolley problem are so completely unrealistic as to render it a meaningless question (IMO) in terms of real ethical issues. We never have certainty about what will happen. And we don't judge people on pure outcome, but also on what they could realistically have known or estimated about the outcome, as well as the time they had to make a decision. Limited-time decisions tend to be assigned less moral culpability, and the trolley problem gives you essentially no time to decide; if there were lots of time, the people would move out of the way, rendering the problem void. IMO, if it really happened, we would not assign *any* moral culpability to the driver, *regardless* of their decision.
However in the spirit of stupid 'truth or dare' type questions, my answer is: before I had kids, #3. Now that I have them, #2.
John Harshman · 28 October 2015
What is my degree of relatedness to the people who might be killed? Clearly I must act to maximize my inclusive fitness. Actually, we must consider both relatedness and expected future reproductive output (counting any remaining parental care).
There, back on topic for Panda's Thumb.
Marilyn · 29 October 2015
Everyone should be aware they are built to save the occupants.
DS · 29 October 2015
Everyone should be aware that they are evolved, not built.
Kevin B · 29 October 2015
Just Bob · 29 October 2015
What... isn't it egg cartons?
John Harshman · 29 October 2015
Matt Young · 29 October 2015
Michael Fugate · 29 October 2015
I always opt to swerve, and everyone survives.
Otherwise, doing nothing is easier to justify.
eric · 29 October 2015
TomS · 29 October 2015
One thing to take into account is the capability of the other actors in the case.
I mean that I can expect that some others can take care of themselves, so I can focus my attention on those who cannot. Children rather than adults; pedestrians and bicyclists rather than motorized vehicles; those who are unaware of the danger and those whose options are limited merit more care on my part.
I think that the rules of the sea and the air take account of something like that: craft that are less maneuverable have a higher priority.
Also, I think that doing nothing, or at least, making no unpredictable changes, allows others to react to what you are doing.
Marilyn · 29 October 2015
It, that is the AV, should not be travelling around a corner so fast that it could not stop. They should be built robust, with proper sensors and warning signals, and if someone is going to be at the wheel, they should be able to override the automatic system. The same situation could occur in an ordinary car; then it would be the driver who was responsible for the decision, not the computer program that was programmed by a person. Cars now have sensors that enable them to park themselves safely. They shouldn't be allowed to swerve; they should be made to stop, or fly, or something.
John Harshman · 29 October 2015
Matt Young · 29 October 2015
Pierce R. Butler · 29 October 2015
The first time I heard the trolley car problem, it had already been (further) elaborated to specify that "you" were physically too small to make a difference by throwing yourself on the rails, but were somehow strong enough to knock down a conveniently located fat man with enough bulk to save the day.
In terms of idealistic armchair morality, self-sacrificing heroes always "win".
Marilyn · 31 October 2015
DS · 31 October 2015
harold · 1 November 2015
Kevin B · 2 November 2015
blueindy1 · 2 November 2015
As long as we're engaging in a kind of "Kobayashi Maru" no-win scenario, simply program the vehicle, upon entering such a situation, to eject the driver and any passengers upwards, away from the vehicle; upon ejection, the car swiftly self-destructs, directing as much of the blast as possible downward and away from anybody in danger.
There might still be some injuries and even some death, but nothing like the wholesale destruction of a full vehicle impact on a crowd of people.
Matt Young · 2 November 2015
harold · 2 November 2015
Matt Young · 2 November 2015
Just Bob · 2 November 2015
Legal worms in the can: Will the "driver" of a self-driving car be liable for any damages, just as though he were driving 'hands-on'? IOW, must he be as alert to road conditions, hazards, and the actions of other drivers and pedestrians as he would be if he were driving himself? Or will he be allowed some 'slack', some allowance for inattention while the car is driving itself? If he is expected, legally required, to be as alert and instantly ready to take over if danger looms, then it would seem he would be taking on a greater risk by going 'hands-off', thus allowing his attention to wander from the immediate mechanics of driving.
Alternatively, if statistics show automatic control is safer, will a driver who maintains control himself, and gets into an accident, be held liable for NOT going automatic?
harold · 2 November 2015
harold · 2 November 2015
Marilyn · 3 November 2015
Perhaps there should be more than one option to choose from at the start of the journey. If you are by yourself, choose a switch of your preference; if you have more people in the car, choose an alternative switch. Hopefully something of more importance than just a mood-of-the-day switch. But legal and insurance people should play their part. There is also the possibility that the smallest number of people swerved into would not be fatally harmed, whereas the people in the car would be if evasive manoeuvres weren't made. These cars should be made safe enough for a person even to sleep in as they are travelling, for them to be futuristic and sci-fi.
eric · 3 November 2015
DS · 3 November 2015
I think you should read the Bible, trying to find the answer, until it is too late to do anything. What? That seems to be the response to climate change.
Marilyn · 5 November 2015
They seem to be making more headway with the environment issue if they make these cars solar powered or use hydrogen cells. If they are safe, apart from this last problem that has to be sorted out accurately, they have made good progress with that too. If pedestrians were to carry a bleeper signal for the car to pick up, so it would know if there are pedestrians around a corner, that would help.
DS · 5 November 2015
Yes, if only we didn't have to make any moral choices. Life would be so much simpler. We certainly wouldn't need religion.
Kevin Kirkpatrick · 5 November 2015
My Freshman year in college. My roommate leaves for class. Fifteen minutes go by. The prior night, my roommate had received a "care package" from his parents, including a large bag of M&Ms, which he has put in a bowl on his desk.
My line of reasoning (okay, not really a line so much as a cumulative series of justifications):
1) His class is 50 minutes long, and on the other side of campus.
2) He probably has thousands of M&Ms there - he won't miss a few.
3) I'm hungry, but studying for a quiz that's in < 1 hour, and don't have time to get anything from the rec room.
4) I'll get a crappy grade if I'm trying to study on a sugar low.
5) The benefit I'd receive from a handful of M&Ms far outweighs any harm he'd receive.
6) I gave him a pen two days ago; a handful of M&Ms is about the same value... in a way, this balances out.
Notably missing from the list:
* Any indication that he's cool with me helping myself to his shit.
After a couple minutes of hemming and hawing, and checking/double-checking my math on the odds, "knowing" I'd get away with it, I made my choice.
So I stand up, walk over, and grab a *small* handful of M&Ms. Literally in the 1-2 second "crucial interval" of "clandestine scoopage", the door flies open, and my roommate rushes in to grab a forgotten homework assignment due that day. Only to see me, with a deer-in-the-headlights look on my face, stealing his M&Ms.
Suffice it to say, he was in too much of a hurry to talk at the time. My hurried "explanation" was, at best, incoherent. The subject never came up again, directly, but we had a much colder and distrustful relationship from that point forward. His friends became likewise cold and distant. Unbeknownst to me, he requested a transfer at the semester, and my second semester was spent sans roommate, with a dorm-wide reputation: untrustworthy; will go through and steal your shit.
Time heals wounds and, thankfully, reputations. I can't say I felt any impact of this event beyond my Freshman year. But I took away a deep moral lesson: there are no certainties. There is no knowing. My moral code, you might say, grew 10 times that year. To this day, the moral choices I make are *never* allowed to factor in, "... assuming I won't get caught."
Of course, my moral code is foundationally empathy-based: most of the M&M rationalizations were empathy-based (no harm, no foul); hence my roommate caught me with my hand in the candy jar, not sifting through his wallet. But the matured moral code I now live by [i.e., the moral code which led me to cut short the "free" version of Jurassic Park my 10-year-old found online last night, with a stern lecture, removal of $3 from his allowance, and a $2.99 Amazon.com rental] has the fundamental principle "You can't know everything; you can't know you'll 'get away' with something" baked into it. If you ask me, "If you knew you'd get away with it... would you do X?", what I really feel I'm answering is, "If you had a moral code that was fundamentally different from the one you have now, would you do X?"
Not much different than, "Imagine you were a Nazi soldier in 1941 and honestly believed Jews were inhuman monsters intent on wiping you and everyone you love off the planet - would you lead a group of them into the gas chamber?" WTF would any "Yes" or "No" answer to that tell you about me? Beyond, "how lucidly can you assess hypothetical conditions?"
The "trolley problem" and the countless philosophical/moral "puzzles" of that flavor share the same pitfall: generally speaking, the underlying assumptions undercut the very assumptions that are intrinsic to the aspect of my personality that's being probed...
Marilyn · 7 November 2015
Though with hydrogen cells there may be more fog, leading to more clouds and more rain; we could end up needing an ark...