The Real Life Trolley Problem

I was lucky enough to stumble upon this gem today:

You see, I’ve always loved the Trolley Dilemma (do you pull the lever and make the train kill one person, if it’s already careening towards five?). Not ‘loved’ loved, but I enjoy thinking about right and wrong and how it all fits together.

The modern driver-less car creates a real-life trolley dilemma. The cars must be programmed to make decisions precisely like the one-versus-five case, and much worse. It’s delightfully ghoulish, fascinating, and insightful all at once. Apparently I’m heavily biased towards humans over dogs (that I knew), towards abiding by the law (I’m actually not a huge law-abiding kinda guy; I’m just lucky that NZ law and my ethics match up), and towards bigger people over slim (did not see that coming, considering I myself am a scarecrow).

But what really gets me about driver-less ethics is that it touches on so many biases. Automated cars have been shown to be far safer than human drivers (statistically at least), yet any crash involving a machine driving has wounded the perception of them far more than a given crash leads people to believe humans are unsafe drivers. I confess I am just as guilty of this myself: the thought of a driver-less car crashing into me feels almost morally reprehensible, even though I know it’s a combination of the naturalistic and responsibility fallacies (i.e. believing that what is natural is right, cos cars, you know, so natural, and preferring wrong-doing that has someone you can target with blame).

The fallacy I really enjoy rolling through my head, though I don’t know if there is a specific name for it, is the kind of ‘hands-off’ ideal (not hands off the wheel) where we prefer not to make judgement calls about this sort of thing, like who driver-less cars should crash into, even though such a call is and must be made. It’s kinda like people who trust their government to conduct military and clandestine operations.


Try out the Moral Machine test yourself, and don’t hesitate to leave your thoughts; I would love to hear more moral conundrums.



2 thoughts on “The Real Life Trolley Problem”

  1. Pingback: Twin Medics – Thought Attempts

  2. I didn’t bother finding out what they thought about my choices (too much like guesswork given the small number of scenarios), but I noticed a couple of interesting things about myself.

    1) I’m more likely to sacrifice those who are taking a risk (e.g. crossing against the lights)

    2) I’m uncomfortable with any sort of judgement based on type of people. I can accept basing it on numbers (e.g. sacrifice 3 rather than 5), but not on age/gender/health/profession/etc. Partly because, how can the automated car tell whether someone is a criminal? An athlete? A parent? A child?

