IEET Readers Divided on Robot Cars That Sacrifice Drivers’ Lives
Jun 14, 2014  

Intrigued by IEET Fellow Patrick Lin’s essay “The Ethics of Autonomous Cars,” we asked: “Should your robot car sacrifice your life if it will save more lives?” Of the 196 of you who responded, a third said no, a third said yes, and a third said it should be the driver’s option.

Of course, we can hope there will be far fewer automotive accidents with robot cars in the first place. But it will be fascinating to see how this software and regulation gets written. Will libertarian Americans opt for choice, or save-the-driver as a default, while more social-democratic countries mandate save-the-most-lives software?

Our new poll is on whether the Turing Test is useful or bollocks.


I chose the first one simply because cars are not designed to protect other people, for whatever reason, but rather to protect those within the car. If a car is programmed to protect others at the expense of its passengers, we’re left with ethical dilemmas: which lives are more important? How many people in the sacrificed minority is too many? And what if the car was wrong, and no lives actually needed to be saved at the expense of its passengers?

With each passenger being protected by their own car, we at least avoid those kinds of dilemmas, or alleviate them to some degree. When every car is programmed to protect a select group of people, the roads will be much safer, because the cars will collectively avoid dangers and risks to themselves at all times, rather than trying to “grab someone else’s steering wheel,” so to speak.

So looking back at these options, the first one is the only one that makes sense. Everything balances itself out when each car is defending itself and its passenger(s) as meticulously as possible. The second option is a stupid option, IMO, because that’s EXACTLY what people are doing right now with the poll; it shouldn’t even be an option. And the third and final option would cause more accidents than it prevents, because the car is no longer ensuring the safety of any particular person, but of others at random, at the expense of those inside. A car doesn’t need to be a hero. It just needs to do its damn job.

There are, of course, some people who believe that cars should decide to save someone important over someone who’s wasting their life away. Assuming we’re even capable of programming a computer to achieve that kind of reasoning, what then?

Using that very “logic,” the car would decide that a progressive politician is more important to save than a car full of junkies. The problem is that this is an observation of what is, not of what could be! What if that politician becomes corrupt and those junkies get clean?

Is the car’s decision to spare the politician then still a positive decision? I wouldn’t think so. I’d leave it to where each car merely protects its own passengers. That way everything will balance itself out in the end.

The problem with what you’re saying, B.J., is that it entirely disregards pedestrians and other road users. It’s all very well for every car to protect its own occupants, but what about people not in a car?

I would think it fair for a self-driving car to prioritise its own occupants over other cars, but it should prioritise road users not protected by crumple zones, airbags, etc., above anything else.

That isn’t what’s in question, however. Of course these cars are going to look out for pedestrians walking near or on roads. But pedestrians on a road wouldn’t put the passengers at risk of death - just the pedestrians themselves.

Whereas, when a robot car detects objects in front of it, its programmed goal is to protect the passenger, and thus to either stop or dodge the objects. If it stops, then no one is harmed. If it tries dodging, however, then other nearby robot cars become involved. How then do your car and the other cars react? Should they try to save their own passengers, or other drivers passing by? The former would result in a collective balancing act, whereas the latter would result in unpredictable chaos.
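The stop-or-dodge reasoning in this thread can be made concrete. Below is a toy sketch - not a real autonomous-vehicle stack - of the priority ordering proposed above (unprotected road users first, own occupants next, other vehicles last) combined with the rule that braking is preferred to swerving. All names, classes, and the ranking itself are hypothetical illustrations, not anyone’s actual design.

```python
from dataclasses import dataclass

# Hypothetical ranking, per the comment above: lower number = higher
# protection priority. Unprotected road users outrank occupants, who
# outrank other (crumple-zoned, airbagged) vehicles.
PRIORITY = {"pedestrian": 0, "cyclist": 0, "occupant": 1, "other_vehicle": 2}

@dataclass
class Obstacle:
    kind: str          # e.g. "pedestrian", "other_vehicle"
    stoppable: bool    # can the car brake to a halt before reaching it?

def choose_action(obstacle: Obstacle, swerve_path_risks: list) -> str:
    """Return 'stop', 'swerve', or 'brake_hard' for a single obstacle."""
    if obstacle.stoppable:
        return "stop"  # if braking suffices, no one is harmed
    # Swerve only if everything on the swerve path is of strictly lower
    # priority than whatever we would otherwise hit.
    obstacle_rank = PRIORITY.get(obstacle.kind, 2)
    if all(PRIORITY.get(k, 2) > obstacle_rank for k in swerve_path_risks):
        return "swerve"
    # Otherwise mitigate the impact rather than create a worse hazard.
    return "brake_hard"
```

For example, a car that cannot stop before a pedestrian, with only another vehicle on the swerve path, would swerve; a car that cannot stop before another vehicle, with a pedestrian on the swerve path, would brake hard instead. The point of the sketch is only that “protect your own passengers” and “never trade up to a higher-priority victim” can coexist in one simple rule.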

