
Which would you choose?
Most people answered that question in two different ways, depending on how it was asked. When asked in the abstract (should a driverless car be programmed to minimize deaths?), they said yes. But the results changed when they were asked whether they would want to own a car that did so at the expense of their own safety.
“Technology has a proven track record of saving lives. We may be on the cusp of a safety innovation revolution,” said Mark Rosekind, administrator of the National Highway Traffic Safety Administration (NHTSA). “Our first — and really our only — concern is safety.” Yet the moral dilemma shows how difficult some of the decisions ahead will be.
While safety is certainly a top concern for every stakeholder, the question lingers: What happens when personal safety conflicts with the greater good? That takes us back to the study.

According to the study’s authors, 76% of respondents believed autonomous (driverless) vehicles should be programmed to be “utilitarian.” In other words, they should save the most lives (in the example above, the pedestrians) while sacrificing as few as possible (in this case, the driver).
The group believed it to be more “moral” to sacrifice one passenger rather than 10 pedestrians.
“We were surprised that so many people expressed a strong moral preference for cars that would kill them, as passengers, for the greater good,” Jean-Francois Bonnefon, a psychological scientist at the Toulouse School of Economics in France and co-author of the study, told CNN.
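To make the “utilitarian” rule concrete, here is a minimal sketch of what such a decision policy could look like in code. It is purely illustrative: the Maneuver class, the casualty estimates, and the utilitarian_choice function are hypothetical names invented for this example, not anything from the study or from real vehicle software.

```python
# Minimal, purely illustrative sketch of a "utilitarian" decision rule.
# All names and numbers here are hypothetical, not from the study.
from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_pedestrian_deaths: int
    expected_passenger_deaths: int

    @property
    def total_expected_deaths(self) -> int:
        return self.expected_pedestrian_deaths + self.expected_passenger_deaths


def utilitarian_choice(options: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the fewest total expected deaths,
    with no regard for whether the victims are passengers or pedestrians."""
    return min(options, key=lambda m: m.total_expected_deaths)


# The scenario from the study: stay the course and hit 10 pedestrians,
# or swerve and sacrifice the lone passenger.
options = [
    Maneuver("stay_course", expected_pedestrian_deaths=10, expected_passenger_deaths=0),
    Maneuver("swerve", expected_pedestrian_deaths=0, expected_passenger_deaths=1),
]
print(utilitarian_choice(options).name)  # -> "swerve"
```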
Unless you happen to be the passenger
If you’re the passenger, though, the results change. In fact, 81% of the people in the study said they wouldn’t want a vehicle programmed to be utilitarian. They would rather own a car that protected them and their family members at all costs, even if that meant killing the 10 pedestrians.
“Most people want to live in a world where cars will minimize casualties…But everybody wants their own car to protect them at all costs.” — Iyad Rahwan, associate professor in the MIT Media Lab
And there lies the dilemma, one of the moral implications of programming autonomous vehicles.
Should the passengers be able to make that choice? Or should the decision be programmed into the vehicle at the factory, with no ability to change it?
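In software terms, the question boils down to whether a setting like the one sketched below should exist at all, and who controls it. Everything here (the EthicsMode enum, the factory_locked flag) is a hypothetical illustration, not any manufacturer’s actual design.

```python
# Hypothetical sketch of the policy choice: is the ethics mode a
# passenger-adjustable setting, or a constant locked in at the factory?
from enum import Enum


class EthicsMode(Enum):
    UTILITARIAN = "minimize total deaths"        # may sacrifice the passenger
    SELF_PROTECTIVE = "protect occupants first"  # may sacrifice the pedestrians


# Option A: factory-locked. The manufacturer chooses the mode, and
# (per the liability discussion below) arguably owns the consequences.
FACTORY_ETHICS_MODE = EthicsMode.UTILITARIAN


def set_ethics_mode(requested: EthicsMode, factory_locked: bool) -> EthicsMode:
    """Option B: passenger-selectable. If the setting is not locked,
    the owner chooses, and the moral (and perhaps legal) burden shifts."""
    if factory_locked:
        return FACTORY_ETHICS_MODE  # the passenger's request is ignored
    return requested
```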
Liability questions abound
What happens if manufacturers don’t give passengers the ability to program the vehicle as they see fit, and the worst-case scenario occurs: the car plows into the pedestrians and kills them? The manufacturer would then face liability lawsuits, because it made the decision, not the passenger.
On the flip side, if the passenger can make that choice, then the liability likely rests with them.
Will the government give anybody the choice, or will it mandate how autonomous vehicles are programmed… and does the government then bear the liability?
More questions than answers right now
As we know, laws always trail technology. We’re still dealing with legal issues from technology built decades ago. In this case, it likely means government regulation and a bunch of lawsuits.
“We will work with state partners toward creating a consistent national policy on these innovations, provide options now and into the future for manufacturers seeking to deploy autonomous vehicles, and keep our safety mission paramount at every stage.” — NHTSA Administrator Mark Rosekind.
Government regulations could decide fate of Autonomous Vehicles
So government regulation could be the single biggest hurdle to the adoption of autonomous cars, outweighing the technology itself. The programming is relatively easy compared with the moral decisions that have to be made. That’s the real challenge: people may not want technology that they themselves admit will make the most moral decision… if it negatively impacts them.
“[We were] even more surprised that so many people would renounce buying a driverless car if there was a regulation in place that would force them to buy the self-sacrificing cars that they morally approved of.” — Jean-Francois Bonnefon to CNN
In other words, programming cars to make the right moral decision may be one of the biggest impediments to adoption.
Autonomous Vehicle adoption could save lives
Will autonomous vehicles make better decisions than humans in that split second between life and death? Most people think they will.
We know the cost of not implementing the technology: 32,675 people died on America’s roadways in 2014, and the numbers for 2015 will be even higher, said NHTSA’s Rosekind. “That’s how many families lost a loved one, how many friends we lost.”
“In the United States, we lose the equivalent of a fully loaded 747 crashing every single week due to roadway fatalities.” — NHTSA Administrator Mark Rosekind
Further reading: “The social dilemma of autonomous vehicles,” published in the journal Science