Driverless cars: Who should die in a crash?

Image source, Getty Images
Image caption,
A driverless car

If forced to choose, who should a self-driving car kill in an unavoidable crash?

Should the passengers in the vehicle be sacrificed to save pedestrians? Or should a pedestrian be killed to save a family of four in the vehicle?

To get closer to an answer - if that were ever possible - researchers from the MIT Media Lab have analysed more than 40 million responses to an experiment they launched in 2014.

Their Moral Machine has revealed how attitudes differ across the world.

How did the experiment work?

Weighing up whom a self-driving car should kill is a modern twist on an old ethical dilemma known as the trolley problem.

The idea was explored in an episode of the NBC series The Good Place, in which ethics professor Chidi is put in control of a runaway tram.

Image source, NBC/The Good Place
Image caption,
In The Good Place, Chidi must make a moral decision

If he takes no action, the tram will run over five engineers working on the tracks ahead.

If he diverts the tram on to a different track he will save the five engineers, but the tram will hit one other engineer who would otherwise have survived.

The Moral Machine presented several variations of this dilemma involving a self-driving car.

Image source, MIT Media Lab
Image caption,
Moral Machine: Should a self-driving car save passengers or pedestrians?

People were presented with several scenarios. Should a self-driving car sacrifice its passengers or swerve to hit:

  • a successful business person?
  • a known criminal?
  • a group of elderly people?
  • a herd of cows?
  • pedestrians who were crossing the road despite being told to wait?

Four years after launching the experiment, the researchers have published an analysis of the data in the journal Nature.

What did they find?

The results from 40 million decisions suggested that people preferred to save humans over animals, to spare as many lives as possible, and to save the young over the elderly.

There were also smaller trends towards saving females over males, saving people of higher status over those of lower status, and saving pedestrians rather than passengers.

About 490,000 people also completed a demographic survey including their age, gender and religious views. The researchers said these qualities did not have a "sizeable impact" on the decisions people made.

The researchers did find some cultural differences in the decisions people made. People in France were most likely to weigh up the number of people who would be killed, while those in Japan placed the least emphasis on this.

The researchers acknowledge that their online game was not a controlled study and that it "could not do justice to all of the complexity of autonomous vehicle dilemmas".

However, they hope the Moral Machine will spark a "global conversation" about the moral decisions self-driving vehicles will have to make.

"Never in the history of humanity have we allowed a machine to autonomously decide who should live and who should die, in a fraction of a second, without real-time supervision. We are going to cross that bridge any time now," the team said in its analysis.

"Before we allow our cars to make ethical decisions, we need to have a global conversation to express our preferences to the companies that will design moral algorithms, and to the policymakers that will regulate them."


Germany has already introduced a law that states driverless cars must avoid injury or death at all costs.

The law says algorithms must never decide what to do based on the age, gender or health of the passengers or pedestrians.