Driverless cars pose social dilemma: Study
Washington, June 27: Driverless cars may pose a social dilemma: most people want to live in a world where autonomous vehicles minimise casualties, but at the same time want their own vehicle to protect them at all costs, a new study has found.
Autonomous vehicles are programmed with a set of safety rules, and it is not hard to build a scenario in which such rules come into conflict with each other, researchers said.
“Suppose a driverless car must either hit a pedestrian or swerve in such a way that it crashes and harms its passengers. What should it be instructed to do?” they said.
A group of scientists, including a researcher from Massachusetts Institute of Technology (MIT) in the US, found that the public is conflicted over such scenarios, taking a notably inconsistent approach to the safety of autonomous vehicles, should they become a reality on the roads.
In a series of surveys, researchers found that people generally take a utilitarian approach to safety ethics – they would prefer autonomous vehicles to minimise casualties in situations of extreme danger.
That would mean, say, having a car with one rider swerve off the road and crash to avoid a crowd of 10 pedestrians. At the same time, the survey’s respondents said they would be much less likely to use a vehicle programmed that way.
Essentially, people want driverless cars that are as pedestrian-friendly as possible – except for the vehicles they would be riding in, researchers said.
“Most people want to live in a world where cars will minimise casualties. But everybody wants their own car to protect them at all costs,” said Iyad Rahwan from MIT.
The result is what researchers call a “social dilemma,” in which people could end up making conditions less safe for everyone by acting in their own self-interest.
“If everybody does that, then we would end up in a tragedy, whereby the cars will not minimise casualties,” said Rahwan.
The results consistently showed that people take a utilitarian approach to the ethics of autonomous vehicles, one emphasising the sheer number of lives that could be saved.
For instance, 76 per cent of respondents believe it is more moral for an autonomous vehicle, should such a circumstance arise, to sacrifice one passenger rather than 10 pedestrians, researchers said.
But the surveys also showed a lack of enthusiasm for buying or using a driverless car programmed to avoid pedestrians at the expense of its own passengers, they said.
One question asked respondents to rate the morality of an autonomous vehicle programmed to crash and kill its own passenger to save 10 pedestrians; the rating dropped by a third when respondents considered the possibility of riding in such a car.
Similarly, people were strongly opposed to the idea of the government regulating driverless cars to ensure they would be programmed with utilitarian principles, researchers said.
The findings were published in the journal Science.