Semi-Autonomous Cars: Shared Control, Shared Responsibility?

People are more likely to blame human drivers than intelligent cars for accidents

October 31, 2019

In semi-automated vehicles, human and machine share the task of driving. But who is to blame if both drivers make mistakes that result in an accident? A research team from the Max Planck Institute for Human Development, the Massachusetts Institute of Technology (MIT), and the University of Exeter put this question to almost 5,000 people. Respondents apportioned more blame to human drivers than to intelligent cars. The results have been published in Nature Human Behaviour.

Autonomous cars that put humans in the passenger seat promise more efficient, safer, and more comfortable road travel. Car manufacturers and researchers worldwide are working to develop the intelligent mobility systems of the future. But before humans completely hand control of their cars over to intelligent machines, hybrid forms will emerge in which humans and machines drive and make decisions together. In some models, the human is the primary driver and controls the vehicle, while the machine is the secondary driver, observing the traffic and intervening only in dangerous situations. In other models, the vehicle does most of the driving, while the human must constantly monitor the traffic and, if necessary, take control.

Manufacturers are already testing semi-automated vehicles in real traffic, and the first fatal accidents have been reported. “In our paper, we ask who will be perceived as bearing the blame in the event of an accident: the human or the machine. This has implications for the kinds of policies that should be called for to regulate these vehicles,” says Sydney Levine, researcher at the Media Lab of the Massachusetts Institute of Technology (MIT) and co-lead author of the study, together with Edmond Awad from the University of Exeter. To explore public attitudes to how blame should be allocated in semi-autonomous car crashes, an international team of researchers from the Max Planck Institute for Human Development, MIT, and the University of Exeter created a number of scenarios in which semi-automated vehicles were involved in fatal accidents. The scenarios were presented to almost 5,000 respondents in total.

In some scenarios, the secondary driver wrongly takes control of the vehicle, steers the vehicle off track, and causes a fatal accident. In these scenarios, respondents clearly blamed the secondary driver, whether it was a human or a machine. But what happens if both drivers make a mistake? What if a vehicle is heading directly at a pedestrian, but neither the primary driver nor the secondary driver—who is supposed to recognize and prevent dangerous situations—applies the brakes in time, and the pedestrian is killed? The researchers presented respondents with a number of different scenarios: In some, the human was the primary driver; in others, the secondary driver. In some, both drivers were human beings; in others, both were machines. The results showed that in scenarios where a human driver and machine shared control of the vehicle and both made a mistake, respondents were more likely to blame the human for the accident—regardless of whether they were the primary or secondary driver.

Why people tend to blame other humans more than autonomous vehicles is an open question. In general, people attribute causal responsibility for events to other people rather than to chance or the environment. It seems that the public does not yet see a vehicle controlled by artificial intelligence as an independent agent that can act and decide freely, and consequently tends to absolve machines of guilt.

“We need to understand how blame for automated car crashes is distributed—otherwise, the future development and acceptance of self-driving vehicles will be dogged by uncertainty. Our results suggest that the public and perhaps the courts would tend to absolve technology—and thus car manufacturers—of blame. In that case, top-down regulation of automated car safety might be needed to correct the public under-reaction to crashes in shared control cases,” says Iyad Rahwan, Director of the Center for Humans and Machines at the Max Planck Institute for Human Development.

Original Publication
Awad, E., Levine, S., Kleiman-Weiner, M., Dsouza, S., Tenenbaum, J. B., Shariff, A., … Rahwan, I. (2019). Drivers are blamed more than their automated cars when both make mistakes. Nature Human Behaviour. doi:10.1038/s41562-019-0762-8
