As self-driving vehicles begin testing around the world, driverless cars are on the verge of transforming everything from the way roads look to how fast vehicles travel. Transportation and urban design experts anticipate fewer parking lots, more green spaces, and a remaking of streets. Recently, news that self-driving Mercedes will be programmed to save drivers, not the people they hit, has reignited an ethical debate over the morality of programming a car’s algorithms. If a driverless car is about to hit a pedestrian or cyclist, whose life should be prioritized: the car’s occupants or the people in its path?
Car and Driver sparked the latest conversation with an article stating that Mercedes-Benz would program its self-driving cars to save the people inside the car—every time. The October 7 piece quotes Christoph von Hugo, the automaker’s manager of driver assistance systems and active safety. According to C and D, von Hugo said in an interview, "If you know you can save at least one person, at least save that one. Save the one in the car. If all you know for sure is that one death can be prevented, then that’s your first priority."
Von Hugo’s comments drew a flurry of attention, especially because they address a modern-day version of the infamous Trolley Problem. This thought experiment asks which choice is preferable for someone standing at a railway switch as a trolley runs out of control: do nothing and let five people die, or divert the trolley to a side track and sacrifice one person’s life in order to save the five?
One of the main selling points of the government’s newly released policy proposals on driverless cars is that they will make roads safer, especially because 94 percent of 2015’s 35,200 road fatalities were due to human error or choice. But how those same cars will deal with pedestrians is less clear. At some point, a driverless car will have to choose between striking pedestrians in order to keep its passengers safe or sacrificing its occupants for the sake of the greater good.
After the outcry against von Hugo’s comments to Car and Driver, Mercedes retreated from the position that its driverless vehicles would prioritize the lives of car occupants. In an email to Jalopnik—another publication that reported on the story—Mercedes said the C and D quote was incorrect, and that von Hugo’s comments do not reflect Mercedes’ stance.
Officially, Mercedes skirts the Trolley Problem, saying that neither programmers nor automated systems should weigh the value of human lives. Instead, the company focuses on avoiding all accidents and wants to provide "the highest possible level of safety for all road users." Mercedes also told Jalopnik that making a decision in favor of one person against another is illegal in Germany, and that "to clarify these issues of law and ethics in the long term will require broad international discourse."
Even as Mercedes walks back von Hugo’s comments, the debate over whose lives are prioritized in driverless car accidents remains full of moral dilemmas. The lives self-driving cars may save in the long run could be overshadowed if makers of expensive cars prioritize the lives of those who can afford their product. At the same time, who would buy a car programmed to save others if it meant putting the buyer’s own life at risk?