To make an omelet, you have to crack a few eggs. Except in this case, the result isn't a delicious breakfast: we're talking about self-driving cars crashing into people and property. That will likely keep happening for a while as we work toward a future in which autonomous vehicles are safer than human drivers and crashes no longer occur.

This is the argument of Mark Rosekind, who led the National Highway Traffic Safety Administration (NHTSA) during the Obama administration.

“Unfortunately, there will be crashes. People are going to get hurt and there will be some lives lost,” Rosekind told the BBC. “All of that I think is going to be, I hope, focused on the service of trying to save lives.” He added that the vast majority of fatal accidents today are caused by human error (NHTSA's own research has attributed roughly 94 percent of crashes to driver mistakes), and that the risk of letting imperfect self-driving cars onto the road today is worth taking for the chance of a future without car crashes at all.

It’s like Rosekind was faced with the trolley problem, pulled the lever, then said “why stop there?”

But even so, he has a point. Right now, almost 3,300 people around the world are killed in car crashes every day, according to the Association for Safe International Road Travel. That comes out to nearly 1.3 million deaths every year, more than 37,000 of them in the U.S.

So we know car crashes are already a major cause of death and injury around the world. Self-driving cars could help prevent some of those tragedies by taking fatal human errors out of the equation; it's one of the technology's biggest selling points. To get self-driving cars to that point, though, we're going to have to test them on public roads, and it's not realistic to assume that every one of those tests will go well.

None of that means, however, that anyone hurt or killed by an autonomous vehicle along the way is anything less than a tragedy, and accepting these risks as a society shouldn't become an excuse to gloss over any loss of life. Companies that develop self-driving cars must be held accountable for their mistakes and failures. Case in point: the fatal accident in March, in which an autonomous Uber car failed to brake before hitting and killing a pedestrian in Tempe, Arizona.

Many drivers also don't understand the difference between driver-assistance features, like Tesla's Autopilot, and truly autonomous driving. Better-informed drivers could avoid those dangerous misunderstandings and prevent even more accidents.

Basically, we can't just shrug our shoulders and say, "Well, these cars are going to kill people, but it might be for the greater good." Not all of these incidents are inevitable. Lawmakers and technology companies need to establish clearly defined rules and repercussions for self-driving car manufacturers when things go wrong. Even if autonomous cars ultimately bring us a safer world, these experiments will play out on roads everyone shares, and building public trust will be crucial for getting the technology to where it needs to be.

