by Leslie Nemo May 18, 2018 Advanced Transport

Last week, a Tesla Model S crashed into the back of a firetruck in Utah. Police records show the driver had turned on the car’s cruise control and autosteer about 80 seconds before the incident. On May 16, Tesla officials confirmed that yes, the driver had done both those things—and taken her hands completely off the wheel.

The incident ended in a broken ankle. Other Tesla Autopilot accidents, however, have been fatal. That raises the question: who exactly is at fault? Tesla seems to think it's all a matter of semantics.

The company's statement on the Utah incident claims drivers are constantly reminded that "autopilot" doesn't mean the same thing as "autonomous." The mode is there to assist drivers, not to take over for them completely.

“The driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road,” Tesla’s report explains.

If overconfidence in Autopilot is partly to blame here, should autonomous car companies be doing more to educate drivers on the technology and its terminology?

Ever since Google began dabbling in self-driving cars in 2009 with the project that later became Waymo, the language surrounding "self-driving" has been pretty murky. Waymo's own history describes both its 2012 and 2015 milestones as "self-driving," even though the first required someone in the driver's seat and the second didn't.

Tesla was the first to brand its driver-assistance system "Autopilot," and other automakers followed with their own names, like Nissan's ProPILOT Assist and GM's suite of safety features. On top of the different names, none of these assistance systems does quite the same thing, as a test driver for The Verge found out.

Tesla's own webpage about Autopilot lists everything the system can do, like matching nearby cars' speeds, changing lanes, and transitioning between freeways, which sounds, dare we say, pretty autonomous. If the capabilities of an autopilot system fall well short of a fully autonomous car's, but the risks of driver inattention run much higher, shouldn't product warnings make that exceedingly clear?

But perhaps Tesla’s warnings are already clear enough: Keep your hands on the wheel just in case. Maybe it all really does come down to user comprehension. In which case, do your part and read the manual. Maybe even look at the glossary. It could save your life.