Tesla Crash Shows Drivers Are Confused By “Autonomous” vs. “Autopilot”

The terms car companies use to distinguish between different driver technologies are confusing, at best.

5.18.18 by Leslie Nemo
Image by Sandman Design / Emily Cho

Last week, a Tesla Model S crashed into the back of a firetruck in Utah. Police records show the driver had turned on the car’s cruise control and autosteer about 80 seconds before the incident. On May 16, Tesla officials confirmed that yes, the driver had done both those things—and taken her hands completely off the wheel.

The incident left the driver with a broken ankle. Other Tesla Autopilot accidents, however, have been fatal. The question that follows: who, exactly, is at fault? Tesla seems to think it's all a matter of semantics.

The company’s report on the Utah incident claims drivers are constantly reminded that “autopilot” doesn’t mean the same thing as “autonomous.” The mode is there to assist drivers, but not completely take over for them.

“The driver absolutely must remain vigilant with their eyes on the road, hands on the wheel and they must be prepared to take any and all action necessary to avoid hazards on the road,” Tesla’s report explains.


If overconfidence in Autopilot is really partly to blame here, should companies building autonomous vehicle technology be doing more to educate drivers on both the technology and the terminology?

Since Google began its self-driving car project (now Waymo) in 2009, the language surrounding “self-driving” has been pretty murky. Waymo’s official history calls both its 2012 and 2015 milestones “self-driving,” even though the first required someone in the driver’s seat and the second didn’t.

Tesla was the first to ship “Autopilot” in its cars, prompting other brands to follow with systems of their own, like Nissan’s ProPILOT Assist and GM’s suite of driver-assistance features. And on top of the different names, none of these systems behaves quite the same way, as a test driver for The Verge found out.

Tesla’s own webpage about Autopilot lists everything the system can do, like matching nearby cars’ speeds, changing lanes, and transitioning between freeways, which sounds, dare we say, pretty autonomous. If the capabilities of an autopilot system are far lower than those of a fully autonomous car, but the risks far higher, shouldn’t product warnings make that exceedingly clear?


But perhaps Tesla’s warnings are already clear enough: Keep your hands on the wheel just in case. Maybe it all really does come down to user comprehension. In which case, do your part and read the manual. Maybe even look at the glossary. It could save your life.



