In Brief
The National Transportation Safety Board has determined that human overreliance on automation, rather than any kind of technical failure, was the probable cause of the first fatality involving Tesla's Autopilot system. Still, the company has come under fire for not safeguarding against this kind of situation.

Prolonged Disengagement

The National Transportation Safety Board (NTSB) has ruled that an “over reliance on vehicle automation” was a major factor in the first fatal crash of a car using Tesla’s Autopilot technology.


The crash occurred in July 2016, and after more than a year of investigation, the board determined that the Autopilot system allowed for “prolonged disengagement from the driving task.” The NTSB’s press release on the subject noted that Autopilot’s design allows drivers to use it in a way that is inconsistent with official guidance and warnings issued by the company.

“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks in a limited range of environments,” said chairman Robert L. Sumwalt, according to a report from the Associated Press. “Tesla allowed the driver to use the system outside of the environment for which it was designed.”

Tesla responded to the NTSB’s report with a statement: “We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology.”

Human Error

Ultimately, this crash was caused by human error. The driver of the truck failed to yield the right of way to the Tesla driver, and the Tesla driver was unable to react quickly enough because he was inappropriately relying on Autopilot.

The Model S is considered a Level 2 vehicle on the standard driving-automation scale, which carries the expectation that the driver will be ready to take the wheel in an emergency. However, during the 37 minutes and 30 seconds that the car’s cruise control and lane-keeping systems were active ahead of the crash, the driver had his hands on the wheel for only 25 seconds.

Additionally, the system is designed for use on interstate highways, but the driver was on a divided highway with cross traffic at the time of the crash, and nothing in Autopilot’s design prevents that type of misuse.

“System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking, and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.

Any driver who doesn’t use Autopilot as intended could pose a risk to themselves and others, but even the NTSB acknowledges the potential benefits of self-driving cars, as long as the technology is incorporated responsibly.

“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” Sumwalt said.