Far more crashes involving Teslas in Autopilot mode have been reported than previously known, according to a new analysis of National Highway Traffic Safety Administration data by The Washington Post.

According to the analysis, there have been at least 736 crashes in the US that involved the EV maker's controversial driver assistance feature since 2019.

Of those 736 crashes, at least 17 were fatal, a considerable increase over the three deaths reported as of June 2022. The NHTSA recorded 807 automation-related crashes over that period, meaning Tesla accounted for more than 90 percent of them.

In other words, while there are some important caveats — chiefly that we don't know how many of these accidents were caused by the software itself — it's clear that Autopilot is running into more trouble on the road than previously known.

"A significantly higher number certainly is a cause for concern," Philip Koopman, a Carnegie Mellon University professor, told the WaPo. "We need to understand if it’s due to actually worse crashes or if there’s some other factor such as a dramatically larger number of miles being driven with Autopilot on."

Tesla CEO Elon Musk has long hyped up his car company's self-driving efforts, especially when it comes to a public beta of its so-called Full Self-Driving (FSD) feature, a more advanced $15,000 add-on that has also been implicated in a number of accidents.

During the company's AI event last year, Musk argued that there's a "moral obligation to deploy [FSD] even though you’re going to get sued and blamed by a lot of people."

His reasoning: the software saves more lives than it costs through the crashes it may cause.

"Because the people whose lives you saved don’t know that their lives were saved," he said at the time. "And the people who do occasionally die or get injured, they definitely know — or their state does."

The data, though, is muddy and complex. Are all Autopilot accidents recorded? How does the safety compare between highway and non-highway driving? And can the software lull drivers into a false sense of security, leaving them unprepared if they need to suddenly take over?

Regulators have long argued that the names of both Autopilot and Full Self-Driving are misleading, since each still requires drivers to pay attention and be able to intervene at any time.

While Tesla maintains that its vehicles are five times less likely to crash with FSD engaged, it's impossible to get a full picture of how the company arrived at that statistic.

And regulators clearly aren't convinced, either. The NHTSA has been investigating a small number of these 736 crashes involving Autopilot.

"We're investing a lot of resources," the regulator's acting head Ann Carlson told Reuters back in January. "The resources require a lot of technical expertise, actually some legal novelty and so we're moving as quickly as we can, but we also want to be careful and make sure we have all the information we need."

In February, Tesla recalled more than 360,000 vehicles and put out a warning that a beta version of its FSD feature "may cause crashes."

Sure, the deaths that involved a Tesla on Autopilot still only account for a tiny fraction of the estimated 42,795 motor vehicle-related deaths in the US in 2022.

But given the data we've seen so far, not to mention the footage circulating online that has repeatedly demonstrated glaring flaws still plaguing Tesla's FSD beta, there's certainly enough evidence to reevaluate Tesla's lofty claims about its driver assistance software.

More on Tesla: Tesla Driver Says Self-Driving Mode Crashed Her Car Into a Tree
