It's finally official.
Making It Official
The US National Highway Traffic Safety Administration (NHTSA) has opened a formal investigation into Tesla's Autopilot driver-assistance software, according to a posting on its website — an inflection point that could signal the introduction of self-driving regulations.
Explosively, the investigation appears to center on a series of accidents in which Teslas have smashed into emergency response vehicles that were pulled over with their sirens on or flares deployed. But whether the investigation will lead to any new laws is anything but guaranteed.
Crashing on Autopilot
The regulator will "assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation," according to the filing.
The news comes after several high-profile crashes, some of which ended in deaths, involving Teslas that were reportedly on Autopilot during the moments leading up to the crash.
Since 2018 alone, the NHTSA has recorded 11 crashes involving Teslas on Autopilot hitting service vehicles that had their sirens turned on or were using flares to warn oncoming drivers, according to the announcement.
The regulator is examining almost all cars that Tesla has sold in the US since 2014, totaling some 765,000 vehicles, according to the Associated Press.
While the National Transportation Safety Board (NTSB) — which is investigating some of the crashes as well — has no power to enforce its rules, it has recommended that the NHTSA require Tesla to improve its driver monitoring system in particular.
Tesla turned a blind eye to the recommendations last year, as The Verge reported at the time.
The company's Autopilot system can easily be fooled into thinking somebody is in the driver's seat and paying attention to the road, as several recent investigations have shown. As a result, drivers have exploited the system on countless occasions for daredevil videos posted online.
Whether the NHTSA's investigation will lead to action is anybody's guess. The Autopilot feature has been under scrutiny for years, but regulators have arguably yet to introduce any rules that meaningfully constrain Tesla's behavior.
READ MORE: US government opens probe into Tesla Autopilot crashes with emergency vehicles [The Verge]
More on Autopilot: Tesla Autopilot Mistakes Moon for Yellow Traffic Light