Regulators want to know why Autopilot seems to have it out for emergency vehicles.
The federal government's investigation into crashes seemingly caused by Tesla's semi-autonomous "Autopilot" driving assistance software is ramping up.
On Tuesday morning, the National Highway Traffic Safety Administration (NHTSA) sent Tesla an 11-page letter demanding data on exactly how the Autopilot system detects and perceives emergency vehicles and other hallmarks of a crash scene, including flashing lights, flares, and reflective vests, the Associated Press reports. The investigation, which covers 765,000 Tesla Model S, 3, X, and Y vehicles built between 2014 and 2021, could mark a turning point in the automaker's occasionally combative relationship with the government: either Tesla cooperates, or it risks paying huge fines and being found liable for multiple crashes.
The NHTSA formally launched its investigation into Autopilot's tendency to hit emergency vehicles pulled over on the side of the road in mid-August, based on 11 crashes that have happened since 2018. However, the AP reports that the agency had to add a 12th to the list after a Tesla with Autopilot engaged crashed into a Florida Highway Patrol car on a highway near Orlando on Saturday, killing one person and injuring another 17.
This comes after years of regulators grumbling about Autopilot and Tesla's related "Full Self-Driving" software, as well as several other instances of the NHTSA either considering or launching probes into Tesla's various glitches.
In this case, Tesla has until October 22 to respond to the NHTSA's letter and send over all of the data that the agency requested.
If it fails or refuses to do so — CEO Elon Musk has a tendency to ignore government agencies, after all — the company could face fines of over $114 million.
READ MORE: US asks Tesla how Autopilot responds to emergency vehicles [AP]
More on the NHTSA investigation: US Gov Investigating Tesla Autopilot for Crashing Into Emergency Vehicles