It's not every day that Tesla decides to contact regulators about a serious issue with its self-driving software.

Late last month, Tesla reported an issue with its Full Self-Driving software to the National Highway Traffic Safety Administration (NHTSA), officially recalling nearly 12,000 affected vehicles.

The timing of the recall couldn't be worse for the company, as The Washington Post points out in a scathing new report, and it could even mark a turning point in the strained relationship between Tesla and government regulators.

Back in August, the NHTSA announced it had opened a formal investigation into Tesla’s Autopilot driver-assistance software. Then last month, the agency sent the company a letter expressing concern over self-driving features being tested on public streets and warning Tesla that safety-related software changes needed to be accompanied by appropriate recalls, a request Tesla appears to have taken to heart.

Tesla owners had already begun noticing changes to their cars' self-driving features. One of them, Kevin Smith, found that his emergency braking and forward collision warning functions had been mysteriously switched off after the latest over-the-air update.

"Dear Elon Musk, are you in there crossing the streams? I didn’t change this brah," Smith wrote in a October 24 tweet. "This isn’t ok without any communication. Communication is not hard. I’m doing it now."

"He was screaming ‘Do not use it! Do not use it!'" Smith told WaPo, recalling another beta tester telling him not to use the software after the update. "'We are trying to wake up the folks at Tesla, trying to get the word to Tesla.'"

Other users reported false collision warnings, and some even said their vehicles swerved toward pedestrians.

Another tester described "a pretty dramatic slamming of the brakes" in an interview with the Post. "For that to trigger undesirably at high speeds is an incredibly dangerous event."

At first, Tesla acted like the situation was business as usual.

"Regression in some left turns at traffic lights found by internal QA in 10.3," Musk tweeted late last month, promising the update would be pushed to users a day later.

To Musk, potentially life-threatening situations caused by software that isn't quite ready yet are "to be expected with beta software," according to an October tweet. "It is impossible to test all hardware configs in all conditions with internal QA, hence public beta."

The CEO has also repeatedly claimed over the years that Tesla's software is safer than a human making the decisions behind the wheel, though he's also the first to admit that self-driving "is a hard problem," as he wrote in a July tweet.

For now, despite the warnings and increased scrutiny by regulators, Tesla is forging ahead with its Full Self-Driving beta, though the recall could signal changing tides inside the company.

The company noted in an October 29 update that it was aware of the issues, claiming it had "investigated the reports and took actions to mitigate any potential safety risk" in "a matter of hours."

Overall, the automaker seems to be increasingly aware that all eyes are on it. It's still not clear, though, what it might take for regulators to intervene and take half-baked self-driving software off public roads.

The recent incidents illustrate that one buggy over-the-air update could have disastrous consequences.

Musk has long been of the opinion that regulation stifles innovation. But is cutting-edge, self-driving innovation really worth risking lives over?

READ MORE: Tesla’s recent Full Self-Driving update made cars go haywire. It may be the excuse regulators needed. [The Washington Post]

