Following almost a decade of broken promises and misleading marketing, Tesla's Full Self-Driving is still a long way from being able to actually get you to work in your Tesla — at least without forcing you to intervene to not get killed.

Case in point: CEO Elon Musk took to X-formerly-Twitter to livestream a joyride using an unreleased version of FSD. But as it turns out, the "mind-blowing" software caused the car to lurch into an intersection and even attempt to run a red light, as Vice reports.

"Ohhh, intervention!" Musk exclaimed during the stream. "Sorry," he added, laughing. "That’s why we’ve not released this to the public yet."

The version he was showing off, dubbed v12, relies on neural nets to identify things like traffic lights, pedestrians and other vehicles. Earlier this year, Musk promised that "v12 is reserved for when FSD is end-to-end AI, from images in to steering, brakes and acceleration out."

In other words, Musk doesn't believe in radar, a technology that has become standard for Tesla's competitors, and thinks AI will be able to make do using only the vehicle's cameras.

But, as many have pointed out, that also makes the tech far more unpredictable and difficult to diagnose when things go wrong.

Musk has also promised that v12 will be the first iteration of the software that technically won't be called a "beta," suggesting we should expect even loftier promises about its purported capabilities.

Following Musk's latest close call, the mercurial CEO devised a simple solution: train these deep learning models with even more data.

"So with that intervention we just had, the solution is essentially to feed the network a bunch more video of traffic lights," Musk promised during the stream. "That was a controlled left turn where there was a green light for the left turn but not a green light to go straight."

In other words, Musk is kicking the can down the road by making even more promises about future iterations of the software, which was first released in beta form to a selection of drivers in October 2020.

"And, so," he said, "we’ll feed it a bunch of video of controlled left turns and then it’ll work."

But should we really be taking him at his word at this point? Since at least 2014, Musk has been making big promises about Teslas becoming fully autonomous.

Yet many glaring issues remain, from vehicles plowing into motorcycles or median barriers on the highway to losing traction completely in snowy conditions and colliding with stationary emergency vehicles — the latter of which triggered a formal investigation by the US National Highway Traffic Safety Administration.

It's not just the NHTSA. Even the Justice Department is now investigating the carmaker over misleading marketing of its driver assistance system, called Autopilot.

And that's not to mention the substantial number of lawsuits, including a class-action complaint, Tesla is facing over collisions and fatalities that reportedly involved the software.

Almost a decade on, FSD is still a liability on public roads, as Musk has demonstrated with his own stream.

Sure, as his devoted supporters on his social media platform were quick to point out, the rest of Musk's joyride went by without any other glaring issues.

But running a red light or entering an intersection at the wrong time can make the difference between life and death, so that's cold comfort to drivers.

More on FSD: Drivers Are Using Weights to Fool Their Teslas Into Thinking They're Paying Attention
