"Autopilot just tried to kill me, so please fix it."

Death Race

Fred Lambert, a noted Tesla enthusiast and editor-in-chief of the electric vehicle blog Electrek, claims Full Self-Driving "tried to kill" him while he was behind the wheel of his Model 3. Had he not intervened, Lambert says, he would have crashed at highway speed.

At first, nothing was out of the ordinary during his drive to Montreal. Lambert engaged FSD and held a steady speed of 73 miles per hour. It was when the Tesla attempted to pass another car that all went to hell.

"I felt FSD Beta veering aggressively to the left toward the median strip," Lambert recalled.

Because he kept his hands on the steering wheel, a safety protocol that many Tesla drivers openly flout, he was able to "steer back toward the road."

"It was super scary as I almost lost control when correcting FSD Beta," Lambert said. "I was passing a vehicle. I could have crashed into it if I overcorrected."

Then, ill-advisedly, Lambert re-engaged FSD, and the software repeated the error only moments later.

When he steered back into the left lane for a second time with FSD on, he wrote, the software once again "veered to the left toward the median strip" and then tried to blow through, at full speed, an unmarked U-turn intended only for emergency vehicles.

Luckily, Lambert was prepared and took control in time — but it sounds like the consequences could have been catastrophic.

Fatal Error

At any rate, Lambert says the malfunction echoes how Autopilot used to swerve into exit ramps without warning, an error he thought had long been fixed.

"Tesla Autopilot used to try to take exit ramps it wasn't supposed to in the early days, but it was something that Tesla fixed a while ago," he said. "I haven't had that happen to me in years."

Later, he submitted a bug report to Tesla: "Autopilot just tried to kill me, so please fix it."

He attributes the problem to FSD's latest updates, calling it a "new aggressive bug." That may be true, as Tesla has had to roll back crash-causing updates in the past, but maybe it's time to acknowledge a harder truth: this tech, in its current form, is far from safe enough to roam public streets.

In other words, the world shouldn't be a playground for Tesla's beta, a sentiment that authorities are belatedly warming up to.

The US Justice Department began investigating Tesla last year for the dangerously misleading marketing of Autopilot, and California has similarly cracked down on Tesla's use of the term Full Self-Driving. Piling on the pressure, the National Highway Traffic Safety Administration has also recently stepped in to investigate an unsafe "Elon Mode" for Autopilot.

For now, Lambert has this advice: only use FSD "as intended." Although you might want to consider not using it at all.

More on Tesla: Elon Musk Shows Off FSD, Forced to Intervene When It Tries to Drive Through Red Light
