A Tesla employee — and "devoted" fan of its CEO Elon Musk — named Hans von Ohain was killed after his Model 3 crashed into a tree and erupted in flames back in 2022.

His friend and fellow passenger Erik Rossiter, who survived the collision, has since told The Washington Post that von Ohain had the car's Full Self-Driving feature turned on at the time of the fatal accident.

If confirmed, the crash could be the first known death involving the feature, an optional $15,000 add-on that has already drawn plenty of attention from regulators.

WaPo also confirmed the vehicle was equipped with the feature, which von Ohain had received for free through an employee discount, and his widow said he frequently made use of it.

Despite the company's misleading marketing, Tesla vehicles are still far from being able to drive themselves, and Tesla cautions on its website that drivers must be "ready to take immediate action including braking."

Full Self-Driving (FSD) expands on the company's Autopilot driver assistance software, and is designed to make decisions on behalf of the user while driving on both highways and busy city streets.

The National Highway Traffic Safety Administration is already investigating Autopilot following a series of accidents in which Teslas smashed into emergency response vehicles that were pulled over with sirens or flares. But no fatal crash has been definitively linked to FSD, though von Ohain's death, as WaPo reports, sure looks suspicious.

According to a Washington Post analysis last year, the number of fatal crashes involving Autopilot mode has surged. Of the more than 700 crashes involving the feature since 2019, at least 17 were fatal.

An autopsy of von Ohain's body found that he died with a blood alcohol level of 0.26, which is over three times the legal limit.

Nonetheless, experts have pointed out that even setting the alcohol aside, Tesla's misleading marketing may be giving drivers a false sense of security behind the wheel.

In spite of its outsize claims about the tech, Tesla holds that the FSD feature is still in "beta," meaning it's actively being developed.

Its decision-making on the road can be highly suspect. In a video uploaded just last week, a Tesla owner had to override the feature after it attempted an abrupt left turn, steering straight into oncoming traffic.

We've also come across videos of Teslas with the feature turned on ignoring red lights, smashing into a police car, and struggling in snowy conditions.

Von Ohain's death raises important ethical questions, especially when it comes to culpability. Is Tesla's misleading marketing to blame, or was it a case of a driver's reckless actions?

Earlier this week, the lawyer of a Tesla driver who initially denied having killed a woman with his Tesla in a hit-and-run said that his client couldn't remember whether he had, and that if he had, he must've been "using Tesla's full self-driving capability."

"Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human," von Ohain's widow Nora Bass told WaPo. "We were sold a false sense of security."

Bass told the newspaper that personally, she found the feature too unpredictable and "jerky."

A lot is riding on the feature, with Musk arguing in 2022 that FSD is "the difference between Tesla being worth a lot of money and being worth basically zero."

Even with Tesla's full weight behind its development, the software is still a long way from realizing the company's stated goal of full autonomy.

Musk has repeatedly promised that Tesla will achieve Level 5 autonomy in less than a year, a level at which a car is fully autonomous and doesn't need a steering wheel or brake pedal.

However, the feature still hasn't surpassed Level 2 autonomy, which requires the driver to be ready to take over at any time.

According to WaPo, Tesla has yet to publicly acknowledge von Ohain's death.

More on FSD: Tesla Driver Says He's Not Sure If He Killed a Pedestrian Because He Was on Autopilot
