Another one bites the dust.
A Michigan woman was taken to the hospital after, she says, her Tesla crashed itself into a tree while in self-driving mode.
As local outlet MLive reports, the unidentified 41-year-old woman was driving on a state highway when she says she put her car into self-driving mode and, according to a police report, all hell broke loose.
The woman told deputies that just after she put the car into the assisted driving mode — the reporting didn't specify whether she was talking about Tesla's Autopilot or the more advanced Full Self-Driving — it veered right, struck a tree, and rolled over multiple times.
The driver's injuries were reportedly minor, but the accident once again highlights one of the greatest tensions on today's roads: advanced driver-assist features that, while undeniably impressive, are nowhere near perfect.
While there's not a lot more reporting about this specific incident — including basic details like what model Tesla the woman was driving — we know from precedent that Autopilot has been linked to many crashes, some of which have resulted in death and federal investigation.
The general narrative has been that even if Full Self-Driving is misleadingly named — in reality, it requires the driver's attention on the road at all times — it's safer than Autopilot. That could be starting to change, though, with a massive leak of internal Tesla documents this month revealing a cache of previously unknown safety complaints about the feature.
What does it all add up to? A big mess. Tesla CEO Elon Musk has been a huge booster of the company's self-driving efforts, and if the systems really could become safer than a human driver, there'd obviously be a compelling argument for them. As of now, though, it feels like we're stuck in an ambiguous in-between in which it's not at all clear how safe the features really are — and everyone on the road is taking on that risk.