Elon Musk: Autopilot Wasn’t Engaged During Fatal Driverless Crash
The plot thickens.
Over the weekend, a Tesla Model S crashed into a tree, killing its two occupants. Mysteriously, investigators concluded with "100 percent" certainty that neither of them was in the driver's seat.
“Several of our folks are reconstructionists, but they feel very confident just with the positioning of the bodies after the impact that there was no one driving that vehicle,” Harris County constable Mark Herman told local Houston news station KHOU.
The implication, clearly, was that the vehicle's self-driving capabilities had been engaged at the time of the grisly wreck. Considering the number of reckless stunts and videos uploaded to social media showing motorists leaving the driver's seat while a Tesla raced down a highway, it was perhaps the likeliest scenario.
But now, Tesla CEO Elon Musk has chimed in on Twitter to claim that "data logs recovered so far show Autopilot was not enabled and this car did not purchase FSD," referring to the optional "Full Self-Driving" package, which, despite its name, doesn't actually allow the vehicle to drive itself fully, but does add capabilities beyond standard Autopilot.
Musk also said that “standard Autopilot would require lane lines to turn on, which this street did not have.”
Musk’s explanation seemed carefully worded, though, prompting sharp and unanswered questions from close Tesla observers.
“But a Tesla owner just demonstrated today that Autopilot works without lane markings and goes over the speed limit in 25 mph roads,” New York Times auto reporter Neal Boudette replied to Musk’s comment. “What data exactly have you recovered from the car? You say AP was not enabled, do you mean at the time of the crash? What about the secs/mins before?”
Indeed, a video uploaded to YouTube by the account Dirty Tesla in 2019 shows Autopilot following the road without lane markings while traveling at 50 mph.
As Electrek points out, Tesla has a track record of skewing the data it presents following a crash.
“For example, the automaker has told the media that the driver’s hands weren’t detected on the steering wheel prior to a crash, but Tesla can’t detect hands on their steering wheel and can only detect torque being applied to the wheel,” Electrek editor-in-chief Fred Lambert wrote.
In other words, we should take Tesla's handling of this PR crisis — which its CEO is managing personally, since the company dissolved its PR department — with a healthy grain of salt.
Meanwhile, federal investigators are probing the same questions. The National Transportation Safety Board (NTSB) has since announced that it is investigating the crash, tweeting that its investigation “will focus on the vehicle’s operation and the post-crash fire. NTSB investigators will arrive in the area later this afternoon.”
There's no doubt that Teslas are, on the whole, fairly safe vehicles compared to the rest of the market — the company prides itself on making some of the safest cars on the road.

But we have seen a continuous stream of incidents, many involving Autopilot, that have resulted in needless deaths. Now, the company is using the public as beta testers for the latest version of its Full Self-Driving feature.
Maybe it’s time for the company, and Elon Musk, to rework its approach to self-driving technologies — or at least its communications strategy.
More on Autopilot: Two Die in Fiery Tesla Wreck, Seemingly in Self-Driving Mode