It’s Shockingly Easy to Drive a Tesla Without Anybody in the Driver’s Seat
This is some seriously damning evidence.
The news comes after the latest high profile Tesla crash, which left two dead this past weekend in Texas.
That crash involved a 2019 Tesla Model S — not the Model Y that was used during Consumer Reports’ testing — which crashed into a tree at high speed after reportedly failing to make an expected turn.
Investigators found clear evidence that nobody was sitting in the driver’s seat in the moments leading up to the crash. That left a huge question: was either Autopilot or Full Self-Driving, the carmaker’s self-driving options, turned on at any point?
The National Highway Traffic Safety Administration and the National Transportation Safety Board are now investigating the incident.
While testing the Autopilot feature on a closed track, engineers at Consumer Reports found that the feature allowed them to drive down a track even with nobody in the driver’s seat, just as appears to have been the case in Saturday’s crash.
“In our test, the system not only failed to make sure the driver was paying attention — it couldn’t even tell if there was a driver there at all,” Consumer Reports’ senior director of auto testing Jake Fisher said in the report.
“Over several trips across our half-mile closed test track, our Model Y automatically steered along painted lane lines, but the system did not send out a warning or indicate in any way that the driver’s seat was empty,” the engineers wrote.
All it took was buckling the driver’s seat belt and placing a small weighted chain on the steering wheel, which fooled the Autopilot system into thinking the driver’s hands were on the wheel.
“The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat,” Fisher said. “It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”
“Autopilot makes mistakes, and when it encounters a situation that it cannot negotiate, it can immediately shut itself off,” Fisher said. “If the driver isn’t ready to react quickly, it can end in a crash.”
Tesla has repeatedly faced criticism for its laissez-faire approach to advertising its self-driving tech. For instance, despite its name, the additional “Full Self-Driving” feature offered by the company (which reportedly was not purchased for the vehicle involved in Saturday’s crash) doesn’t actually do what the name suggests.
“Truly self-driving cars don’t yet exist for consumers to buy,” as Consumer Reports’ analysis found.
Other competing systems, such as General Motors’ Super Cruise, use infrared cameras to ensure the driver is looking at the road. Autopilot, however, relies only on steering wheel inputs.
Fisher suggests Tesla could use existing weight sensors used for seat belt warnings and airbags to make sure somebody is actually driving the vehicle.
Worse yet, Tesla is actively using its customer base to test out the latest self-driving technologies, a practice Ford CEO Jim Farley recently seized on to rake the Musk-led company over the coals.
“They have changed the EV market and made the idea of owning an EV far more attractive than ever before,” Fisher said.
“But they seem to be using their customers as development engineers as they work on self-driving technologies, and they need to do a better job of keeping them safe,” he added.