So much for Tesla having the most advanced self-driving technology in the world.
A panel of experts assembled by The Washington Post has reviewed and verified footage of Teslas equipped with the company's beta Full Self-Driving (FSD) software malfunctioning in terrifying ways. Unsurprisingly, they found that the driver assistance tech is likely doing more harm than good.
In early February 2022, Tesla was forced to issue software updates to almost 54,000 of its cars running the company's signature FSD program because of the cars' propensity to roll through stop signs without coming to a complete stop. A growing number of videos posted by Tesla drivers of their self-driving cars going haywire, though, seems to show that the issues with FSD go far beyond rolling stops.
After reviewing six such videos, WaPo's panel of experts suggested that FSD may be too dangerous to use on public roads. In one of the more alarming analyses, the six experts — whose backgrounds include self-driving research, autonomous vehicle safety analysis and self-driving development — pointed to a video in which a Tesla fails to slow down enough as a pedestrian crosses light rail tracks. The clip, they said, shows that FSD has difficulty recognizing pedestrian walk signs and doesn't seem to understand that pedestrians may step off sidewalks and into the roadway.
Another of the more upsetting videos reviewed by WaPo's panel shows FSD prompting its driver to take over control of the vehicle after becoming confused by a truck partially blocking the street. When the driver tries to do so, however, they struggle to actually wrest control back from the driver assistance program and have to sharply turn the wheel repeatedly to succeed.
"I am taking over," the driver says in the vehicle while jerking the wheel. "I'm — I'm trying."
"It’s unclear who exactly is in control at that moment," Andrew Maynard, an Arizona State University professor who works in the school's Risk Innovation Lab, told WaPo. "There’s an odd glitch here where there seems to be a short fight for control between the driver and the car. It appears there are scenarios where both driver and car potentially lose control at some points."
The glitch revealed in that video, Maynard added, is "important" because it demonstrates potential difficulties in "the ability of the human driver to ensure safety" in case of such malfunction.
Here's a typical drive on Tesla FSD Beta, and why it's the opposite of useful. It's most certainly not even close to being "safer than a human", by any factor.
22-minute drive, 4.5 miles, and WAY too many interventions. pic.twitter.com/oKmkRZ3Tgh
— Taylor Ogan (@TaylorOgan) February 1, 2022
Along with the experts' analysis of the dangers posed by Tesla's nascent FSD, WaPo also spoke to the drivers behind the videos to confirm their veracity — and one, identified only as Chris from Fenton, Michigan, said that after using his Tesla's driver assistance program for about a year, he thinks it'll be another decade before the program is truly ready to be taken on the road.
Expert criticism of Tesla is almost a cottage industry at this point, but for a Tesla owner himself to admit that the manufacturer's self-driving mode isn't ready for public consumption is striking.
READ MORE: ‘Full Self-Driving’ clips show owners of Teslas fighting for control, and experts see deep flaws [The Washington Post]