YouTuber and former NASA engineer Mark Rober has kicked the hornet's nest with his latest video.

In the piece — titled "Can You Fool a Self Driving Car?" — Rober found that a Tesla car on Autopilot was fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it, with the electric vehicle plowing right through it instead of stopping.

The footage was damning enough, with slow-motion clips showing the car crashing not only through the styrofoam wall but also into a child-sized mannequin. The Tesla was also fooled by simulated rain and fog.

A separate test vehicle, a Luminar tech-equipped Lexus SUV, aced the tests.

The stunt was meant to demonstrate the shortcomings of relying entirely on cameras — rather than the LIDAR and radar systems used by other automakers and autonomous vehicle companies — as Tesla does.

"I can definitively say for the first time in the history of the world, Tesla's optical camera system would absolutely smash through a fake wall without even a slight tap on the brakes," Rober said in the video.

But Tesla's fanboys have since cried foul, arguing that the EV maker could even sue Rober for "false advertising/misleading an audience," according to YouTuber Kevin "Meet Kevin" Paffrath.

In a response video posted to Tesla CEO Elon Musk's X (formerly Twitter), Paffrath argued that Rober had disengaged Autopilot right before crashing into the fake wall.

Paffrath went as far as to allege that Rober was being paid by Luminar, the LIDAR tech company that outfitted the SUV that went head-to-head with the Tesla.

Other users on X argued that Rober should've used Tesla's infamous Full Self-Driving (FSD) feature, which costs a whopping $8,000 on top of the cost of the vehicle.

In a separate post seemingly responding to the allegations, Rober shared the "raw footage of my Tesla going through the wall."

"Not sure why it disengages 17 frames before hitting the wall but my feet weren’t touching the brake or gas," he added.

In other words, is this really a smoking gun showing Rober rigged the test — or did Autopilot disengage by itself, sending the Tesla plowing right through the wall while nominally under "human" control?

As Electrek points out, Autopilot has a well-documented tendency to disengage right before a crash. Regulators have previously found that the advanced driver assistance software shuts off a fraction of a second before impact.

It's a highly questionable approach that has raised concerns that Tesla is trying to evade liability by automatically turning off potentially incriminating driver assistance features just before a crash.

Put simply, instead of taking down what he saw as Rober's anti-Musk hit piece, Paffrath inadvertently highlighted Tesla's shady practices.

Tesla has long been reluctant to hand over crash data, especially when it comes to its infamous "Full Self-Driving" feature, duking it out with the California Department of Motor Vehicles over the matter in 2023.

Tesla has also gone after crash data collected by the National Highway Traffic Safety Administration (NHTSA). The EV maker reportedly asked the regulator to redact information about whether its Autopilot or FSD software was used during every single documented crash since 2021.

In short, Rober's latest video still points out a glaring shortcoming when it comes to Tesla's safety features — no matter how unhappy it makes Tesla fanboys.

More on the video: Man Tests If Tesla Autopilot Will Crash Into Wall Painted to Look Like Road

