Earlier this month, YouTuber and former NASA engineer Mark Rober released a video titled "Can You Fool a Self Driving Car?" which showed a Tesla on Autopilot getting fooled by a Wile E. Coyote-style wall painted to look like the road ahead of it.
High frame-rate footage shows the vehicle plowing right through the styrofoam obstruction, an eyebrow-raising demonstration of the risks of the company relying exclusively on video feeds for its driver assistance software. That's unlike some of its competitors in the autonomous vehicle space, which use lidar and radar as well — technology that can easily distinguish the actual road from a wall painted to look like it.
The controversial video kicked off a heated debate surrounding Tesla's driver assistance tech, with fanboys crying foul, arguing Rober should've used the company's more sophisticated — and very expensive — Full Self-Driving (FSD) software.
But a new attempt to replicate the experiment suggests there are still plenty of reasons to be concerned about Tesla relying exclusively on cameras — and perhaps a glimmer of hope for the EV maker as well.
Over the weekend, YouTuber Kyle Paul shared his own response video, showing that a Model Y with a previous-generation HW3 computer will still plow through a wall painted to look like the road ahead — even with the FSD feature turned on.
"See the wall, does not see the wall," Paul noted after slamming the brakes and slowly inching toward the wall, which was similarly painted to look like the horizon. "Starting to see its own shadow on the wall, and if I get really close, about touching it, it sees the wall."
"With no doubt, the Model Y would have gone through the wall," he concluded. "I had to brake full force because I saw that the camera, the visualization, was not seeing the wall."
However, that's not the full story, as a Cybertruck with the latest-generation HW4 computer and camera system handily detected the same wall and came to a full stop.
"Sees the wall," Paul said as the truck, which was running last month's FSD version 13.2.8, came to a halt. "Stops."
"At no point did I feel like it was going to hit the wall," he concluded.
Given the mixed results, it's hard to draw any definitive conclusions. Besides, running into a Wile E. Coyote-style wall isn't exactly something Tesla drivers encounter on a day-to-day basis.
But there are still potentially hundreds of thousands of vehicles on the road outfitted with the now-outdated HW3 computer, which may suffer significantly from Tesla's decision to rely entirely on camera sensors.
Musk has previously promised to provide a free upgrade to HW4 for those customers — but whether he'll hold himself to that remains to be seen, given his reputation.
The promise also highlights that Musk himself is worried that the vast majority of Tesla vehicles currently on the road won't be able to actually drive themselves after all.
Paul also didn't replicate Rober's tests of having the Tesla drive through heavy simulated fog and rain — which are arguably far more relevant conditions for drivers who live in the real world, rather than a Looney Tunes universe.
And the Tesla did plow through a mannequin of a child in both tests, indicating the EV maker has a lot of work to do before it can fulfill Musk's decade-long promise of a safe, autonomous driving future.
More on Tesla: Wait, Why Is the Cybertruck Held Together With Glue?