Another sign that autonomous vehicles can't compare to human cognition.

Edge Case

A Tesla owner named Andy Weedman recently found himself confused by a strange glitch in his Model 3's Autopilot system: The car kept slamming on the brakes in the middle of the same stretch of road.

Eventually, he figured it out: His car was registering a giant stop sign printed on a nearby billboard as a real traffic sign, and therefore deciding that the right course of action was to come to a halt in the middle of the road. It's a comical glitch, to be sure, but the edge case also illustrates the myriad ways that self-driving car software continues to make mistakes — sometimes with deadly results.

Global Chaos

Weedman posted a follow-up video in which his Tesla, with Autopilot enabled, spots the sign and drops from 35 miles per hour to a full stop in the middle of the road. Weedman remains optimistic that Tesla will sort the problem out, saying it will just take more training to handle edge cases like this before self-driving technology is truly ready.

But while each individual edge case may be a unique, small-scale glitch, the totality of unusual scenarios a car might encounter represents a whole world of chaos that's currently too complex for even the most sophisticated algorithms to untangle, Jalopnik argues.

Defensive Driving

Maybe there's a way to train Teslas to ignore signs on billboards, but that introduces the risk of false negatives, in which the system fails to register real traffic signs, putting people in danger, according to Jalopnik's analysis.
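To make that tradeoff concrete, here's a minimal, purely hypothetical sketch — not Tesla's actual Autopilot logic; the Detection class, confidence scores, and threshold values are all invented for illustration. It shows how raising a detector's confidence threshold can suppress false positives like the billboard sign while starting to miss real signs:

```python
# Illustrative sketch only: a made-up confidence-threshold tradeoff for a
# stop-sign detector. All names and numbers are hypothetical.

from dataclasses import dataclass

@dataclass
class Detection:
    confidence: float   # detector's confidence that this is a real stop sign
    is_real_sign: bool  # ground truth: real roadside sign vs. billboard image

def evaluate(detections, threshold):
    """Count false positives (braking for billboards) and false negatives
    (ignoring real signs) at a given confidence threshold."""
    false_positives = sum(
        1 for d in detections if d.confidence >= threshold and not d.is_real_sign
    )
    false_negatives = sum(
        1 for d in detections if d.confidence < threshold and d.is_real_sign
    )
    return false_positives, false_negatives

# Toy data: a stop sign printed on a billboard scores nearly as high as real ones.
samples = [
    Detection(0.95, True),   # real stop sign
    Detection(0.90, False),  # stop sign printed on a billboard
    Detection(0.70, True),   # real sign, partially occluded
]

for threshold in (0.6, 0.8, 0.92):
    fp, fn = evaluate(samples, threshold)
    print(f"threshold={threshold}: false positives={fp}, false negatives={fn}")
```

In this toy example, a low threshold brakes for the billboard (a false positive), while a threshold high enough to ignore the billboard also starts missing the partially occluded real sign (a false negative) — which is exactly the dilemma Jalopnik describes.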

The fact of the matter is that there are tons of individual oddities out there capable of tripping up the self-driving vehicles we have today, and it will likely take more than patchwork updates and fixes to actually reach full autonomy.

READ MORE: This Billboard That Confuses Tesla Autopilot Is A Good Reminder Of Why Self-Driving Is Still A Long Way Off [Jalopnik]

More on Tesla: Two Die in Fiery Tesla Wreck, Seemingly in Self-Driving Mode

