Tesla CEO Elon Musk has taken to Twitter to come clean about self-driving tech.
But instead of putting the spotlight on the tech's immense engineering challenges, Musk is pointing the finger at, well, humanity as a whole.
"A major part of real-world AI has to be solved to make unsupervised, generalized full self-driving work, as the entire road system is designed for biological neural nets with optical imagers," the mercurial CEO tweeted on Thursday.
Put in simple terms, he's saying that self-driving cars are hard because they share the road with unpredictable human brains and fallible human eyes.
In some ways, as Gizmodo was quick to point out, that's an incredibly obvious conclusion. Sure, absent us meat bags, self-driving cars would have an easy time staying in lanes, making lane changes, and generally getting along with each other.
But the reality is that for the time being, the roadways are still going to be clogged up with flawed human motorists. In fact, as a fatal crash near Houston this month vividly illustrated, many of Tesla's customers are already under the impression that we're living inside Musk's futuristic vision.
First responders say that the crash, involving a 2019 Tesla Model S, occurred with neither of the two occupants sitting in the driver's seat.
But an investigation led by Tesla's vice president of vehicle engineering, Lars Moravy, reached the opposite conclusion: a reportedly dented steering wheel was evidence that somebody was in fact in the driver's seat.
While we may never learn the truth of what exactly happened leading up to the crash, several other recent incidents demonstrate that the terms "Autopilot" and particularly "Full Self-Driving" (FSD) — an optional $10,000 add-on that enhances the cars' driver-assistance features, but doesn't actually allow them to drive themselves — are giving drivers a dangerously false sense of security.
Several drivers have been caught sleeping at the wheel while the Autopilot system was engaged. One driver even tricked the car into believing somebody was in the driver's seat — something that's still trivially easy, as Consumer Reports confirmed last week — and showboated on social media by making a bed in the rear seat and pretending to sleep there.
Granted, the number of crashes involving Autopilot is small compared to the millions of miles being driven with the system engaged. According to Tesla's latest safety report, the company logged only one crash for every 4.19 million miles driven with the feature turned on.
But that doesn't change the simple fact of the matter: in 2021, Teslas cannot drive themselves. It's a message that hasn't quite hit home for many drivers, as these collisions and stunts demonstrate.
Rather than owning up to Tesla's freewheeling approach to advertising its self-driving tech, Musk is steadfast in his belief that Teslas are better drivers than their occupants.
"Anyone paying attention to the rate of improvement will realize that Tesla Autopilot/FSD is already superhuman for highway driving," the billionaire tweeted, "and swiftly getting there for city streets."
Musk is very aware of the difficulties involved in getting self-driving cars to behave in predictable ways and coexist with human drivers on the road, as he acknowledged this week. And while we have seen leaps in the development of self-driving technologies, we're starting to see what happens when promises outpace reality.
As far as current roadways are concerned, there won't be an overnight reset in which all human drivers are replaced with AIs — even if that's an event we would all benefit from. That means immense care must be taken while we acclimatize to a future where cars do most, but not all, of the driving on our behalf.
While Tesla is arguably a pioneer in the field, it's being led by a CEO who is dreaming of a distant future — but not living in one.
READ MORE: Elon Musk Shares Painfully Obvious Idea About the Difficulty of Self-Driving Cars [Gizmodo]
More on Tesla: Senator Slams Musk for Talking About Deadly Tesla Crash