For years, Tesla CEO Elon Musk has refused to use radar and lidar sensors on his self-driving cars, dismissing them as a "crutch." He insists it's not just about cost: he's adamant that relying solely on cameras to let Tesla's autonomous driving software see the world around it is actually safer than the alternative.
It's a topic Musk has been revisiting lately. Last week, he attacked his rivals like Waymo for relying on lidar and radar along with vision — even though the Google-backed company's cars are currently considered to be far more autonomous than Tesla's. He claimed that "Waymos can't drive on highways," which isn't true: the company has been testing the robotaxis on highways in Phoenix, Arizona, since 2024 with a human employee behind the wheel, but isn't taking paying customers yet.
Anyway, this is supposed to support Musk's argument that lidar and radar "reduce safety due to sensor contention."
"If lidars/radars disagree with cameras," Musk asked on Twitter, "which one wins?"
"This sensor ambiguity causes increased, not decreased, risk," he concluded. "That's why Waymos can't drive on highways. We turned off radars in Teslas to increase safety."
We'll unpack those arguments in a moment. But first, consider this reporting from Electrek, which reveals private messages in which Musk clearly contradicts the arguments he makes in public. Logically, that means he's fibbing either in public or in private.
The messages date back to May 2021, when Tesla decided to stop using radar sensors in its cars after depending on them for over half a decade. At the time, Musk argued in a Twitter conversation with Electrek that radar was making its cars less safe, but then turned around and conceded that a good enough radar would actually be better than cameras alone.
"A very high resolution radar would be better than pure vision, but such a radar does not exist," Musk wrote at the time. (He then clarified that he meant high-resolution radar used together with cameras would be better.)
According to Amir Husain, an AI entrepreneur and advisory council member at UT Austin's department of computer science, this is a load of hogwash.
"The issue isn’t a binary disagreement between two sensors. It generates a better estimate than any individual sensor can produce on its own," he wrote in a tweet, flagged by Electrek. "If Musk’s argument held, why would the human brain use eyes, ears, and touch to estimate object location? Why would aircraft combine radar, IRST, and other passive sensors to estimate object location?"
"This is a fundamental misunderstanding of information theory," he added.
But haven't cameras gotten Tesla this far? Yes, but they've also landed the company in heaps of trouble, as its self-driving software has repeatedly been involved in close calls and hundreds of crashes, some of them deadly. Tesla recently lost a massive lawsuit that found it partially responsible for the death of a young woman after a Tesla running Autopilot struck a vehicle she was standing next to. Though the driver was distracted, the Autopilot system blew through an intersection at 60 miles per hour.
The most tragic and clear example of how a camera-only approach can go wrong, however, was when a Tesla in Full Self-Driving mode struck and killed a 71-year-old grandmother. The car blazed down a highway and was completely oblivious to traffic coming to a stop around it because of a crash ahead. It turns out the vehicle's camera was half-blinded by the glare of the sunset, so the software apparently didn't recognize that an old woman had stepped out of her car to help direct traffic.
Had the car been equipped with lidar and radar sensors, things might've turned out differently. But to quote Musk from his recent Waymo attack: "Cameras ftw."
Musk has a history of making outrageous claims about the capabilities of his companies' tech. For starters, he's been promising for over a decade that Tesla will achieve a fully autonomous driving system "next year," and it's still not even close. He also promised that Tesla was ready to launch a driverless robotaxi service this summer. But when it finally rolled out after delays, it transpired that every ride required the presence of a human "safety monitor."
Critics have argued that part of the reason the company is struggling so much to refine its self-driving tech is Musk's refusal to use lidar or radar. And even though most of his competitors have embraced those sensors, he seems to think that a good enough lidar or radar system "does not exist" simply because Tesla couldn't get its own efforts to work.
More on Tesla: Tesla Announces Plans to Give Elon Musk $1 Trillion