"Navigate on Autopilot" might be causing Teslas to violate state laws.

Not Quite Human

In a biting new report, product testing nonprofit Consumer Reports found that the latest version of Tesla's Navigate on Autopilot feature is "far less competent" than a human driver, and that its automatic lane-changing "doesn't work very well and could create potential safety risks for drivers."

"It’s incredibly nearsighted," said Jake Fisher, Consumer Reports’ senior director of auto testing, in the report. "It doesn’t appear to react to brake lights or turn signals, it can’t anticipate what other drivers will do, and as a result, you constantly have to be one step ahead of it."

Lane Change

Tesla's marketing strategy for its cars' driver-assistance technology walks a delicate line. On the one hand, CEO Elon Musk has promised that a comprehensive self-driving mode is right around the corner; on the other, Tesla has spent years reminding its drivers that "Autopilot" doesn't mean the same thing as "autonomous."

But Consumer Reports was unimpressed by how the feature performed in practice, finding that it cut off other cars, didn't leave enough space, and "even passed other cars in ways that violate state laws."

READ MORE: Tesla's Navigate on Autopilot Requires Significant Driver Intervention [Consumer Reports]

More on Autopilot: Drunk Tesla Driver Relies on Autopilot, Gets Busted by Police
