This is terrible.
A Tesla owner bragged about using his vehicle's Full Self-Driving feature to drive home while drunk — a worrying confession that should give anybody pause.
"I was probably drunk," he added. "But with FSD, it drove me home, I mean, flawlessly."
To be abundantly clear, intentionally driving under the influence — with or without FSD — is not only illegal but astonishingly irresponsible, effectively gambling with the safety of others on the road.
Tesla made an unfinished beta version of its FSD software available to the broader public in November, despite plenty of well-documented shortcomings, putting other drivers who share the road at risk.
Despite its misleading name, the Full Self-Driving feature does not in fact allow a Tesla to fully drive itself. It is instead "designed to provide more active guidance and assisted driving under your active supervision," as the company concedes on its site.
But that hasn't stopped a number of owners from abusing the system. There have been countless instances of drivers finding simple hacks to trick the feature into thinking the driver is paying attention when they're not.
Others have been caught fully asleep behind the wheel. Just last week, German police chased down a reportedly sleeping Tesla driver on the highway for 15 excruciating minutes.
Intentionally getting behind the wheel while intoxicated is yet another reason why Tesla should get ahead of a massive problem it has created.
According to the National Highway Traffic Safety Administration — itself actively investigating dozens of collisions involving Tesla's Autopilot suite — almost a third of all traffic crash fatalities in the US involve drunk drivers.
It's an egregious example of a Tesla owner vastly overestimating their vehicle's capabilities after buying into the company's misleading marketing — and a perfect illustration of why deploying such a system under the assumption that users will act responsibly is a terrible idea.