A 42-year-old Tesla driver who at first denied killing a woman in a hit-and-run now claims he can't remember whether he ran her down or not. If he did, he says, he must have been on Autopilot and "checking work emails" at the time.
It's a bizarre defense strategy that highlights the glaring gaps in legal frameworks surrounding driver assistance software, and how these features, despite their considerable limitations, are being invoked to deflect blame.
As Minnesota newspaper the Star Tribune reports, a late January affidavit revealed that the driver's cellphone was near Lake Mille Lacs, north of Minneapolis, in November when a car fatally struck 56-year-old Cathy Ann Donovan, who was walking her dog along a nearby highway, and then fled the scene.
The case against the Tesla driver, who initially denied having hit her, has only been building since then. Police found a windshield wiper near her body, and surveillance footage of a gray 2022 Tesla Model X lined up with his cellphone records. Investigators also found light damage to the front passenger side of the vehicle and have since collected hair samples from three locations on it, per the report.
"I think for sure we've established probable cause," local county sheriff Kyle Burton told the Star Tribune.
It's important to note that charges have not been filed yet. The Tesla driver's attorney David Risk, however, is already resorting to an unusual line of argument.
"My client voluntarily spoke to investigators, and he explained it is probable his car would've been using Tesla's full self-driving capability," he told the newspaper, referring to the EV maker's infamous driver assistance add-on. "He will continue to fully cooperate with this investigation until its completion."
According to court filings, the driver "maintained that he doesn't remember hitting Cathy Donovan with his Tesla, but if he did, he would have been alone in his Tesla driving on 'Autopilot,' not paying attention to the road, while doing things like checking work emails."
Who doesn't remember fatally striking a person on the side of a highway? It's a baffling defense that underscores the strange gray area driver-assist features occupy in the real world.
Despite the Elon Musk-led company's misleading marketing, Tesla vehicles are still far from being able to drive themselves, and drivers must be "ready to take immediate action including braking," as the company points out on its website.
The carmaker is already being investigated by the National Highway Traffic Safety Administration following a series of accidents in which Teslas smashed into emergency response vehicles that were stopped with flashing lights or flares.
The number of known deaths involving Tesla's Autopilot has also surged, with the regulator's June analysis revealing that there have been at least 736 crashes in the US that involved the EV maker's controversial driver assistance feature since 2019, at least 17 of which were fatal.
More on Tesla: Tesla Is Officially the Worst-Performing S&P 500 Stock of the Year