A provocative manslaughter case is about to kick off in Los Angeles later this month, involving a fatal crash caused by a Tesla vehicle that had the company's controversial Autopilot feature turned on.

It's the first case of its kind, and one that could set a precedent for future cases involving crashes and driver-assistance software, Reuters reports.

We won't know the exact defense until the case gets under way, but the crux is this: the man who was behind the wheel of the Tesla is facing manslaughter charges and has pleaded not guilty, setting up potentially novel legal arguments about culpability in a deadly collision when, technically speaking, it wasn't a human driving the car.

"Who's at fault, man or machine?" asked Edward Walters, an adjunct professor at the Georgetown University, in an interview with Reuters. "The state will have a hard time proving the guilt of the human driver because some parts of the task are being handled by Tesla."

The upcoming trial concerns a fatal collision that took place in 2019, when Kevin George Aziz Riad ran a red light in his Tesla Model S and collided with a Honda Civic, killing a couple who were reportedly on their first date.

According to vehicle data, Riad did not apply the brakes but had a hand on the steering wheel. Perhaps most critically, though, the Tesla's Autopilot feature was turned on in the moments leading up to the crash.

Riad is facing manslaughter charges, with prosecutors arguing his actions were reckless.

Meanwhile, Riad's lawyers have argued that he shouldn't be charged with a crime, but have so far stopped short of publicly placing blame on Tesla's Autopilot software.

Tesla is not directly implicated in the upcoming trial and isn't facing charges in the case, according to Reuters.

A separate trial involving the family of one of the deceased is already scheduled for next year, however, and this time Tesla is the defendant.

"I can't say that the driver was not at fault, but the Tesla system, Autopilot, and Tesla spokespeople encourage drivers to be less attentive," the family's attorney Donald Slavik told Reuters.

"Tesla knows people are going to use Autopilot and use it in dangerous situations," he added.

Tesla is already under heavy scrutiny over its Autopilot and so-called Full Self-Driving software, even as the company concedes that the features "do not make the vehicle autonomous" and that drivers must remain attentive to the road at all times.

Critics argue that Tesla's marketing is misleading and that it leads to more accidents, not the safer roads that Tesla CEO Elon Musk has long claimed the software delivers.

In fact, a recent survey found that 42 percent of Tesla Autopilot drivers said they feel "comfortable treating their vehicles as fully self-driving."

Regulators are certainly already paying attention. The news comes a week after Reuters revealed that the Department of Justice is investigating Tesla over Autopilot.

Last year, the National Highway Traffic Safety Administration (NHTSA) announced an investigation into accidents in which Teslas have smashed into emergency response vehicles that were stopped with flashing lights or flares.

This month's trial certainly stands a chance of setting a precedent. Was Riad fully at fault, or was Tesla's Autopilot at least partially to blame as well?

The answer now lies in the hands of a jury.

READ MORE: Tesla crash trial in California hinges on question of 'man vs machine' [Reuters]

More on Autopilot: Survey: 42% of Tesla Autopilot Drivers Think Their Cars Can Drive Themselves

