Horrible.

Tragedy on Tarmac

In 2022, a motorcyclist in Utah was killed in a crash with a Tesla Model 3 that was running Autopilot.

Now the parents of the deceased have sued the automaker and the vehicle's driver, Reuters reports, marking yet another legal case that's been mounted against Tesla and its self-driving software.

The plaintiffs contend that Autopilot, a driver assistance system that is not fully autonomous despite what its name implies, along with other safety features equipped on Tesla's cars, is "defective and inadequate."

Both to Blame

The motorcyclist, 34-year-old Landon Embry, was killed after the Model 3 rear-ended his Harley-Davidson at around 75 to 80 miles per hour, launching him off his bike, the lawsuit said. The crash took place around 1 am on an interstate highway.

Autopilot "should have identified the hazard posed by [Embry's] motorcycle in its presence," the complaint states, as quoted by Reuters.

Embry's family also believes that the driver is at fault, claiming the driver was "tired" and "not in a condition to drive as an ordinarily prudent driver."

"A reasonably prudent driver, or adequate auto braking system, would have, and could have slowed or stopped without colliding with the motorcycle," the complaint added.

Collision Course

This isn't the only time that self-driving Teslas have been involved in deadly crashes with motorcycles. In April, a Seattle motorcyclist was killed by a Tesla Model S in a similar rear-end collision. Police recently determined that the car was in Full Self-Driving mode at the time of the incident.

More broadly, the automaker's Autopilot and Full Self-Driving systems have been linked to hundreds of crashes, and at least 17 to 19 deaths.

Controversy over the safety of the systems, as well as over the misleading nature of their branding that could imply the cars are fully autonomous, has made Tesla the subject of numerous probes by regulators.

The National Highway Traffic Safety Administration has repeatedly investigated Autopilot and FSD crashes, including collisions with emergency vehicles. There's also an ongoing probe by the Department of Justice, which is examining whether Tesla committed securities and wire fraud by misrepresenting the driving systems' capabilities.

Where the blame falls in scenarios involving autonomous driving software remains a legally gray area. We can only hope that the family can find some justice.

More on Tesla: Tesla Buyers Having More Regrets After Actually Using Their Cars, Research Finds
