
In a first, a Tesla driver is facing felony charges for a fatal Autopilot crash

Tesla's Autopilot system has come under fire from federal regulators over safety concerns. Tang Ke/Costfoto/Barcroft Media via Getty Images

  • A Tesla driver was charged with manslaughter following a crash involving Autopilot, the AP reports.
  • It appears to be the first time a fatal Autopilot crash has led to felony charges for the driver. 
  • Tesla's driver-assistance features are under intense scrutiny by federal safety regulators. 

A Tesla driver in California was charged with two counts of vehicular manslaughter after crashing into another car and killing two people while his car was on Autopilot, the Associated Press first reported on Tuesday.

It appears to be the first time a driver using semi-automated driving technology has been charged with a felony in relation to a deadly crash, the AP said. State prosecutors filed charges against the driver, Kevin George Aziz Riad, in October, court records show, though detailed documents were only recently released.

The deadly events unfolded in a Los Angeles suburb in December 2019, when Riad's Model S sedan left the freeway, ran a red light, and crashed into a Honda Civic, the AP said, citing police reports. Two of the Civic's occupants were killed, and their families are suing Tesla and Riad separately from the criminal charges, the AP reported.

Riad has pleaded not guilty and is currently free on bail. A preliminary hearing is scheduled for February 23, court documents show. According to the AP, prosecutors did not mention Autopilot, but the National Highway Traffic Safety Administration confirmed the technology was engaged during the crash. 


The charges come amid intense scrutiny of Autopilot, Tesla's advanced driver-assistance system that uses an array of cameras to maintain a vehicle's speed, follow curves in the road, and keep a set distance from the car ahead.

Autopilot doesn't make vehicles autonomous — no commercially available technology does — and critics say it's too easy to abuse by not paying attention to the road. Safety advocates have also said Tesla's branding of Autopilot and a more advanced feature, Full Self-Driving, is misleading and overstates their capabilities. 

NHTSA has long investigated Autopilot's shortcomings. It's currently looking into a dozen crashes where Teslas with driver-assistance features switched on barreled into stopped emergency vehicles. The family of a 15-year-old boy who was killed when a Tesla crashed into his family's pickup truck is suing the electric automaker, alleging that Autopilot was partially to blame. 

Tesla did not respond to requests for comment from Insider or the AP. The company says that drivers need to pay full attention when Autopilot is engaged and maintains that the technology makes roads safer.


Do you have a story to share about your experience with Tesla Autopilot or other driver-assistance technology? Contact this reporter at tlevin@businessinsider.com.
