Tesla
Tesla has been named in 15 documented digital harm incidents, including 7 fatalities and 1 involving a minor. The most common harm domain is Autonomous Systems, followed by Fraud & Financial.
Documented Incidents
NHTSA intensifies probe into Tesla Full Self-Driving after pedestrian fatality in low-visibility conditions
The National Highway Traffic Safety Administration (NHTSA) has intensified its investigation into Tesla's Full Self-Driving (FSD) system following a fatal crash that occurred in low visibility conditions. The incident took place in the United States, though the exact location and date of the crash were not specified in the article. The crash resulted in a fatality, prompting renewed scrutiny of Tesla's autonomous driving technology. NHTSA is examining whether the FSD system functioned as intended during the incident. The probe is part of a broader review of Tesla vehicles equipped with FSD, which has been under investigation for multiple crashes. No specific policy or legislative action was mentioned in the article.
Tesla settles with family of 15‑year‑old Jovani Maldonado over 2019 San Francisco Autopilot crash
Tesla reached a settlement with the family of Jovani Maldonado, a 15‑year‑old who died in a 2019 San Francisco collision involving a Tesla operating on Autopilot. The agreement follows a Florida jury verdict ordering Tesla to pay $243 million to the family of Naibel Benavides Leon, a pedestrian killed in a similar Autopilot crash. Both cases have heightened regulatory scrutiny of Tesla's self‑driving technology, including a lawsuit by the California DMV challenging Tesla's advertising. The settlement terms were not disclosed, but the case underscores mounting legal pressure on Tesla as it advances its Robotaxi ambitions.
Jury orders Tesla to pay $240 million for 2019 Florida Autopilot crash
A Miami federal jury found Tesla partially liable for a 2019 fatal crash in Key Largo, Florida, involving its Autopilot driver‑assist system. The crash killed 22‑year‑old Naibel Benavides Leon and seriously injured her boyfriend, Dillon Angulo, who suffered broken bones and a traumatic brain injury. The jury ordered Tesla to pay more than $240 million in damages, citing the company's alleged withholding of data and video evidence. Legal experts say the verdict could open the floodgates for future lawsuits against autonomous‑vehicle manufacturers.
Former Uber self-driving head nearly killed when Tesla FSD failed to stop for obstacle
Uber’s former head of self-driving, Raffi Krikorian, now Mozilla’s CTO, nearly died while using Tesla’s Full Self-Driving (FSD) system. He described the incident in an article for The Atlantic. Tesla’s FSD system functions well most of the time, according to the company’s data, but Krikorian argues this near-flawless performance can lull human drivers into a false sense of security. The incident has led him to reconsider the relationship between humans and autonomous systems. The article highlights concerns about the risks of relying on systems that are "almost perfect," which may reduce human vigilance.
Xiaomi SU7 autonomous vehicle crash kills three university students in China
On March 29, 2025, a Xiaomi SU7 autonomous vehicle collided with a concrete barrier in China, killing three university students. The vehicle was traveling at approximately 97 km/h, and the emergency braking system failed after the driver attempted to regain manual control. An investigation identified the lack of lidar sensors and ineffective automated braking as key contributors, and Xiaomi accepted responsibility, cooperating with authorities while its market value fell by over $16 billion.
Driver killed in Washington crash while using Tesla self-driving system
A Tesla vehicle involved in a fatal crash in Washington was using its self-driving system at the time of the incident, though the exact location and date were not specified in the article. The crash resulted in a fatality, raising concerns about the safety and reliability of self-driving technology.
Scammers use AI deepfake to steal $25M from engineering firm Arup
In early 2024, scammers employed AI‑generated deepfake video and audio to impersonate the chief financial officer of British engineering firm Arup. The fraudsters convinced an employee to transfer roughly HK$200 million (about US$25 million) to five Hong Kong bank accounts. Hong Kong police identified the scheme after the employee reported the transfers, and Arup notified authorities. Experts warned that generative AI is lowering the barrier for sophisticated financial fraud and urged CFOs to adopt stricter verification controls.
NHTSA opens investigation into Tesla Full Self-Driving after Arizona fatal crash
The U.S. National Highway Traffic Safety Administration launched a probe into Tesla's Full Self‑Driving (FSD) software following four low‑visibility crashes, one of which, on November 27, 2023, in Rimrock, Arizona, killed a 71‑year‑old pedestrian. In that incident a Tesla Model Y operating on FSD collided with a stopped Toyota 4Runner and the pedestrian, with sun glare cited as a contributing factor. The investigation will examine whether the system can reliably detect and respond to reduced roadway visibility such as glare, fog, and dust. The probe adds to mounting regulatory scrutiny of Tesla's camera‑only approach after prior FSD recalls.
Jeffrey Nissen killed by Tesla operating on Autopilot on Washington State Route 522
A Tesla operating with its self-driving system engaged was involved in a fatal crash on Washington State Route 522, according to a report by Transport Topics. The report places the incident in 2023 but does not give an exact date. The crash killed one person, and investigators are expected to determine whether the system was functioning as intended.
Tesla in full self-driving mode abruptly brakes on Bay Bridge causing eight-car pileup injuring nine
Surveillance video revealed that a Tesla in 'full self-driving' mode abruptly braked, causing an eight-car pileup on the San Francisco Bay Bridge on Thanksgiving Day. The crash injured nine people, including a 2-year-old boy. The driver, a 76-year-old lawyer, reported the incident to authorities. Tesla had recently launched a beta version of its full self-driving feature. The National Highway Traffic Safety Administration is investigating the crash along with other incidents involving advanced driver assistance systems.
Tesla Worker Killed in Fiery Crash Involving Full Self-Driving Feature
A Tesla employee was killed in a fiery crash that may be the first fatality involving the company's 'Full Self-Driving' feature. The incident has raised concerns about the safety and reliability of autonomous driving systems, and the victim's affiliation with Tesla has added to the controversy surrounding the technology.
NHTSA opens formal investigation into Tesla Autopilot crashes involving emergency vehicles
The National Highway Traffic Safety Administration (NHTSA) has launched a formal investigation into Tesla vehicles operating in Autopilot mode that have collided with emergency vehicles. The investigation is examining whether Tesla's Autopilot system failed to detect and avoid emergency vehicles in these incidents. No specific date or location is provided in the article, but the focus is on a new regulatory probe into Tesla's autonomous driving system.
Two men killed in driverless Tesla crash in Spring, Texas after vehicle strikes tree and catches fire
Two men died in a Tesla crash in Spring, Texas, where no one was found behind the wheel, according to local police. The 2019 Tesla Model S crashed into a tree and caught fire, with one person in the front passenger seat and another in the rear. Preliminary investigations suggest no driver was present at the time of the crash. The incident has raised questions about Tesla's Autopilot and Full Self-Driving (FSD) systems, which are not fully autonomous. The National Highway Traffic Safety Administration (NHTSA) has launched a special investigation into the crash.
Tesla Driver Dies in Florida Crash After Ignoring Safety Warnings
A Tesla driver in Florida died in a crash after reportedly ignoring at least seven safety warnings issued by the vehicle's driver-assistance system. The driver's failure to respond to the repeated alerts contributed to the fatal crash, raising concerns about overreliance on autonomous features and adherence to safety protocols.
Tesla Driver Killed While Using Autopilot and Watching Harry Potter
A Tesla driver was killed while using the Autopilot feature and was reportedly watching a Harry Potter movie at the time, according to a witness. The incident raises concerns about driver distraction during the use of autonomous systems in vehicles. The location and date of the incident were not specified.