Waymo
Waymo has been named in 6 documented digital harm incidents, including 3 fatalities and 3 involving minors. The most common harm domain is Autonomous Systems, followed by Child Safety.
Documented Incidents
Waymo robotaxi strikes child outside Santa Monica elementary school, federal investigation launched
In Santa Monica, California, a Waymo driverless robotaxi traveling at approximately 35 mph collided with a juvenile who opened the vehicle’s door outside an elementary school. The teen sustained injuries but survived, and emergency responders were called to the scene. The crash triggered a federal investigation into Waymo’s safety practices, adding to ongoing scrutiny such as the NTSB probe into the company’s behavior around school buses. Experts warned that higher speeds could have resulted in a far more severe outcome.
Waymo recalls over 3,000 autonomous vehicles after software allowed passing stopped school buses
Waymo, the autonomous‑vehicle unit of Alphabet, announced a recall of 3,067 robotaxis after the National Highway Traffic Safety Administration identified a software defect that caused the cars to drive around stopped school buses, ignoring flashing red lights and extended stop arms. The issue came to light after 20 reported incidents in Austin, Texas, and six similar cases in Atlanta, leading NHTSA to issue a recall notice on November 8, 2025. Waymo deployed a software fix by November 17 to its fifth‑generation automated driving systems, which operate in multiple U.S. cities. The recall highlights safety concerns for driverless ride‑hailing services.
Tesla settles with family of 15‑year‑old Jovani Maldonado over 2019 San Francisco Autopilot crash
Tesla reached a settlement with the family of Jovani Maldonado, a 15‑year‑old who died in a 2019 San Francisco collision involving a Tesla operating on Autopilot. The agreement follows a Florida jury verdict that ordered Tesla to pay $243 million to the family of Nabeil Benavides, a pedestrian killed in a similar Autopilot crash. Both cases have heightened regulatory scrutiny of Tesla’s self‑driving technology, including a lawsuit by the California DMV challenging Tesla’s advertising claims. The settlement terms were not disclosed, but the case underscores mounting legal pressure on Tesla as it advances its Robotaxi ambitions.
Waymo driverless robotaxi involved in first fatal U.S. crash in San Francisco
A Waymo robotaxi stopped at a traffic light was rear‑ended in a multi‑vehicle collision at the intersection of 6th and Harrison Streets in San Francisco, killing a passenger in another vehicle, along with a dog, and injuring seven others. This marks the first fatal incident in the United States involving a fully autonomous vehicle with no human driver present. Authorities, including the San Francisco Police Department and the National Highway Traffic Safety Administration, are investigating the crash, while Waymo maintains the autonomous car was not at fault. The incident highlights safety and regulatory concerns surrounding driverless car deployments.
Waymo robotaxis collide with gates and chains in 16 incidents, triggering NHTSA investigation and recall
Between December 2022 and April 2024, Waymo autonomous vehicles experienced at least seven low-speed collisions with stationary and semi-stationary objects such as gates and chains — obstacles that human drivers would normally avoid. NHTSA opened a preliminary investigation in May 2024. Waymo subsequently reported nine additional collisions with similar barriers, bringing the total to 16. No injuries were reported. The failures were attributed to the fifth-generation Automated Driving System's inability to reliably classify thin or semi-stationary objects under certain conditions. Waymo deployed a software fix in November 2024 and formally issued a recall for 1,212 vehicles in May 2025 after the fix had already been applied. The recall followed two earlier 2024 Waymo software recalls: one in February after two robotaxis struck the same tow truck, and one in June after a vehicle hit a telephone pole in Phoenix.
49-year-old woman struck and killed by Uber self-driving vehicle in Tempe, Arizona
Uber shut down its self-driving operations and autonomous vehicle (AV) research in Arizona in May 2018 following a fatal crash in Tempe on March 18, 2018, in which a pedestrian was struck and killed by an Uber AV. Arizona Governor Doug Ducey ordered an indefinite suspension of Uber’s AV testing in the state after reviewing video footage of the incident, which showed the safety driver not watching the road. The shutdown affected around 200 employees in Arizona, many of them safety drivers. Uber said it would continue AV research in Pittsburgh and San Francisco but would adopt a more limited testing approach once proper permits and safety improvements were in place. The crash and the ensuing suspension were a major setback to Uber’s AV program and to its reputation in the autonomous vehicle industry.