Google’s Scans of Private Photos Led to False Accusations of Child Abuse - Electronic Frontier Foundation
Summary
Google's automated scanning system falsely flagged two fathers for child abuse after misidentifying photos of their children's medical conditions as child sexual abuse material (CSAM). The company reported both parents to authorities without notifying them, triggering police investigations. Even after local police cleared the fathers, Google refused to restore their accounts or return their data. The incident exposes flaws in Google's AI and human review processes and raises concerns about the broader impact of inaccurate CSAM scanning, from harm to innocent users to the risk of further false accusations. Other companies, including Facebook and LinkedIn, have also reported high error rates in their CSAM scanning systems.
Incident Details
Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.
Child sexual abuse material — harms involving the creation, distribution, or possession of such content.