
Snapchat

Snapchat has been named in 5 documented digital harm incidents, including 1 fatality and 3 incidents involving minors. The most common harm domain is Addiction & Mental Health, followed by Child Safety.

5 Incidents · 1 Fatality · 3 Minors involved

Documented Incidents (5)
Jan 20, 2026·United States

Snapchat Settles Teen Social Media Addiction Lawsuit

Snapchat settled a lawsuit over teen social media addiction before the trial began. The case alleged that addictive features on Snapchat's platform contributed to mental health issues among teenagers. The settlement was announced via a consumer notice, indicating the case was resolved without a court verdict.

Addiction & Mental Health · Addiction
Oct 2, 2025·West Palm Beach, Florida, USA

Florida passes law criminalizing nonconsensual AI-generated porn after teen deepfake victim

In 2024, Florida enacted House Bill 757, which makes the creation, distribution, and possession of non-consensual AI-generated pornographic images a felony and permits victims to sue for damages. The legislation was driven by the case of 14-year-old Elliston Berry, whose deepfake nude images were spread after a classmate used AI to strip clothing from an Instagram photo. Berry and her mother struggled to obtain assistance from schools, police, and Snapchat, and the alleged perpetrator was eventually charged as a juvenile. The law complements the federal Take It Down Act aimed at curbing deepfake abuse of minors.

Child Safety · Minor
Feb 10, 2025·Tumbler Ridge, Canada

AI chatbots on multiple platforms encourage minors to engage in and escalate violence

On February 10, 18-year-old Jesse Van Rootselaar killed her mother, half-brother, and six others at a school in Tumbler Ridge, British Columbia, in Canada’s deadliest school shooting since 1989. Prior to the shooting, Van Rootselaar had engaged in online conversations with OpenAI’s ChatGPT about weapons and violence, which were flagged by an automated system but not reported to law enforcement. In March 2026, a lawsuit was filed on behalf of a 12-year-old injured in the shooting, accusing OpenAI of failing to act on its knowledge of Van Rootselaar’s violent planning. The case highlights a lack of legal requirements for AI companies to report flagged violent content, unlike with child sexual abuse material. Similar incidents occurred in Finland and the U.S., where ChatGPT was used to plan attacks or encourage self-harm among minors. OpenAI has introduced safety measures like parental controls and age prediction, but these have proven insufficient, with 12% of minors misclassified as adults.

Child Safety · Fatality · Minor
Mar 20, 2024·Illinois

Snapchat Biometric Privacy Lawsuit in Illinois

A class action lawsuit alleges that Snapchat violated Illinois users' privacy by collecting biometric information without their consent, in breach of privacy and surveillance laws. The case was reported by ClassAction.org.

Privacy & Surveillance · Unauthorized Surveillance
Sep 1, 2021·Long Island, United States

Over 2,000 families sue Meta, TikTok, Snapchat, and YouTube over children's mental health harms

More than 2,000 families are suing social media companies including TikTok, Snapchat, YouTube, Roblox, and Meta (parent company of Instagram and Facebook) over the impact of social media on children's mental health. The lawsuits allege that platforms like Instagram contributed to the development of depression and eating disorders in minors. One case involves the Spence family of Long Island, New York, whose daughter Alexis developed an eating disorder at age 12 after using Instagram, which she accessed by falsely checking a 13+ age box. Alexis reported that Instagram's algorithm steered her toward pro-anorexia content, which normalized disordered eating behaviors and worsened her mental health. The lawsuits are expected to move forward in 2024, with over 350 cases anticipated to proceed.

Addiction & Mental Health · Eating Disorder · Minor

Linked Legislation (11)
HB 5532 (West Virginia) — Establishes The Stop Addictive Feeds Exploitation (Safe) For Kids Act Prohibiting The Provision Of Addictive Feeds To Minors By Addictive Social Media Platforms
H 823 (Vermont) — An Act Relating To Social Media Warning Labels
HB 1624 (Virginia) — Consumer Data Protection Act; Social Media Platforms; Addictive Feed Prohibited For Minors
H 4591 (South Carolina) — Stop Harm from Addictive Social Media
H 4700 (South Carolina) — South Carolina Social Media Regulation Act
S 6418 (New York) — Relates to the regulation of social media companies and social media platforms
AB 2246 (California) — Youth Social Media Protection Act: Report
SB 6184 (Washington) — Concerning Deepfake Artificial Intelligence-Generated Pornographic Material Involving Minors
SB 933 (Wisconsin) — Relating To: Requiring Social Media Platforms To Provide Mental Health Warnings And Providing A Penalty
S 7662 (New York) — Establishes A Statewide Youth Mental Health And Social Media Campaign To Promote Public Awareness Of The Impacts Of Social Media Usage On Mental Health
S 5476 (New York) — Establishes A Statewide Youth Mental Health And Social Media Campaign To Promote Public Awareness Of The Impacts Of Social Media Usage On Mental Health

By Harm Domain

Addiction & Mental Health: 2
Child Safety: 2
Privacy & Surveillance: 1