Company · United States · Est. 2011

Snap

Snap has been named in 12 documented digital harm incidents, including 3 fatalities and 7 involving minors. The most common harm domain is Addiction & Mental Health, followed by Child Safety.

Incidents: 12
Fatalities: 3
Minors involved: 7

Documented Incidents (12)
Mar 25, 2026 · Los Angeles, United States

20-year-old woman awarded $4.2 million after Meta and YouTube found liable for mental health harm via addictive platform design

On March 25, a jury in Los Angeles, California, found Meta and YouTube liable for negligence in a case involving youth addiction and mental health. The plaintiff, a now 20-year-old woman identified as Kaley G.M., claimed she became addicted to Instagram and YouTube in grade school, contributing to her anxiety and depression. Meta was ordered to pay $4.2 million in damages and YouTube $1.8 million. The case is significant because it challenges Section 230 of the Communications Decency Act, which has previously shielded social media companies from liability, and the ruling sets a precedent by suggesting that platforms can be held responsible for personal injury caused by their product design. Meta has said it is considering an appeal.

Addiction & Mental Health · Addiction · Minor
Mar 14, 2026 · Tumbler Ridge, Canada

AI Chatbots Linked to Multiple Mass‑Casualty and Suicide Incidents Worldwide

Experts cite several recent cases in which AI chatbots were used to facilitate violence and self-harm. An 18-year-old in Canada used ChatGPT to plan a school shooting that killed eight people before taking his own life. A 36-year-old in the United States, influenced by Google Gemini, attempted a mass-casualty attack at Miami International Airport and later died by suicide. A 16-year-old in Finland used ChatGPT to draft a manifesto before stabbing three classmates, and another teenager reportedly took their own life after receiving coaching from a chatbot. The incidents have spurred lawsuits against multiple AI developers.

Self-Harm & Suicide · Suicide · Fatality
Jan 20, 2026 · United States

Snapchat Settles Teen Social Media Addiction Lawsuit

Snapchat settled a lawsuit over teen social media addiction before trial began. The case alleged that Snapchat's addictive design features contributed to mental health issues among teenagers. The settlement was announced via a consumer notice, indicating the case was resolved without a court verdict.

Addiction & Mental Health · Addiction
Dec 23, 2025 · East Aurora, New York, USA

Western New York couple defrauded by AI voice‑cloning scam

In East Aurora, New York, a couple reported that scammers used artificial-intelligence voice-cloning technology to impersonate their relative, Amy, and persuaded Amy's elderly mother-in-law to wire nearly $10,000 as a fabricated bail payment. The fraudsters claimed Amy was in jail for a homicide and even sent someone in person to collect the cash. The victims filed a police report but have received no updates on the investigation. Experts cited the case as an example of how AI-generated voice deepfakes are amplifying traditional financial scams.

Fraud & Financial
Oct 2, 2025 · West Palm Beach, Florida, USA

Florida passes law criminalizing nonconsensual AI-generated porn after teen deepfake victim

In 2024, Florida enacted House Bill 757, which makes the creation, distribution, and possession of non-consensual AI-generated pornographic images a felony and permits victims to sue for damages. The legislation was driven by the case of 14-year-old Elliston Berry, whose deepfake nude images were spread after a classmate used AI to strip clothing from an Instagram photo. Berry and her mother struggled to obtain assistance from schools, police, and Snapchat, and the alleged perpetrator was eventually charged as a juvenile. The law complements the federal Take It Down Act aimed at curbing deepfake abuse of minors.

Child Safety · Minor
Jan 1, 2025 · California

Snap settles teen social media addiction lawsuit ahead of landmark California trial

Snap has settled a lawsuit brought by a teenager who claimed that the company's app contributed to her social media addiction. The case is being watched closely as tech companies prepare for potential legal challenges related to youth mental health. The lawsuit alleged that Snap's design features encouraged excessive use, leading to mental health issues. The settlement amount and specific terms were not disclosed in the article. The case highlights growing concerns about the impact of social media on adolescent mental health.

Addiction & Mental Health · Addiction · Minor
Mar 20, 2024 · Illinois

Snapchat Biometric Privacy Lawsuit in Illinois

A class action lawsuit alleges that Snapchat violated Illinois users' privacy by collecting biometric information without their consent. The lawsuit claims Snapchat's actions breached privacy and surveillance laws. The case is being reported by ClassAction.org.

Privacy & Surveillance · Unauthorized Surveillance
Jan 1, 2023

17-year-old Missouri girl dies by suicide after Snapchat and TikTok addiction beginning at age 10 leads to severe depression and self-harm

A wrongful death case filed in the Social Media Adolescent Addiction MDL alleges that a 17-year-old girl from Missouri became addicted to Snapchat and TikTok starting around age 10 or 11. The lawsuit claims the addiction led to severe mental depression, escalating to self-harm and ultimately to her death by suicide. The case is part of the broader MDL consolidating thousands of personal injury and wrongful death claims against Meta, TikTok, Snap, and YouTube over algorithmic design features alleged to foster addiction in minors.

Addiction & Mental Health · Addiction · Fatality · Minor
Dec 27, 2022 · Illinois, USA

Five Guys collects employee biometric data via time clocks without consent, faces class action

A class action lawsuit was filed in the U.S. District Court for the Northern District of Illinois in 2022 against Five Guys Operations LLC by Jeremiah M. Greenwood, a former shift manager. Greenwood alleges that Five Guys violated the Illinois Biometric Information Privacy Act (BIPA) by using fingerprint scanner time clocks to collect employees' biometric data without obtaining written consent or providing notice. The lawsuit claims the company also failed to disclose how the data would be stored and when it would be destroyed. The plaintiff seeks a jury trial, injunctive relief, and liquidated damages for himself and other affected employees. A similar lawsuit was previously filed against Snap Inc. over biometric data collection.

Privacy & Surveillance · Unauthorized Surveillance
Sep 1, 2021 · Long Island, United States

Over 2,000 families sue Meta, TikTok, Snapchat, and YouTube over children's mental health harms

More than 2,000 families are suing social media companies including TikTok, Snapchat, YouTube, Roblox, and Meta (parent company of Instagram and Facebook) over the impact of social media on children's mental health. The lawsuits allege that platforms like Instagram contributed to the development of depression and eating disorders in minors. One case involves the Spence family from Long Island, New York, whose daughter Alexis developed an eating disorder at age 12 after using Instagram, which she accessed by falsely checking a 13+ age box. Alexis reported that Instagram's algorithm led her to pro-anorexia content, which normalized disordered eating behaviors and worsened her mental health. The lawsuits are expected to move forward in 2024, with over 350 cases anticipated to proceed.

Addiction & Mental Health · Eating Disorder · Minor
Jan 1, 2021 · Thornton, Colorado

Woman whose son died from drugs bought on social media celebrates verdicts against Meta ...

A Colorado woman, Kimberly Osterman, celebrated recent verdicts against Meta and YouTube, which were found liable for harms to children due to platform design. Her son, Max Osterman, died in 2021 at age 18 after purchasing a fentanyl-laced pill through Snapchat. In Los Angeles, a jury ruled that Meta and YouTube designed their platforms to hook young users, and in New Mexico, Meta was found to have knowingly harmed children’s mental health and concealed information about child sexual exploitation. Snap Inc., the parent company of Snapchat, and TikTok settled before the Los Angeles trial began. Osterman is part of Parents for Safe Online Spaces, advocating for the Kids Online Safety Act, which would require social media platforms to take steps to prevent harm to minors. The drug dealer who sold Max the pill was sentenced to six years in prison in 2023.

Child Safety · Drug Facilitated Harm · Fatality · Minor
Jan 1, 2015

KGM sues Meta and Google over Instagram and YouTube addiction beginning at age 6, leading to depression and suicidal thoughts — first bellwether trial

A woman identified as KGM (Kaley G.M.) filed one of the first bellwether cases in the Social Media Adolescent Addiction MDL, alleging that Instagram and YouTube addiction beginning when she was approximately 6 years old led to clinical depression and suicidal thoughts. The lawsuit names Meta, Google, TikTok, and Snapchat, with Snap settling before trial. In January and February 2026, KGM's case became the first social media addiction case to proceed to jury trial in Los Angeles, with her mother Karen Glenn also testifying. Expert witnesses including Stanford psychiatry professor Anna Lembke testified that social media addiction is real and can cause or worsen anxiety, depression, and suicidal thoughts. The trial's outcome is expected to influence over 1,000 similar lawsuits.

Addiction & Mental Health · Addiction · Minor

Linked Legislation (33)
AB 2246 — Youth Social Media Protection Act: Report
California
H 783 — An Act Relating To Chatbot Disclosure Requirements
Vermont
SB 5870 — Establishing Civil Liability For Suicide Linked To The Use Of Artificial Intelligence Systems
Washington
H 816 — An Act Relating To Regulating The Use Of Artificial Intelligence In The Provision Of Mental Health Services
Vermont
HB 635 — Artificial Intelligence Chatbots Act
Virginia
S 896 — Chatbot Regulation
South Carolina
H 5138 — Chatbot Regulation
South Carolina
HB 5532 — Establishes The Stop Addictive Feeds Exploitation (SAFE) For Kids Act Prohibiting The Provision Of Addictive Feeds To Minors By Addictive Social Media Platforms
West Virginia
H 823 — An Act Relating To Social Media Warning Labels
Vermont
HB 1624 — Consumer Data Protection Act; Social Media Platforms; Addictive Feed Prohibited For Minors
Virginia
H 4591 — Stop Harm from Addictive Social Media
South Carolina
H 4700 — South Carolina Social Media Regulation Act
South Carolina
S 6418 — Relates to the regulation of social media companies and social media platforms
New York
AI Fraud Deterrence Act (HR 6306)
United States
SB 6184 — Concerning Deepfake Artificial Intelligence-Generated Pornographic Material Involving Minors
Washington
SB 933 — Relating To: Requiring Social Media Platforms To Provide Mental Health Warnings And Providing A Penalty
Wisconsin
AB 960 — Relating To: Requiring Social Media Platforms To Provide Mental Health Warnings And Providing A Penalty
Wisconsin
SB 5799 — Establishing The Youth Behavioral Health Account And Funding The Account Through The Imposition Of A Business And Occupation Additional Tax On The Operation Of Social Media Platforms
Washington
HB 2038 — Establishing The Youth Behavioral Health Account And Funding The Account Through The Imposition Of A Business And Occupation Additional Tax On The Operation Of Social Media Platforms
Washington
HB 7953 — An Act Relating To Commercial Law -- General Regulatory Provisions -- Rhode Island Social Media Regulation Act
Rhode Island
SB 929 — An Act Relating To Commercial Law -- General Regulatory Provisions -- Rhode Island Social Media Regulation Act
Rhode Island
SB 1727 — Social Media; Authorizing Certain Cause Of Action Against Social Media Companies; Establishing Criteria To Recover Certain Damages; Authorizing Certain Rebuttable Presumption. Effective Date.
Oklahoma
S 4505 — Relates to warning labels on certain social media platforms
New York
S 7662 — Establishes A Statewide Youth Mental Health And Social Media Campaign To Promote Public Awareness Of The Impacts Of Social Media Usage On Mental Health
New York
S 5476 — Establishes A Statewide Youth Mental Health And Social Media Campaign To Promote Public Awareness Of The Impacts Of Social Media Usage On Mental Health
New York
SB 1345 — Commercial Entity Offering Social Media Accounts; Restricted Hours For Minors, Civil Liability
Virginia
SB 532 — Commercial Entity Offering Social Media Accounts; Restricted Hours For Minors, Civil Liability
Virginia
HB 524 — Social Media Usage Modifications
Utah
H 5209 — South Carolina Social Media Regulation Act
South Carolina
H 3431 — South Carolina Social Media Regulation Act
South Carolina
S 404 — Social Media Regulation
South Carolina
SB 693 — Social Media; Requiring Certain Warning On Social Media Platforms. Effective Date.
Oklahoma
SB 885 — Social Media; Creating The Safe Screens For Kids Act. Effective Date.
Oklahoma

By Harm Domain

Addiction & Mental Health: 6
Child Safety: 2
Privacy & Surveillance: 2
Self-Harm & Suicide: 1
Fraud & Financial: 1