YouTube
YouTube has been named in 10 documented digital harm incidents, including 2 fatalities and 5 involving minors. The most common harm domain is Addiction & Mental Health, followed by Fraud & Financial.
Documented Incidents
20-year-old woman awarded $4.2 million after Meta and YouTube found liable for mental health harm via addictive platform design
On March 25, a jury in Los Angeles, California, ruled that Meta and YouTube were liable for negligence in a case involving youth addiction and mental health. The plaintiff, a now 20-year-old woman known as Kaley G.M., claimed she became addicted to Instagram and YouTube during grade school, which contributed to her anxiety and depression. Meta was ordered to pay $4.2 million in damages, and YouTube was ordered to pay $1.8 million. The case is significant because it challenges Section 230 of the Communications Decency Act, which has previously shielded social media companies from liability. The ruling sets a legal precedent by suggesting that social media platforms can be held responsible for personal injury caused by their product design. Meta has stated it is considering an appeal.
AI-generated covers of songs uploaded to Spotify profile — Spotify, YouTube
Independent folk musician Murphy Campbell discovered that AI-generated covers of her songs had been uploaded to her Spotify profile without her permission. The AI voice clones were created using recordings scraped from her YouTube channel and uploaded under her name, constituting copyright fraud and unauthorized use of her likeness.
Gautam Gambhir files lawsuit seeking ₹2.5 crore after deepfake used to impersonate him
India's cricket head coach Gautam Gambhir filed a civil suit in the Delhi High Court in late 2025, seeking ₹2.5 crore in damages for the unauthorized use of his name, image, and voice in deepfake content. The case involves 16 defendants, including social media accounts, e-commerce platforms like Amazon and Flipkart, and tech companies such as Meta, Google, and YouTube. Gambhir's legal team claims that fabricated videos, including one falsely showing his resignation, have circulated widely on social media and been used for financial gain. The case is being heard under the Copyright Act, 1957, the Trade Marks Act, 1999, and the Commercial Courts Act, 2015, and seeks immediate removal of the content and a permanent injunction against future misuse. Legal experts suggest the case could set a precedent for protecting digital personality rights in India amid rising concerns over AI-driven fraud and misinformation.
Los Angeles jury deliberates whether Meta and YouTube are liable for social media addiction harming Kaley G.M.
A jury in a landmark social media addiction trial in Los Angeles is deliberating whether Meta or YouTube is liable for the mental health issues of a 20-year-old woman, identified as Kaley G.M., who claims the platforms contributed to her depression and suicidal thoughts as a child. The trial, which began in March 2024, has raised questions about whether the platforms were negligently designed and whether they should have warned users about potential harm. Kaley testified that she became addicted to YouTube and Instagram starting at age six, though she also described family-related trauma. The case could set a precedent for thousands of similar lawsuits, as it challenges the legal protection provided by Section 230 of the US Communications Decency Act. The jury is considering whether Meta or YouTube was a "substantial factor" in causing Kaley's mental health struggles and how much should be awarded in damages. The trial highlights growing concerns about the impact of social media on vulnerable young users and the responsibility of tech companies for harmful content and design.
AI-generated child sexual abuse material overwhelms law enforcement in Indiana
Law enforcement agencies in Indiana are struggling to manage a surge in AI-generated child sexual abuse material (CSAM). Cases include a Fishers pastor's son accused of creating AI-generated photos of nude pregnant toddlers, an Elwood school custodian altering a student's Instagram photo, and a 71-year-old Evansville man convicted of using AI to generate explicit images of children under 12. Reports of AI-fueled CSAM increased from 4,700 in 2023 to over 1 million in the first nine months of 2025, according to the National Center for Missing and Exploited Children. These reports are sent to Indiana State Police’s Internet Crimes Against Children Task Force for investigation. Prosecutors and law enforcement warn that the growing volume of AI-generated content is overwhelming already overburdened forensic teams and that additional funding and resources are needed to address the crisis.
Over 2,000 families sue Meta, TikTok, Snapchat, and YouTube over children's mental health harms
More than 2,000 families are suing social media companies including TikTok, Snapchat, YouTube, Roblox, and Meta (parent company of Instagram and Facebook) over the impact of social media on children's mental health. The lawsuits allege that platforms like Instagram contributed to the development of depression and eating disorders in minors. One case involves the Spence family from Long Island, New York, whose daughter Alexis developed an eating disorder at age 12 after using Instagram, which she accessed by falsely checking a 13+ age box. Alexis reported that Instagram's algorithm led her to pro-anorexia content, which normalized disordered eating behaviors and worsened her mental health. The lawsuits are expected to move forward in 2024, with over 350 cases anticipated to proceed.
Woman whose son died from drugs bought on social media celebrates verdicts against Meta and YouTube
A Colorado woman, Kimberly Osterman, celebrated recent verdicts against Meta and YouTube, which were found liable for harms to children due to platform design. Her son, Max Osterman, died in 2021 at age 18 after purchasing a fentanyl-laced pill through Snapchat. In Los Angeles, a jury ruled that Meta and YouTube designed their platforms to hook young users, and in New Mexico, Meta was found to have knowingly harmed children’s mental health and concealed information about child sexual exploitation. Snap Inc., the parent company of Snapchat, and TikTok settled before the Los Angeles trial began. Osterman is part of Parents for Safe Online Spaces, advocating for the Kids Online Safety Act, which would require social media platforms to take steps to prevent harm to minors. The drug dealer who sold Max the pill was sentenced to six years in prison in 2023.
18-year-old girl dies by suicide after using Meta and YouTube platforms
In 2020, an 18-year-old named Annalee Schott took her own life, which her family attributed in part to the negative effects of social media. The Schott family has since blamed platforms like Meta and YouTube for harming children's mental health through addictive design. The article raises the question of whether legal or regulatory actions against these companies could mark a turning point for Big Tech, similar to the tobacco industry's past reckoning. The focus is on potential consequences for tech companies if they are held accountable for youth harm.
Caleb Cain's Radicalization via YouTube's Algorithm
A 26-year-old man from West Virginia, Caleb Cain, was radicalized by far-right content on YouTube over several years. He described how the platform's recommendation algorithm exposed him to extremist ideologies, including white supremacy and anti-feminism. The incident highlights concerns about algorithmic amplification of harmful content on YouTube.
Russia's Internet Research Agency targets U.S. with social media disinformation during 2016 election
The Senate Intelligence Committee revealed that Russia's Internet Research Agency used social media platforms including Facebook, Instagram, and Twitter to target African Americans and spread disinformation aimed at sowing racial discord during the 2016 U.S. election. The agency's content was heavily focused on race-related themes. This incident highlights foreign interference through digital platforms during a critical U.S. political event.