TikTok
TikTok has been named in 22 documented digital harm incidents, including 4 fatalities and 15 involving minors. The most common harm domain is Addiction & Mental Health, followed by Child Safety.
Documented Incidents
Teenagers engage in door-kicking prank as part of TikTok challenge
A TikTok challenge involving teens kicking front doors at homes led to incidents in the Milton neighborhood of Santa Rosa County, Florida. The sheriff's office reported several attempted break-ins over the weekend, with one home sustaining thousands of dollars in damage. Chief Deputy Randy Tifft warned that the prank is dangerous and could lead to violent misunderstandings, as homeowners might mistake the kicks for a break-in. Congressman Jimmy Patronis criticized social media's influence and called for the repeal of Section 230 to hold tech companies accountable for harmful content. In Okaloosa County, four teens previously involved in a similar prank faced criminal charges, with three charged with misdemeanors and one with a felony. Authorities in Santa Rosa County said any teens identified in the recent incidents could face jail time and restitution for damages.
20-year-old woman awarded $4.2 million after Meta and YouTube found liable for mental health harm via addictive platform design
On March 25, a jury in Los Angeles, California, ruled that Meta and YouTube were liable for negligence in a case involving youth addiction and mental health. The plaintiff, a now 20-year-old woman known as Kaley G.M., claimed she became addicted to Instagram and YouTube during grade school, which contributed to her anxiety and depression. Meta was ordered to pay $4.2 million in damages, and YouTube was ordered to pay $1.8 million. The case is significant because it challenges Section 230 of the Communications Decency Act, which has previously shielded social media companies from liability. The ruling sets a legal precedent by suggesting that social media platforms can be held responsible for personal injury caused by their product design. Meta has stated it is considering an appeal.
Florida opens investigation into Discord over child safety failures and predator access
Florida is investigating the Discord app over child safety concerns, following reports of abductions and grooming. The investigation, led by Florida Attorney General James Uthmeier, claims the app puts children at risk by allowing predators to access young users. Discord is marketed as a communication platform for young people, similar to Facebook or Instagram, and is used by millions, including Gen Z users for gaming and social interaction. The state has issued subpoenas for marketing and promotional documents related to Discord, as well as other platforms like TikTok and Roblox. A 2022 safety message from Discord states the app includes tools to help users avoid inappropriate content or unwanted contact. The investigation is part of a broader push by Florida to address online safety risks for children.
'I'm caught, ain't I?': Man drives to NY to rape girl he met on TikTok, deputies say
A man drove from out of state to Binghamton, New York, to sexually assault a girl he had groomed on TikTok. Deputies reported he said 'I'm caught, ain't I?' upon arrest. The case highlighted how TikTok's platform was used to facilitate contact between adult predators and minors.
TikTok influencers spread false claims about luxury brands Hermès, Louis Vuitton, and Chanel
In early 2025, TikTok influencers posted viral videos falsely alleging that luxury houses Hermès, Louis Vuitton and Chanel produced their goods in Chinese factories while being marketed as "Made in France" or "Made in Italy." The misinformation sparked widespread debate and reputational harm for the brands. A study by Cardiff Business School found that influencer‑driven misinformation is more toxic and reaches larger audiences than content from ordinary users, due to strong parasocial bonds with followers. The research highlights the amplified risk of misinformation when spread by high‑profile social‑media personalities.
Texas family warns of blackout challenge after child dies attempting TikTok-spread stunt
A 9-year-old girl from Stephenville, Texas, JackLynn Blackwell, died on February 3, 2026, after apparently attempting the "blackout challenge," a dangerous social media dare involving self-choking. Her parents believe she saw a video of the challenge on YouTube and tried to replicate it. The Centers for Disease Control and Prevention (CDC) reports that 80 people have died from this challenge, also known as the "choking game" or "pass-out challenge." The Blackwell family is raising awareness about the risks of viral social media challenges and the role of algorithmic content recommendations in exposing children to harmful content. In Delaware, six families have sued TikTok over similar incidents, and the Blackwells hope their case will lead to increased platform accountability.
TikTok settles landmark social media addiction lawsuit filed by minor plaintiff
TikTok has settled a landmark lawsuit related to social media addiction before the trial began. The case was brought by a French teenager who claimed the platform's addictive design harmed his mental health. The lawsuit was filed in a French court and marked one of the first legal challenges of its kind in Europe. The settlement terms were not disclosed, but the case could influence future legal actions against social media platforms. The incident highlights growing concerns over the mental health impacts of social media use, particularly among minors.
Chinese "Spamouflage" Influence Operation Uses Fake U.S. Voter Personas
Researchers at Graphika identified a Chinese state‑linked influence campaign, dubbed “Spamouflage,” that created a network of fake social‑media accounts impersonating U.S. voters, soldiers and a news outlet. The operation posted divisive content on X, TikTok, YouTube, Instagram and Facebook ahead of the 2024 presidential election, targeting topics such as reproductive rights, homelessness, Ukraine and Israel. Meta linked the network to Chinese law‑enforcement, while TikTok removed one of the accounts for policy violations after a video mocking President Biden amassed 1.5 million views. The campaign illustrates China’s use of deceptive online behavior to portray the United States as politically unstable.
Elderly victims defrauded by AI voice cloning virtual kidnapping scams across the United States
In April 2023, an Arizona woman named Jennifer DeStefano received a call from an anonymous caller who claimed to have kidnapped her 15-year-old daughter and demanded a $1 million ransom. The caller played a deepfake audio of a child in distress, which was later identified as part of a virtual kidnapping scam. The scammer reduced the ransom to $50,000 during negotiations, but DeStefano discovered her daughter was safe and reported the incident to the police. Virtual kidnapping involves cybercriminals using AI voice cloning tools and social engineering to manipulate victims into paying ransoms by creating the illusion of a kidnapping. The FBI and Federal Trade Commission have warned about the increasing use of deepfake technology in scams, with impostor scams causing $2.6 billion in losses in 2022. These attacks often target parents by exploiting publicly available biometric data from social media platforms to create convincing audio evidence.
17-year-old Missouri girl dies by suicide after Snapchat and TikTok addiction beginning at age 10 leads to severe depression and self-harm
A wrongful death case filed in the Social Media Adolescent Addiction MDL alleges that a 17-year-old girl from Missouri became addicted to Snapchat and TikTok starting around age 10 or 11. The lawsuit claims the addiction led to severe mental depression, escalating to self-harm and ultimately to her death by suicide. The case is part of the broader MDL consolidating thousands of personal injury and wrongful death claims against Meta, TikTok, Snap, and YouTube over algorithmic design features alleged to foster addiction in minors.
Teen Mental Health Crisis Linked to Social Media Platforms
A national CDC survey found that nearly 30% of teenage girls considered suicide, with many reporting persistent sadness or hopelessness. Nuala Mullen, an 18-year-old from New York, developed an eating disorder after exposure to body image content on platforms like Instagram and TikTok. The incident highlights growing concerns about the impact of social media on teen mental health.
Sixteen-year-old teen dies by suicide after TikTok algorithm surfaces graphic self-harm content
A 16-year-old teen, Mason, died by suicide after his family said he was exposed to graphic TikTok videos that depicted self-harm and suicide methods. In the months before his death, Mason was struggling with a breakup and increased anxiety. Despite his family's efforts to seek help, they were unaware of the content he was consuming online. Mason liked and engaged with videos that included detailed descriptions of suicide, including one referencing rapper Lil Loaded, who had also died by suicide. His stepbrother reported that Mason joked about 'pulling a Lil Loaded' before his death. Mason's mother found his phone and noticed the disturbing content, but it appears TikTok's automated moderation system failed to detect and remove the videos.
14-year-old boy found dead after exposure to harmful social media content on TikTok
Ellen Roome discovered her 14‑year‑old son Jools dead in his Cheltenham bedroom in April 2022, an hour after he was seen laughing with a friend on CCTV. A 23‑minute inquest found no evidence of suicide, but Roome believes Jools was exposed to harmful content on social media in his final hours. She has campaigned for legal accountability and influenced UK legislation (Jools' Law) to preserve the data of deceased children, and is suing TikTok in the US to access his social media history.
K.S. hospitalized with heart failure after TikTok algorithm repeatedly pushed anorexia content to 13-year-old
K.S., a 13-year-old girl from Virginia, was hospitalized on January 24, 2022 after developing a severe eating disorder driven by TikTok's recommendation algorithm. Her mother had searched for healthy recipes and fitness content, but TikTok's algorithm began flooding K.S.'s feed with eating disorder-promoting videos. K.S. was admitted to hospital with a resting heart rate of 40-44 beats per minute — critically below the normal range of 60-100 — and underwent a 16-day re-feeding program. Her mother repeatedly deleted TikTok from K.S.'s phone only to find it reinstalled. Screenshots showed TikTok pushing eating disorder content to the minor without any search or input from the user. K.S. and her parents filed a personal injury lawsuit against TikTok and ByteDance in Los Angeles in 2022. The case is part of the Social Media Adolescent Addiction/Personal Injury Products Liability MDL.
French teenager dies by suicide after TikTok algorithm drives her into harmful content rabbit hole
Amnesty International France found that TikTok's algorithm continues to expose young users to harmful content related to depression and suicide through a 'rabbit hole' effect. Using test accounts, the organization observed that after searching for mental health content, users were shown increasingly depressive and romanticized suicide content. The report highlights cases like 15-year-old Marie Le Tiec, who died by suicide after viewing harmful TikTok content. TikTok denies responsibility, citing its moderation efforts and referral to mental health resources. Experts are divided on whether such content directly causes self-harm or primarily affects vulnerable users.
Over 2,000 families sue Meta, TikTok, Snapchat, and YouTube over children's mental health harms
More than 2,000 families are suing social media companies including TikTok, Snapchat, YouTube, Roblox, and Meta (parent company of Instagram and Facebook) over the impact of social media on children's mental health. The lawsuits allege that platforms like Instagram contributed to the development of depression and eating disorders in minors. One case involves the Spence family from Long Island, New York, whose daughter Alexis developed an eating disorder at age 12 after using Instagram, which she accessed by falsely checking a 13+ age box. Alexis reported that Instagram's algorithm led her to pro-anorexia content, which normalized disordered eating behaviors and worsened her mental health. The lawsuits are expected to move forward in 2024, with over 350 cases anticipated to proceed.
Two young girls die after TikTok algorithm promotes blackout challenge to their feeds
The parents of two young girls who died after participating in the 'blackout challenge' on TikTok have filed a lawsuit against the platform. The challenge, which involves choking oneself until passing out, was allegedly promoted by TikTok's algorithm to the children's 'For You' pages. The lawsuit claims that TikTok failed to warn users about the dangers and intentionally pushed harmful content. The children, Lalani Walton and Arriani Arroyo, were described as outgoing and active, with Lalani aspiring to be a rapper and Arriani enjoying sports and dance. TikTok has denied responsibility, stating the challenge predates the platform and that they would remove related content if found.
TikTok agrees to $92 million settlement in class action privacy lawsuit
TikTok's parent company, ByteDance, agreed to a $92 million settlement in a class action lawsuit related to privacy violations under Illinois' biometric privacy laws. The lawsuit combined 21 separate cases into one filed in the U.S. District Court for the Northern District of Illinois. Plaintiffs claimed TikTok and its predecessor, Musical.ly, collected biometric data and used facial recognition technology in video filters to track users, including minors. The settlement requires TikTok to disclose data collection practices in its Privacy Policy and provide employee training on data privacy laws. The settlement is pending judicial approval and includes U.S. users who used TikTok or Musical.ly before February 25.
Woman whose son died from drugs bought on social media celebrates verdicts against Meta and YouTube
A Colorado woman, Kimberly Osterman, celebrated recent verdicts against Meta and YouTube, which were found liable for harms to children due to platform design. Her son, Max Osterman, died in 2021 at age 18 after purchasing a fentanyl-laced pill through Snapchat. In Los Angeles, a jury ruled that Meta and YouTube designed their platforms to hook young users, and in New Mexico, Meta was found to have knowingly harmed children’s mental health and concealed information about child sexual exploitation. Snap Inc., the parent company of Snapchat, and TikTok settled before the Los Angeles trial began. Osterman is part of Parents for Safe Online Spaces, advocating for the Kids Online Safety Act, which would require social media platforms to take steps to prevent harm to minors. The drug dealer who sold Max the pill was sentenced to six years in prison in 2023.
Caroline Koziol develops anorexia after TikTok and Instagram algorithm floods feed with extreme dieting content, joins landmark MDL
Caroline Koziol of Hartford, Connecticut began using Instagram and TikTok during the COVID-19 pandemic to search for at-home workouts and healthy recipes to support her swimming training. Within weeks, both platforms' recommendation algorithms had flooded her feeds with content promoting extreme workouts and disordered eating. 'One innocent search turned into this avalanche,' she said. Koziol, now 21, developed anorexia and is among more than 1,800 plaintiffs in the Social Media Adolescent Addiction/Personal Injury Products Liability MDL suing Meta and TikTok. She is not suing over specific content but over the platforms' defective recommendation design that maximized her engagement and drove her deeper into eating disorder content.
KGM sues Meta and Google over Instagram and YouTube addiction beginning at age 6, leading to depression and suicidal thoughts — first bellwether trial
A woman identified as KGM (Kaley G.M.) filed one of the first bellwether cases in the Social Media Adolescent Addiction MDL, alleging that Instagram and YouTube addiction beginning when she was approximately 6 years old led to clinical depression and suicidal thoughts. The lawsuit names Meta, Google, TikTok, and Snapchat, with Snap settling before trial. In January and February 2026, KGM's case became the first social media addiction case to proceed to jury trial in Los Angeles, with her mother Karen Glenn also testifying. Expert witnesses including Stanford psychiatry professor Anna Lembke testified that social media addiction is real and can cause or worsen anxiety, depression, and suicidal thoughts. The trial's outcome is expected to influence over 1,000 similar lawsuits.