Company · China · Est. 2012

ByteDance

ByteDance has been named in 4 documented digital harm incidents, including 1 fatality and 3 involving minors. The most common harm domain is Addiction & Mental Health, followed by Child Safety.

Incidents: 4
Fatalities: 1
Minors involved: 3
Financial harm

Documented Incidents (4)
Jan 1, 2023

17-year-old Missouri girl dies by suicide after Snapchat and TikTok addiction beginning at age 10 leads to severe depression and self-harm

A wrongful death case filed in the Social Media Adolescent Addiction MDL alleges that a 17-year-old girl from Missouri became addicted to Snapchat and TikTok starting around age 10 or 11. The lawsuit claims the addiction led to severe depression, escalating to self-harm and ultimately to her death by suicide. The case is part of the broader MDL consolidating thousands of personal injury and wrongful death claims against Meta, TikTok, Snap, and YouTube over algorithmic design features alleged to foster addiction in minors.

Addiction & Mental Health · Addiction · Fatality · Minor
Jan 24, 2022

K.S. hospitalized with heart failure after TikTok algorithm repeatedly pushed anorexia content to 13-year-old

K.S., a 13-year-old girl from Virginia, was hospitalized on January 24, 2022 after developing a severe eating disorder driven by TikTok's recommendation algorithm. Her mother had searched for healthy recipes and fitness content, but TikTok's algorithm began flooding K.S.'s feed with eating disorder-promoting videos. K.S. was admitted to hospital with a resting heart rate of 40-44 beats per minute — critically below the normal range of 60-100 — and underwent a 16-day re-feeding program. Her mother repeatedly deleted TikTok from K.S.'s phone only to find it reinstalled. Screenshots showed TikTok pushing eating disorder content to the minor without any search or input from the user. K.S. and her parents filed a personal injury lawsuit against TikTok and ByteDance in Los Angeles in 2022. The case is part of the Social Media Adolescent Addiction/Personal Injury Products Liability MDL.

Addiction & Mental Health · Eating Disorder · Minor
Jul 1, 2021

Two young girls die after TikTok algorithm promotes blackout challenge to their feeds

The parents of two young girls who died after participating in the 'blackout challenge' on TikTok have filed a lawsuit against the platform. The challenge, which involves choking oneself until passing out, was allegedly promoted by TikTok's algorithm to the children's 'For You' pages. The lawsuit claims that TikTok failed to warn users about the dangers and intentionally pushed harmful content. The children, Lalani Walton and Arriani Arroyo, were described as outgoing and active, with Lalani aspiring to be a rapper and Arriani enjoying sports and dance. TikTok has denied responsibility, stating that the challenge predates the platform and that it would remove related content if found.

Child Safety · Dangerous Challenge
Jan 1, 2020 · Hartford, Connecticut

Caroline Koziol develops anorexia after TikTok and Instagram algorithm floods feed with extreme dieting content, joins landmark MDL

Caroline Koziol of Hartford, Connecticut began using Instagram and TikTok during the COVID-19 pandemic to search for at-home workouts and healthy recipes to support her swimming training. Within weeks, both platforms' recommendation algorithms had flooded her feeds with content promoting extreme workouts and disordered eating. 'One innocent search turned into this avalanche,' she said. Koziol, now 21, developed anorexia and is among more than 1,800 plaintiffs in the Social Media Adolescent Addiction/Personal Injury Products Liability MDL suing Meta and TikTok. She is suing not over specific content but over the platforms' allegedly defective recommendation design, which maximized her engagement and drove her deeper into eating disorder content.

Addiction & Mental Health · Eating Disorder · Minor

Linked Legislation (14)
HB 5532 — Establishes The Stop Addictive Feeds Exploitation (Safe) For Kids Act Prohibiting The Provision Of Addictive Feeds To Minors By Addictive Social Media Platforms
West Virginia
H 823 — An Act Relating To Social Media Warning Labels
Vermont
HB 1624 — Consumer Data Protection Act; Social Media Platforms; Addictive Feed Prohibited For Minors
Virginia
S 6418 — Relates to the regulation of social media companies and social media platforms
New York
AB 2246 — Youth Social Media Protection Act: Report
California
AB 960 — Relating To: Requiring Social Media Platforms To Provide Mental Health Warnings And Providing A Penalty
Wisconsin
SB 933 — Relating To: Requiring Social Media Platforms To Provide Mental Health Warnings And Providing A Penalty
Wisconsin
SB 1345 — Commercial Entity Offering Social Media Accounts; Restricted Hours For Minors, Civil Liability
Virginia
SB 532 — Commercial Entity Offering Social Media Accounts; Restricted Hours For Minors, Civil Liability
Virginia
HB 2294 — Virginia Social Media Regulation Act
Virginia
HB 562 — Commercial Entity Offering Social Media Accounts; Restricted Hours For Minors, Civil Liability
Virginia
H 4591 — Stop Harm from Addictive Social Media
South Carolina
H 4700 — South Carolina Social Media Regulation Act
South Carolina
H 3431 — South Carolina Social Media Regulation Act
South Carolina

By Harm Domain

Addiction & Mental Health: 3
Child Safety: 1