ByteDance
ByteDance has been named in 4 documented digital harm incidents, including 1 fatality and 3 involving minors. The most common harm domain is Addiction & Mental Health, followed by Child Safety.
Documented Incidents
17-year-old Missouri girl dies by suicide after Snapchat and TikTok addiction beginning at age 10 leads to severe depression and self-harm
A wrongful death case filed in the Social Media Adolescent Addiction MDL alleges that a 17-year-old girl from Missouri became addicted to Snapchat and TikTok starting around age 10 or 11. The lawsuit claims the addiction led to severe depression, escalating to self-harm and ultimately to her death by suicide. The case is part of the broader MDL consolidating thousands of personal injury and wrongful death claims against Meta, TikTok, Snap, and YouTube over algorithmic design features alleged to foster addiction in minors.
K.S. hospitalized with heart failure after TikTok algorithm repeatedly pushed anorexia content to 13-year-old
K.S., a 13-year-old girl from Virginia, was hospitalized on January 24, 2022 after developing a severe eating disorder driven by TikTok's recommendation algorithm. Her mother had searched for healthy recipes and fitness content, but TikTok's algorithm began flooding K.S.'s feed with videos promoting eating disorders. K.S. was admitted to the hospital with a resting heart rate of 40-44 beats per minute — critically below the normal range of 60-100 — and underwent a 16-day re-feeding program. Her mother repeatedly deleted TikTok from K.S.'s phone, only to find it reinstalled. Screenshots showed TikTok pushing eating disorder content to the minor without any search or input from the user. K.S. and her parents filed a personal injury lawsuit against TikTok and ByteDance in Los Angeles in 2022. The case is part of the Social Media Adolescent Addiction/Personal Injury Products Liability MDL.
Two young girls die after TikTok algorithm promotes blackout challenge to their feeds
The parents of two young girls who died after participating in the 'blackout challenge' on TikTok have filed a lawsuit against the platform. The challenge, which involves choking oneself until passing out, was allegedly promoted by TikTok's algorithm to the children's 'For You' pages. The lawsuit claims that TikTok failed to warn users of the danger and intentionally pushed harmful content. The children, Lalani Walton and Arriani Arroyo, were described as outgoing and active: Lalani aspired to be a rapper, and Arriani enjoyed sports and dance. TikTok has denied responsibility, stating that the challenge predates the platform and that it would remove related content if found.
Caroline Koziol develops anorexia after TikTok and Instagram algorithm floods feed with extreme dieting content, joins landmark MDL
Caroline Koziol of Hartford, Connecticut began using Instagram and TikTok during the COVID-19 pandemic to search for at-home workouts and healthy recipes to support her swimming training. Within weeks, both platforms' recommendation algorithms had flooded her feeds with content promoting extreme workouts and disordered eating. 'One innocent search turned into this avalanche,' she said. Koziol, now 21, developed anorexia and is among more than 1,800 plaintiffs in the Social Media Adolescent Addiction/Personal Injury Products Liability MDL suing Meta and TikTok. She is suing not over specific content but over the platforms' defective recommendation design, which maximized her engagement and drove her deeper into eating disorder content.