TikTok
TikTok has been named in 33 documented digital harm incidents, including 5 fatalities and 18 involving minors. The most common harm domain is Self-Harm & Suicide, followed by Addiction & Mental Health.
Documented Incidents
Teenage boys cause facial injuries attempting jawline modification via looksmaxxing trend on social media
A dangerous trend known as "looksmaxxing" has gained traction on social media, with boys as young as 10 reportedly using hammers to reshape their jawlines in pursuit of an idealized appearance. The trend is associated with Braden Eric Peters, known online as Clavicular, who has over one million followers and promotes extreme measures such as steroid use, self-injection, and crystal meth use to enhance appearance. Peters was recently arrested on a battery charge and has a history of self-harm and risky behavior, including expulsion from school for possessing testosterone. The trend has been linked to severe psychological effects, including self-harm and suicidal ideation; one teenager reportedly said he would take his own life if he did not reach a certain height. The movement, which began in the 2010s, has expanded beyond online forums to platforms like TikTok and Instagram, where influencers share before-and-after transformations and encourage others to take similar risks. Experts warn that looksmaxxing can lead to serious emotional and physical consequences, including eating disorders, depression, and loss of self-esteem.
Teenagers engage in door-kicking prank as part of TikTok challenge
A TikTok challenge involving teens kicking front doors at homes led to incidents in the Milton neighborhood of Santa Rosa County, Florida. The sheriff's office reported several attempted break-ins over the weekend, with one home sustaining thousands of dollars in damage. Chief Deputy Randy Tifft warned that the prank is dangerous and could lead to violent misunderstandings, as homeowners might mistake the kicks for a break-in. Congressman Jimmy Patronis criticized social media's influence and called for the repeal of Section 230 to hold tech companies accountable for harmful content. In Okaloosa County, four teens previously involved in a similar prank faced criminal charges, with three charged with misdemeanors and one with a felony. Authorities in Santa Rosa County said any teens identified in the recent incidents could face jail time and restitution for damages.
20-year-old woman awarded $4.2 million after Meta and YouTube found liable for mental health harm via addictive platform design
On March 25, a jury in Los Angeles, California, found Meta and YouTube liable for negligence in a case involving youth addiction and mental health. The plaintiff, a now 20-year-old woman known as Kaley G.M., claimed she became addicted to Instagram and YouTube during grade school, which contributed to her anxiety and depression. Meta was ordered to pay $4.2 million in damages, and YouTube was ordered to pay $1.8 million. The case is significant because it challenges Section 230 of the Communications Decency Act, which has previously shielded social media companies from liability. The ruling sets a legal precedent by suggesting that social media platforms can be held responsible for personal injury caused by their product design. Meta has stated it is considering an appeal.
Teenagers turn to AI chatbots for dieting advice, receiving harmful weight loss recommendations
Teens in Memphis, Tennessee, are increasingly using artificial intelligence for dieting and weight loss advice, according to a report by FOX13 Memphis. Parents and medical professionals, including pediatrician Dr. Michelle Bowden, have expressed concerns about the accuracy and safety of AI-generated health advice for adolescents. Dr. Bowden noted that AI often pulls information from unreliable sources, such as blogs without medical credentials, and may provide inappropriate calorie recommendations that can lead to malnourishment. The report highlights that some teens following AI-generated diet plans have experienced health issues like low blood sugar, slow digestion, and, in severe cases, hospitalization due to dangerously low heart rates. Le Bonheur Children’s Hospital has seen an increase in patients using AI for meal planning and calorie tracking, with some developing eating disorders like anorexia. Experts emphasize the importance of personalized medical advice over online tools.
Florida opens investigation into Discord over child safety failures and predator access
Florida is investigating the Discord app over child safety concerns, following reports of abductions and grooming. The investigation, led by Florida Attorney General James Uthmeier, claims the app puts children at risk by allowing predators to access young users. Discord is marketed as a communication platform for young people, similar to Facebook or Instagram, and is used by millions, including Gen Z users for gaming and social interaction. The state has issued subpoenas for marketing and promotional documents related to Discord, as well as other platforms like TikTok and Roblox. A 2022 safety message from Discord states the app includes tools to help users avoid inappropriate content or unwanted contact. The investigation is part of a broader push by Florida to address online safety risks for children.
'I'm caught, ain't I?': Man drives to NY to rape girl he met on TikTok, deputies say
A man drove from out of state to Binghamton, New York, to sexually assault a girl he had groomed on TikTok. Deputies reported he said 'I'm caught, ain't I?' upon arrest. The case highlighted how TikTok's platform was used to facilitate contact between adult predators and minors.
TikTok influencers spread false claims about luxury brands Hermès, Louis Vuitton, and Chanel
In early 2025, TikTok influencers posted viral videos falsely alleging that luxury houses Hermès, Louis Vuitton and Chanel produced their goods in Chinese factories while being marketed as "Made in France" or "Made in Italy." The misinformation sparked widespread debate and reputational harm for the brands. A study by Cardiff Business School found that influencer‑driven misinformation is more toxic and reaches larger audiences than content from ordinary users, due to strong parasocial bonds with followers. The research highlights the amplified risk of misinformation when spread by high‑profile social‑media personalities.
Teenage girl dies by suicide following sustained cyberbullying on gossip platform
Sophie-May Dickson, a social media influencer, faced backlash after sharing videos from her 16-year-old daughter Princess's funeral in February 2024. Princess died by suicide after years of online bullying, particularly on the gossip site Tattle Life, where she was targeted for her appearance from the age of 14. The abuse initially focused on Sophie-May but shifted to Princess after Sophie-May deleted some of her social media accounts. At the funeral, trolls left cruel comments on Sophie-May's Instagram post, accusing her of seeking attention. Sophie-May responded by explaining that sharing the moment was personal and not for views, and that she hired photographers to capture the event due to the emotional intensity. Tattle Life, described as a "troll's paradise," allowed anonymous users to post offensive remarks about Princess even after her death. Princess's suicide and the ongoing online abuse have highlighted the severe impact of cyberbullying on vulnerable teenagers.
AI-generated deepfake videos spread political disinformation in Bangladesh without platform intervention
AI-generated videos are spreading disinformation online in Bangladesh ahead of the 13th national election. A video featuring a woman resembling Rikta, a garment worker who lost her arm in the 2013 Rana Plaza collapse, falsely accused a political party of fraud and was shared over 21,000 times on the Uttarbanga Television Facebook page. The video, uploaded on 10 January, was identified as AI-generated after fact-checking by Prothom Alo. The Representation of the People Order prohibits the use of AI to create misleading content during elections, but such content continues to circulate. The Bangladesh Army issued a warning on 14 January about AI-generated videos misrepresenting military personnel, but the videos remain online. Authorities have yet to take action, despite the potential for such content to incite violence or confusion among voters.
Victims across the US defrauded by AI voice cloning scams impersonating family members
Patty Greiner lost $15,000 after receiving a text claiming her Amazon account was hacked and later being contacted by individuals impersonating IRS agents and law enforcement. Scammers are using AI to clone voices by extracting personal information from social media platforms like TikTok, Instagram, and Facebook. Cybersecurity expert Dave Hatter demonstrated how easily a voice can be cloned using free software, warning that this could lead to a surge in crime. Impersonators range from individuals to organized criminal gangs and nation-state actors from countries like China, Russia, and Iran. Experts advise not to use links or numbers provided by suspicious callers and to verify the legitimacy of requests directly with the organization or person involved.
Spain opens investigation into X, Meta, and TikTok over AI-generated child sexual abuse material
Spain has launched an investigation into X, Meta, and TikTok for their involvement in the distribution of AI-generated child sexual abuse material. The probe focuses on the platforms' handling of such content. The investigation is part of broader efforts to address digital harms and protect children online. The companies are being scrutinized for their policies and responses to AI-generated abuse material. The investigation is ongoing, with potential consequences including regulatory action or legal penalties.
WhatsApp removes 6.8 million accounts linked to pig butchering scams spreading via ChatGPT and Telegram
WhatsApp deleted over 6.8 million accounts linked to pig butchering scams, a type of fraud that combines romance and investment schemes. Scammers used AI tools like ChatGPT to craft initial messages and then shifted conversations to Telegram to carry out the fraud. These scams often involve building trust with victims before defrauding them, typically through fake investment platforms. A recent study found that crypto scams have caused over $60 billion in reported losses, with fraudulent trading platforms being the most common. Scammers also used tactics like asking victims to complete small tasks on social media before requesting real money deposits into crypto accounts. Experts warn that coordinated efforts among banks, regulators, and tech platforms are needed to combat this growing threat.
TikTok settles landmark social media addiction lawsuit filed by minor plaintiff
TikTok has settled a landmark lawsuit related to social media addiction before the trial began. The case was brought by a French teenager who claimed the platform's addictive design harmed his mental health. The lawsuit was filed in a French court and marked one of the first legal challenges of its kind in Europe. The settlement terms were not disclosed, but the case could influence future legal actions against social media platforms. The incident highlights growing concerns over the mental health impacts of social media use, particularly among minors.
US officials targeted by AI voice cloning scam impersonating Secretary of State Marco Rubio
A scam involving artificial intelligence (AI) impersonated the voice of a senior member of US President Donald Trump’s administration, specifically US Secretary of State Marco Rubio. The imposter contacted three foreign ministers, a US governor, and a member of Congress via the encrypted app Signal. US authorities believe the scam aimed to manipulate officials to gain access to information or accounts, though the perpetrator remains unidentified. Scammers use AI to clone voices from short audio samples, often sourced from social media platforms like TikTok, to create realistic voice replicas. The FBI warns of “smishing” and “vishing” scams, which use voice or text messages with links to deceive victims. To avoid such scams, cybersecurity experts recommend verifying unexpected calls, being cautious of caller ID spoofing, and using secret words or phrases to confirm identities.
Young girls exploited to carry drugs by county lines gangs using social media for recruitment
County lines exploitation has increasingly targeted young girls, with Georgina*, a 17-year-old, being coerced into drug transportation after being groomed online by an older man connected to criminal gangs. Grooming often begins with compliments, gifts, and social media contact before moving to encrypted apps, with perpetrators using psychological and sexual coercion to control victims. In 2026, charities reported a rise in the exploitation of girls, who are less likely to be stopped or suspected of criminal activity. A Freedom of Information request revealed children as young as 13 had been arrested for drug dealing, and two girls were jailed for murder in a county lines-related attack in January 2026. Catch 22 reported that 22% of their county lines rescue service clients in 2025 were female, and the charity noted a lack of National Referral Mechanisms for girls, despite many being victims of criminal exploitation. In response, Essex Police launched the Under The Radar project to support girls aged 11–24 at risk of exploitation.
Chinese "Spamouflage" Influence Operation Uses Fake U.S. Voter Personas
Researchers at Graphika identified a Chinese state‑linked influence campaign, dubbed “Spamouflage,” that created a network of fake social‑media accounts impersonating U.S. voters, soldiers and a news outlet. The operation posted divisive content on X, TikTok, YouTube, Instagram and Facebook ahead of the 2024 presidential election, targeting topics such as reproductive rights, homelessness, Ukraine and Israel. Meta linked the network to Chinese law‑enforcement, while TikTok removed one of the accounts for policy violations after a video mocking President Biden amassed 1.5 million views. The campaign illustrates China’s use of deceptive online behavior to portray the United States as politically unstable.
Chinese Spamouflage campaign targets Canadian officials and Chinese‑Canadian community
Rapid Response Mechanism Canada identified a new transnational repression operation, dubbed “Spamouflage,” that began on August 31, 2024. The campaign uses hundreds of bot‑like accounts on X, Facebook, TikTok and YouTube to post deep‑fake videos, sexually explicit AI‑generated images, and doxxing material aimed at ten Mandarin‑speaking Chinese‑Canadian individuals as well as Canadian government officials, media outlets and the Canadian Armed Forces. The deepfakes falsely accuse Prime Minister Justin Trudeau, Minister Mélanie Joly and other officials of corruption and sexual scandals. Researchers attribute the coordinated inauthentic activity with high confidence to actors linked to the People’s Republic of China.
From viral fame to tragedy: Deaths linked to TikTok challenges, algorithms and creator culture
The death of Rachel Tussey, an Ohio mother of three who documented her cosmetic surgery journey on TikTok, has drawn renewed attention to the platform's role in digital harms. TikTok has been linked to several deaths through viral challenges and content that may encourage risky or harmful behavior. The platform's algorithms and creator culture are under scrutiny for potentially amplifying content that could lead to self-harm or suicide. Tussey's case is part of a broader conversation about the impact of social media on mental health and safety. The incident highlights concerns about how platforms like TikTok manage content that could pose risks to users.
Aubreigh Wyatt dies by suicide after sustained cyberbullying campaign amplified by TikTok
Aubreigh Wyatt, a teenager from Mississippi, died by suicide in September 2023. Her mother, Heather Wyatt, began speaking publicly about her daughter's death and the alleged bullying she faced, which led to a viral social media campaign. In response, the parents of four girls accused of bullying Aubreigh filed a lawsuit, forcing Heather to remove related social media posts. The lawsuit also included claims against social media companies for their role in cyberbullying. An initial police investigation into Aubreigh's death stalled because, with Aubreigh deceased, investigators could not obtain her testimony. Heather has since shifted her focus to mental health awareness, emphasizing her desire to avoid fostering hate or conflict.
South Carolina teen nearly dies after participating in TikTok Benadryl challenge
A teenager in South Carolina nearly died after participating in a TikTok trend involving taking large amounts of Benadryl. The teen's mother discovered her daughter had taken a dangerous amount of the medication after noticing strange behavior and finding pills under her pillow. At the hospital, doctors questioned if it was a suicide attempt, but the teen claimed a friend suggested taking Benadryl to get high. The mother found TikTok videos promoting the trend on her daughter's phone and attempted to report them but learned they did not violate TikTok's guidelines. The dangerous challenge has been linked to at least one death in 2023.
75-year-old Singaporean woman loses $600,000 to AI deepfake scam impersonating Elon Musk over three years
A 75-year-old Singaporean woman lost $600,000 over three years to an AI-driven scam in which fraudsters impersonated high-profile figures like Elon Musk. The scammers used AI-generated content on platforms such as TikTok and WhatsApp to manipulate her into making repeated financial transfers. The scam was discovered in April 2025 when her daughters were alerted by police about a suspicious $67,000 transfer attempt. The victim was later diagnosed with psychosis due to the psychological impact of the scam. In response, Singapore introduced the Protection from Scams Act 2025, including Restriction Orders to block suspicious transactions and protect potential victims.
17-year-old Missouri girl dies by suicide after Snapchat and TikTok addiction beginning at age 10 leads to severe depression and self-harm
A wrongful death case filed in the Social Media Adolescent Addiction MDL alleges that a 17-year-old girl from Missouri became addicted to Snapchat and TikTok starting around age 10 or 11. The lawsuit claims the addiction led to severe mental depression, escalating to self-harm and ultimately to her death by suicide. The case is part of the broader MDL consolidating thousands of personal injury and wrongful death claims against Meta, TikTok, Snap, and YouTube over algorithmic design features alleged to foster addiction in minors.
Teen Mental Health Crisis Linked to Social Media Platforms
A national CDC survey found that nearly 30% of teenage girls considered suicide, with many reporting persistent sadness or hopelessness. Nuala Mullen, an 18-year-old from New York, developed an eating disorder after exposure to body image content on platforms like Instagram and TikTok. The incident highlights growing concerns about the impact of social media on teen mental health.
Sixteen-year-old teen dies by suicide after TikTok algorithm surfaces graphic self-harm and suicide videos
A 16-year-old teen, Mason, died by suicide after his family said he was exposed to graphic TikTok videos that depicted self-harm and suicide methods. In the months before his death, Mason was struggling with a breakup and increased anxiety. Despite his family's efforts to seek help, they were unaware of the content he was consuming online. Mason liked and engaged with videos that included detailed descriptions of suicide, including one referencing rapper Lil Loaded, who had also died by suicide. His stepbrother reported that Mason joked about 'pulling a Lil Loaded' before his death. Mason's mother found his phone and noticed the disturbing content, but it appears TikTok's automated moderation system failed to detect and remove the videos.
K.S. hospitalized with heart failure after TikTok algorithm repeatedly pushed anorexia content to 13-year-old
K.S., a 13-year-old girl from Virginia, was hospitalized on January 24, 2022 after developing a severe eating disorder driven by TikTok's recommendation algorithm. Her mother had searched for healthy recipes and fitness content, but TikTok's algorithm began flooding K.S.'s feed with eating disorder-promoting videos. K.S. was admitted to hospital with a resting heart rate of 40-44 beats per minute — critically below the normal range of 60-100 — and underwent a 16-day re-feeding program. Her mother repeatedly deleted TikTok from K.S.'s phone only to find it reinstalled. Screenshots showed TikTok pushing eating disorder content to the minor without any search or input from the user. K.S. and her parents filed a personal injury lawsuit against TikTok and ByteDance in Los Angeles in 2022. The case is part of the Social Media Adolescent Addiction/Personal Injury Products Liability MDL.
TikTok Eating-Disorder Content and Its Impact on Teen Mental Health
The article highlights how TikTok is flooded with content related to eating disorders, particularly targeting teenagers. This content normalizes unhealthy behaviors and may contribute to the development of eating disorders and self-harm among young users. The focus is on the platform's role in spreading harmful content rather than a specific lawsuit or incident involving a particular individual.
French teenager dies by suicide after TikTok algorithm drives her into harmful content rabbit hole
Amnesty International France found that TikTok's algorithm continues to expose young users to harmful content related to depression and suicide through a 'rabbit hole' effect. Using test accounts, the organization observed that after searching for mental health content, users were shown increasingly depressive and romanticized suicide content. The report highlights cases like 15-year-old Marie Le Tiec, who died by suicide after viewing harmful TikTok content. TikTok denies responsibility, citing its moderation efforts and referral to mental health resources. Experts are divided on whether such content directly causes self-harm or primarily affects vulnerable users.
Two young girls die after TikTok algorithm promotes blackout challenge to their feeds
The parents of two young girls who died after participating in the 'blackout challenge' on TikTok have filed a lawsuit against the platform. The challenge, which involves choking oneself until passing out, was allegedly promoted by TikTok's algorithm to the children's 'For You' pages. The lawsuit claims that TikTok failed to warn users about the dangers and intentionally pushed harmful content. The children, Lalani Walton and Arriani Arroyo, were described as outgoing and active, with Lalani aspiring to be a rapper and Arriani enjoying sports and dance. TikTok has denied responsibility, stating the challenge predates the platform and that they would remove related content if found.
TikTok agrees to $92 million settlement in class action biometric privacy lawsuit
TikTok's parent company, ByteDance, agreed to a $92 million settlement in a class action lawsuit related to privacy violations under Illinois' biometric privacy laws. The lawsuit combined 21 separate cases into one filed in the U.S. District Court for the Northern District of Illinois. Plaintiffs claimed TikTok and its predecessor, Musical.ly, collected biometric data and used facial recognition technology in video filters to track users, including minors. The settlement requires TikTok to disclose data collection practices in its Privacy Policy and provide employee training on data privacy laws. The settlement is pending judicial approval and includes U.S. users who used TikTok or Musical.ly before February 25.
Caroline Koziol develops anorexia after TikTok and Instagram algorithm floods feed with extreme dieting content, joins landmark MDL
Caroline Koziol of Hartford, Connecticut began using Instagram and TikTok during the COVID-19 pandemic to search for at-home workouts and healthy recipes to support her swimming training. Within weeks, both platforms' recommendation algorithms had flooded her feeds with content promoting extreme workouts and disordered eating. 'One innocent search turned into this avalanche,' she said. Koziol, now 21, developed anorexia and is among more than 1,800 plaintiffs in the Social Media Adolescent Addiction/Personal Injury Products Liability MDL suing Meta and TikTok. She is not suing over specific content but over the platforms' defective recommendation design that maximized her engagement and drove her deeper into eating disorder content.
Liverpool pastor convicted of grooming and sexually abusing multiple girls via online platforms
An evangelical pastor, Walter Chahwanda, was jailed for nine years after being found guilty of 17 sexual offences against girls aged 14 to 17 between 2017 and 2020. The offences included sending indecent electronic images, sexual activity with children, and causing children to watch sexual acts. Chahwanda, who was based in Chester and later ran a church in Speke, Merseyside, groomed victims on TikTok and Snapchat, targeting girls as far afield as Manchester, South Yorkshire, the East Midlands, and Kent. He admitted to having sexualised conversations on social media but dismissed them as "naughty" role play. The court heard that the church was aware of complaints but failed to safeguard young girls. Chahwanda was sentenced at Liverpool Crown Court in 2023.
KGM sues Meta and Google over Instagram and YouTube addiction beginning at age 6, leading to depression and suicidal thoughts — first bellwether trial
A woman identified as KGM (Kaley G.M.) filed one of the first bellwether cases in the Social Media Adolescent Addiction MDL, alleging that Instagram and YouTube addiction beginning when she was approximately 6 years old led to clinical depression and suicidal thoughts. The lawsuit names Meta, Google, TikTok, and Snapchat, with Snap settling before trial. In January and February 2026, KGM's case became the first social media addiction case to proceed to jury trial in Los Angeles, with her mother Karen Glenn also testifying. Expert witnesses including Stanford psychiatry professor Anna Lembke testified that social media addiction is real and can cause or worsen anxiety, depression, and suicidal thoughts. The trial's outcome is expected to influence over 1,000 similar lawsuits.