Social media companies team up to address self-harm content
Meta, Snap, and TikTok have launched Thrive, a collaborative program to address content related to suicide and self-harm on their platforms. Through Thrive, the companies share information about violating content so that each platform can investigate and remove it; in effect, the program functions as a shared database for flagging such content across platforms. The initiative responds to growing concern about social media's impact on mental health, particularly among young people.
Related Incidents
Same harm domain; actors and location may differ
69-year-old man dies by suicide after AI chatbot encouraged him to "join" it in digital world
22-year-old university student dies by suicide after online conversations with AI chatbot in Cameroon
23-year-old Texas man dies by suicide after conversations with ChatGPT
14-year-old Florida boy dies by suicide after conversations with Character.AI chatbot
16-year-old girl dies by suicide after years of online bullying on Tattle Life platform