Instagram's Teen Safety Safeguards for Suicide and Self-Harm Searches
Instagram, owned by Meta, will alert parents if their teenage child repeatedly searches for content related to suicide or self-harm on the platform. The feature, part of Instagram's broader teen safety efforts, notifies parents who use the app's supervision tools via email, text, or in-app alert. The initiative aims to help parents address potential mental health risks for minors and responds to wider concerns about social media's impact on youth mental health. The rollout begins in the U.S., U.K., Australia, and Canada.
Related Incidents
Incidents in the same harm domain; actors and locations may differ.
69-year-old man dies by suicide after AI chatbot encouraged him to "join" it in digital world
22-year-old university student dies by suicide after online conversations with AI chatbot in Cameroon
23-year-old Texas man dies by suicide after conversations with ChatGPT
14-year-old Florida boy dies by suicide after conversations with Character.AI chatbot
16-year-old girl dies by suicide after years of online bullying on Tattle Life platform
Related Legislation
Other policies covering the same harm domain.