Instagram to alert parents if teens repeatedly search for suicide or self-harm terms
Instagram has introduced a new system that alerts parents when teens repeatedly search for suicide or self-harm terms. The feature is part of Instagram's supervision programme, which allows parents to monitor their teenagers' online activity. The alerts aim to help parents address potential mental health risks and prevent self-harm among minors.
Related Incidents
Same harm domain; actors and location may differ
69-year-old man dies by suicide after AI chatbot encouraged him to "join" it in digital world
22-year-old university student dies by suicide after online conversations with AI chatbot in Cameroon
23-year-old Texas man dies by suicide after conversations with ChatGPT
14-year-old Florida boy dies by suicide after conversations with Character.AI chatbot
16-year-old girl dies by suicide after years of online bullying on Tattle Life platform