A 9165 — Requires Social Media Platforms To Remove Content Depicting The Violent Death Of A Human Being Within Twenty-Four Hours After Receiving Notice Or Otherwise Becoming Aware Of Such Content
This bill requires social media platforms to remove content depicting the violent death of a human being within twenty-four hours after receiving notice or otherwise becoming aware of such content. The legislation aims to reduce the potential harm associated with the distribution of graphic content related to death or self-harm. It is currently in the legislative process and has been referred to the Science and Technology committee.
Linked Incidents
1 incident this policy has been directly linked to
Related Incidents
Same harm domain — actors and location may differ
A struggling California teen asked ChatGPT for help. It showed him how to die
A young woman’s final exchange with an AI chatbot
76-year-old man dies after trying to meet AI chatbot modeled on Kendall Jenner
ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning
ChatGPT offered step-by-step instructions for self-harm, devil worship and ritual bloodletting, disturbing report reveals
Related Legislation
Other policies covering the same harm domain