AB 1137 — Reporting Mechanism: Child Sexual Abuse Material
AB 1137 establishes a reporting mechanism for child sexual abuse material (CSAM), aiming to improve how CSAM is identified and reported online. The bill falls within the child-safety harm domain, focusing on prevention of and response to CSAM. It is currently in the proposed stage and has been filed with the Chief Clerk pursuant to Joint Rule 56.
Related Incidents
Same harm domain; actors and location may differ
14-year-old girl groomed via social media by Sydney private school teacher leading to child abuse material charges
12-year-old girl sexually groomed via TikTok leading to out-of-state assault in Binghamton, New York
9-year-old girl dies after attempting blackout challenge on YouTube
13-year-old Louisiana girl exposed to AI-generated nude deepfake images leading to expulsion and federal lawsuit against school district
12-year-old girl groomed and coerced into self-harm and producing child sexual abuse materials via social media in New Jersey
Related Legislation
Other policies covering the same harm domain