SB 1455 — The Guidelines for User Age-Verification and Responsible Dialogue Act of 2026 (GUARD Act)
The Guidelines for User Age-Verification and Responsible Dialogue Act of 2026 (GUARD Act) prohibits the design or distribution of AI chatbots that knowingly or recklessly solicit minors for sexually explicit conduct or encourage self-harm, suicide, or violence. It mandates age-verification processes for users of AI chatbots, requiring covered entities to classify users as minors or adults and to block minors from accessing chatbots. The act also requires AI chatbots to disclose their non-human nature and prohibits them from falsely claiming to be licensed professionals. The Attorney General may bring civil actions for violations, with fines of up to $100,000 per offense.
Related Incidents
Incidents in the same harm domain; actors and locations may differ
14-year-old girl groomed via social media by Sydney private school teacher leading to child abuse material charges
12-year-old girl sexually groomed via TikTok leading to out-of-state assault in Binghamton, New York
9-year-old girl dies after attempting blackout challenge on YouTube
13-year-old Louisiana girl exposed to AI-generated nude deepfake images leading to expulsion and federal lawsuit against school district
12-year-old girl groomed and coerced into self-harm and producing child sexual abuse materials via social media in New Jersey
Related Legislation
Other policies covering the same harm domain