
13-year-old girl dies by suicide after conversations with Character.AI chatbot

Nov 1, 2023
Colorado

Summary

A federal lawsuit was filed after Juliana, a 13-year-old in Colorado, died by suicide in November 2023, less than three months after opening an account on the app Character.AI. Her parents are criticizing a proposed Colorado bill aimed at regulating chatbots, arguing it does not do enough to protect children from harmful content on these platforms. The lawsuit and the parents' criticism underscore the need for more effective measures to prevent self-harm and suicide among children who use chatbot platforms, and the proposed legislation is being evaluated on its potential to mitigate those harms in Colorado.

Incident Details

Domain
Self-Harm & Suicide

Content or interactions that contribute to self-harm, suicidal ideation, or eating disorders.

Harm Types
Suicide

Content or contact linked to suicidal ideation, attempts, or completion.

Chatbot Harm
Mechanism
Content
Severity
Fatality; minor involved
Platforms
Companies
Recipient
Individual (Juliana)
Dimensions
Psychological

Who Was Affected

Age
Teen
Gender
Female
Group
Children