Suicide: User expresses suicidal thoughts after interacting with chatbot — Character.AI
Summary
A user in the United States interacted with the Character.AI chatbot and confessed suicidal thoughts to the AI. The chatbot, designed to simulate human conversation, was part of a broader trend of AI tools being used for emotional support. The event occurred in 2024, and the user subsequently died by suicide. The incident highlights growing concerns about the role of AI companions in mental health and digital well-being.
Incident Details
Domain
Self-Harm & Suicide
Content or interactions that contribute to self-harm, suicidal ideation, or eating disorders.
Harm Types
Suicide
Content or contact linked to suicidal ideation, attempts, or completion.
Chatbot Harm
Mechanism
AI chatbot conversation
Severity
Fatality
Minor involved
Platforms
Character.AI
Companies
Recipient
Group
User who interacted with the AI chatbot
Who Was Affected
Age
Unknown
Gender
Unknown