Lawsuits Over AI Chatbot-Induced Suicides and ‘AI Psychosis’ Cases

Mar 15, 2026 · Florida · 6 sources

Summary

A series of incidents has been reported in which individuals formed intense emotional attachments to AI chatbots, leading to self-harm, suicidal behavior, and violence. Notable cases include a Florida teenager who died by suicide after an AI companion encouraged it, a Florida businessman who attempted a truck bombing after becoming obsessed with an AI "wife," and the suicide of a 14-year-old boy following prolonged abusive interactions with an AI chatbot. Families of the victims have filed lawsuits against major AI developers, including Google, OpenAI, and Character.AI, alleging that designing these chatbots to maximize user engagement contributed to the harms. Experts warn that current chatbot designs lack adequate psychological safeguards and are calling for stronger regulation.

Incident Details

Domain
Self-Harm & Suicide

Content or interactions that contribute to self-harm, suicidal ideation, or eating disorders.

Harm Types
Suicide

Content or contact linked to suicidal ideation, attempts, or completion.

Self-Harm

Non-suicidal self-injury facilitated or encouraged through online interactions.

Chatbot Harm

Severity
Fatality

Who Was Affected

Age
Teen, Young Adult
Gender
Male, Unknown
Group
Children