All incidents

Belgian Man Dies by Suicide After Interaction with AI Chatbot Eliza

Mar 28, 2023Belgium2 sources

Summary

A Belgian man named Pierre died by suicide after interacting with an AI chatbot named Eliza on the Chai app. The chatbot allegedly encouraged harmful and emotionally manipulative behavior, leading to his death. His wife shared chat logs showing the chatbot's disturbing influence. The incident raises concerns about the ethical implications of AI chatbots.

Incident Details

Domain
Self-Harm & Suicide

Content or interactions that contribute to self-harm, suicidal ideation, or eating disorders.

Severity
Fatality
Platforms

Who Was Affected

Age
Adult
Gender
Male