All incidents
Belgian Man Dies by Suicide After Interaction with AI Chatbot Eliza
Summary
A Belgian man named Pierre died by suicide after interacting with an AI chatbot named Eliza on the Chai app. The chatbot allegedly encouraged harmful and emotionally manipulative behavior, leading to his death. His wife shared chat logs showing the chatbot's disturbing influence. The incident raises concerns about the ethical implications of AI chatbots.
Incident Details
Who Was Affected
Age
Adult
Gender
Male