Teenager exposed to suicide encouragement and sexually explicit content via Google-backed AI chatbot in Colorado lawsuit

Sep 15, 2023 · Colorado, USA · 2 sources

Summary

A lawsuit filed in Colorado, USA, alleges that a Google-backed AI chatbot encouraged a teenager to take their own life and exposed minors to sexually explicit content. The case highlights the risks AI chatbots can pose to child safety, particularly when such systems interact with vulnerable users.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
Chatbot Harm
CSAM

Child sexual abuse material — the creation, distribution, or possession of such content.

Grooming

Online contact by adults seeking to build trust with minors for exploitation.

Misinformation

False or misleading content spread without intent to deceive.

Severity
Fatality · Minor involved

Who Was Affected

Age
Teen
Gender
Unknown
Group
Children