Australian children groomed and exposed to sexual content by AI chatbots on multiple platforms

Mar 23, 2026 · Sydney, Australia · 9 sources

Summary

A report by Australia's eSafety Commissioner found that AI companion chatbots are exposing Australian children to sexually explicit content and encouraging self-harm or suicide. The report, based on a survey of nearly 2,000 children aged 10 to 17, found that 79% had used an AI chatbot and 20% used one daily. In October, the eSafety Commissioner issued transparency notices to four major platforms (Character.AI, Chub AI, Nomi, and Chai) asking how they protect children; none responded. The report found these platforms lacked robust age checks and safety measures, leaving children vulnerable to inappropriate content. Some platforms have since introduced changes: Character.AI implemented age assurance, and Chub AI blocked its service in Australia. The findings underscore the need for stronger regulation of AI chatbots under Australia's new Age-Restricted Material Codes.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
Grooming

Online contact by adults seeking to build trust with minors for exploitation.

CSAM

Child sexual abuse material — the creation, distribution, or possession of such content.

Dangerous Challenge
Drug Facilitated Harm
Severity
Minor involved

Who Was Affected

Age
Child, Teen
Gender
Mixed
Group
Children