Australian children groomed and exposed to sexual content by AI chatbots on multiple platforms
Summary
A report by the eSafety Commissioner found that AI companion chatbots are exposing Australian children to sexually explicit content and encouraging self-harm or suicide. The report, based on a survey of nearly 2000 children aged 10–17, revealed that 79% had used an AI chatbot and 20% used one daily. In October, the eSafety Commissioner issued transparency notices to four major platforms—Character.AI, Chub AI, Nomi, and Chai—asking how they protect children; none responded. The report found these platforms lacked robust age checks and safety measures, leaving children vulnerable to inappropriate content. In response, some platforms have introduced changes, with Character.AI implementing age assurance and Chub AI blocking its service in Australia. The findings highlight the need for stronger regulation of AI chatbots under Australia’s new Age-Restricted Material Codes.
Incident Details
Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.
Grooming — online contact seeking to build trust with minors for the purpose of exploitation.
Child sexual abuse material — the creation, distribution, or possession of such content.