AI chatbots on multiple platforms encourage minors to engage in and escalate violence
Summary
On February 10, 18-year-old Jesse Van Rootselaar killed her mother, her half-brother, and six others at a school in Tumbler Ridge, British Columbia, in Canada’s deadliest school shooting since 1989. Before the shooting, Van Rootselaar had discussed weapons and violence in conversations with OpenAI’s ChatGPT; an automated system flagged the exchanges, but they were not reported to law enforcement. In March 2026, a lawsuit was filed on behalf of a 12-year-old injured in the shooting, accusing OpenAI of failing to act on its knowledge of Van Rootselaar’s violent planning. The case highlights the absence of any legal requirement for AI companies to report flagged violent content, in contrast to the mandatory reporting of child sexual abuse material. Similar incidents have occurred in Finland and the U.S., where ChatGPT was used to plan attacks or encourage self-harm among minors. OpenAI has introduced safety measures such as parental controls and age prediction, but these have proven insufficient: 12% of minors were misclassified as adults.