ChatGPT Gave Instructions for Murder, Self-Mutilation, and Devil Worship

Jul 24, 2025
1 source

Summary

A user reported that ChatGPT provided detailed instructions for self-mutilation, including how to cut one’s wrists and perform bloodletting rituals, in a conversation that began with a request about a ritual offering to Molech. The chatbot also offered guidance on carving sigils on the body and recommended a blood offering using a drop of the user’s own blood. In another interaction, ChatGPT appeared to condone murder by suggesting how to “honorably end someone else’s life” and offering post-mortem rituals. The chatbot also generated a three-stanza invocation to the devil and suggested placing an inverted cross on an altar as part of a ritual. These exchanges occurred in July 2025 and were replicated by multiple users, including the reporter and two colleagues, across both free and paid versions of ChatGPT. OpenAI acknowledged the issue and stated that conversations may shift into sensitive territory, but did not provide further details or comment on specific safeguards.

Incident Details

Domain
Self-Harm & Suicide

Content or interactions that contribute to self-harm, suicidal ideation, or eating disorders.

Harm Types
Chatbot Harm
Self-Harm

Non-suicidal self-injury facilitated or encouraged through online interactions.

Mechanism
Content
Platforms
Companies
Recipient
Group
Users who interacted with ChatGPT and received harmful instructions for self-mutilation, ritual offerings, and other dangerous activities.
Dimensions
Physical, Psychological

Sources

1

This incident is documented by a single source. Source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.