Kentucky teen Elijah Heacock, 16, dies by suicide after scammers use AI nudify app to create fake explicit image and demand $3,000

Jul 1, 2025 · 2 sources

Summary

Elijah Heacock, 16, of Kentucky, died by suicide after receiving threatening texts demanding $3,000 to suppress an AI-generated nude image of him. Scammers had used a nudify app — one of a class of AI tools that strip clothing from photos or fabricate explicit imagery — to create a fake nude image, then used it to extort him. His parents discovered the blackmail texts after his death. His father, John Burnett, told CBS News: 'The people that are after our children are well organised. They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child.' US investigators were examining the case. The FBI had warned of a 'horrific increase' in sextortion cases targeting US minors aged 14 to 17, calling the rate of suicide among victims alarming.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
Sextortion

Mechanism
Content

Severity
Fatality; minor involved

Recipient
Individual (a 16-year-old boy)

Dimensions
Psychological, reputational, autonomy