AI-generated child sexual abuse material overwhelms law enforcement in Indiana

Dec 1, 2024 · Evansville, Indiana · 1 source

Summary

Law enforcement agencies in Indiana are struggling to manage a surge in AI-generated child sexual abuse material (CSAM). Cases include a Fishers pastor's son accused of creating AI-generated photos of nude pregnant toddlers, an Elwood school custodian who altered a student's Instagram photo, and a 71-year-old Evansville man convicted of using AI to generate explicit images of children under 12. Reports of AI-generated CSAM rose from 4,700 in 2023 to over 1 million in the first nine months of 2025, according to the National Center for Missing and Exploited Children. These reports are forwarded to the Indiana State Police's Internet Crimes Against Children Task Force for investigation. Prosecutors and law enforcement warn that the growing volume of AI-generated content is overwhelming already overburdened forensic teams, and that additional funding and resources are needed to address the crisis.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
CSAM

Child sexual abuse material — the creation, distribution, or possession of such content.

Mechanism
Content
Severity
Minor involved
Platforms
Recipient
Group: Children depicted in AI-generated or altered child sexual abuse material, including real children whose images have been manipulated
Dimensions
Physical, psychological, reputational, autonomy, discriminatory

Who Was Affected

Age
Child
Gender
Unknown
Group
Children