AI-generated child sexual abuse material overwhelms law enforcement in Indiana
Summary
Law enforcement agencies in Indiana are struggling to manage a surge in AI-generated child sexual abuse material (CSAM). Cases include a Fishers pastor's son accused of creating AI-generated photos of nude pregnant toddlers, an Elwood school custodian who altered a student's Instagram photo, and a 71-year-old Evansville man convicted of using AI to generate explicit images of children under 12. Reports of AI-generated CSAM rose from 4,700 in 2023 to over 1 million in the first nine months of 2025, according to the National Center for Missing and Exploited Children. These reports are routed to the Indiana State Police's Internet Crimes Against Children Task Force for investigation. Prosecutors and law enforcement warn that the growing volume of AI-generated content is overwhelming already overburdened forensic teams and that additional funding and resources are needed to address the crisis.
Incident Details
Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.
Child sexual abuse material: the creation, distribution, or possession of such content.