All incidents

Young user exposed to child sexual abuse images via Stable Diffusion AI model

May 1, 2024

Summary

According to a Forbes article, a case involving child sexual abuse images has cited the Stable Diffusion AI model. The model, known for generating images from text prompts, has been linked to the creation or distribution of illegal content. The incident underscores concerns about how AI tools can be misused to produce digital harms, and the case is part of broader efforts to address the risks associated with AI-generated content. The article does not disclose specific details about the individuals involved or the location.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
CSAM

Child sexual abuse material (CSAM): the creation, distribution, or possession of such content.

Severity
Minor involved
Companies
Stable Diffusion

Who Was Affected

Age
Child
Gender
Unknown
Group
Children