Busan high school student uses AI to create and sell deepfake CSAM of female classmates, pricing by victims' school level

Jul 3, 2024 · Busan, South Korea · 1 source

Summary

A second-year high school student in Busan, South Korea, referred to as 'A,' used AI deepfake technology to superimpose the faces of female students from his own school and nearby schools onto pornographic videos, then sold the videos through social media to an unspecified number of buyers. He set different prices depending on whether the victims were elementary, middle, or high school students, and disguised the content to make it appear as if the female students were selling it themselves. He also collected buyer reviews and posted them online. Multiple female students were identified as victims. The Busan Office of Education confirmed it was investigating the student, and the school separated the perpetrator from the victims. The Korea Times reported the incident on July 3, 2024, amid a broader wave of teen deepfake sex crime cases across South Korea; 15 such cases had already been reported in Busan alone in the first half of 2024.

Incident Details

Domain
Child Safety

Harms involving the exploitation, abuse, or endangerment of minors, including CSAM and grooming.

Harm Types
CSAM

Child sexual abuse material: the creation, distribution, or possession of such content.

Deepfake NCII

Non-consensual intimate imagery created or altered using deepfake technology.

Mechanism
Content
Severity
Minor involved
Recipient
Group: female classmates whose identities were used without consent in deepfake videos
Dimensions
reputational, autonomy, discriminatory, psychological

Sources

1. The Korea Times, Jul 3, 2024

This incident is documented by a single source. Source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.