Female US congresswomen targeted by AI-generated non-consensual intimate imagery leading to legislative response

Jan 1, 2024

Summary

Research published in 2024 found that at least one in six US congresswomen had been targeted with AI-generated non-consensual intimate imagery (NCII), indicating that deepfake sexual-abuse material was being systematically deployed to intimidate female elected officials. The finding was cited in congressional testimony supporting passage of the DEFIANCE Act and the Take It Down Act.

Incident Details

Domain
Privacy & Surveillance

Unauthorized collection, tracking, or exposure of personal data and private information.

Harm Types
Deepfake NCII
Non-Consensual Imagery

Who Was Affected

Age
Adult
Gender
Female
Group
Women & Girls

Sources

1 source

This incident is documented by a single source. The source count reflects coverage in our monitored feeds, not the totality of reporting, and we do not evaluate publication quality.