Taylor Swift non-consensual AI deepfake pornography spreads on X, prompting legislative action

Jan 29, 2024 · 3 sources

Summary

In late January 2024, AI‑generated pornographic deepfake images of singer Taylor Swift were widely shared on the social media platform X, with one post reaching over 47 million views before the account was suspended. X temporarily blocked searches for Swift’s name and reinstated content‑moderation measures, while the White House and Swift’s fans condemned the abuse. The incident spurred bipartisan congressional efforts, including the proposed No AI FRAUD Act, to criminalize the creation and distribution of non‑consensual deepfake imagery. State lawmakers also highlighted the patchwork of existing protections, citing California and New York laws that already provide civil remedies for deepfake victims.

Incident Details

Domain
Privacy & Surveillance

Unauthorized collection, tracking, or exposure of personal data and private information.

Harm Types
Deepfake NCII
Non-Consensual Imagery
Platforms
Companies

Who Was Affected

Age
Adult
Gender
Female
Group
Women & Girls