HR 1734 — Preventing Deep Fake Scams Act
Establishes a Task Force on Artificial Intelligence in the Financial Services Sector to study AI-related risks in financial services and report to Congress, including deepfake voice fraud and scams targeting bank customers. This is a study-and-reporting bill, not an enforcement or criminalization measure. Introduced February 27, 2025, and referred to the House Committee on Financial Services. Senate companion bill: S.2117, introduced June 18, 2025. Neither has advanced beyond committee.
Related Incidents
Same harm domain; actors and location may differ
Retired Army officer loses ₹1 crore to deepfake investment scam using AI-generated Modi and Sitharaman videos
Middle-aged couple in Gujarat loses $300 to AI voice cloning fraud targeting their son
69-year-old Indiana retiree loses $10,000 to pig-butchering scam via Facebook and encrypted messaging apps
Finance director loses $499,000 to deepfake Zoom call impersonating senior executives in Singapore
78-year-old Birmingham widow loses $11,000 to AI voice cloning scam
Related Legislation
Other policies covering the same harm domain
Linked Litigation
3 legal cases linked to this policy