Middle-aged mother defrauded of $3,270 via AI voice cloning scam impersonating daughter
Summary
Criminals are using AI voice-cloning technology to impersonate family members in fake kidnapping calls, causing emotional trauma and financial loss. In August 2023, a mother identified as Rachel received a call from a scammer who used a convincing AI-generated clone of her daughter's voice; the caller claimed the daughter had been in a car accident and was being kidnapped. The scammer instructed Rachel to send $3,270 to Mexico, and she realized it was a scam only when her real daughter reached her on another line. Scammers can clone a voice from as little as two seconds of audio, often obtained from social media posts, voicemails, or customer service calls, and experts warn that these scams are growing more sophisticated as AI tools rapidly evolve. Law enforcement agencies declined to investigate Rachel's case, citing the difficulty of prosecuting overseas cybercrimes. The Department of Homeland Security has also warned that deepfake audio poses growing threats to national security and personal finances.
Incident Details
Sources
2
Source count reflects articles in our monitored feeds. We do not evaluate publication quality or rank sources by credibility.