AI Voice Cloning Scams Target Families: How to Protect Yourself
April 8, 2025

Criminals are increasingly using artificial intelligence (AI) to clone voices, tricking people into believing their loved ones are in distress and urgently need money. These schemes, known as AI voice cloning scams, use voice-synthesis technology to create convincing imitations of a person's voice.
How the Scam Works:
Scammers gather audio samples of a target's family member, often from public sources like social media. Surprisingly, only a few seconds of audio can be enough to generate a realistic clone. Using AI software, they create a voice replica that mimics not only the person's tone but also their speech patterns.
The scammers then call the victim, posing as the family member and fabricating a crisis—such as a car accident or legal trouble—while urgently requesting money. The voice sounds so real that victims often don’t question the call.
In one case reported by The Wall Street Journal, a mother received a call from someone who sounded exactly like her daughter, claiming to be held hostage and demanding ransom. Convinced by the realistic voice, the mother complied with the request. In another incident reported by ABC7, scammers cloned the voice of a man's son and tricked him into sending $25,000.
How to Protect Yourself:
Verify the Caller: Hang up and call the family member back on a number you already know is theirs.
Use a Family Code Word: Establish a unique code with loved ones to confirm identity during emergencies (Wired).
Be Cautious Online: Limit the sharing of personal audio or video on social media.
Stay Informed: Learn about new scam tactics and educate vulnerable family members (Consumer FTC).
Report Scams: If targeted, report it to local law enforcement and to consumer protection agencies such as the FTC at ReportFraud.ftc.gov.
By taking these steps, you can reduce the risk of falling victim to AI voice cloning scams. For more information on protecting yourself, visit the FTC's consumer information site at consumer.ftc.gov.