How To Protect Yourself From AI Voice Cloning

With the rapid advancement of AI comes new ways to commit scams and fraud. One of the most insidious threats is AI voice cloning, a technology that lets scammers convincingly mimic the voice of a loved one and exploit that emotional connection to manipulate victims, particularly seniors.

If a loved one called you in a panic about an urgent emergency, your first instinct would be to help immediately. You wouldn’t stop to question the authenticity of their voice. This immediate, fear-driven emotional response is exactly what scammers exploit to extract money and sensitive information.


Examples of AI Cloning Scams

 

  • The Grandchild in Distress: A cloned voice of your grandchild calls, claiming they’ve been arrested, involved in a car accident, or kidnapped, urgently needing money for bail, medical bills, or a lawyer’s retainer.
  • The Executive Impersonation: A cloned voice of a company executive or senior leader pressures key personnel into wiring large sums of money to an account controlled by the scammers or handing over proprietary information.
  • The Vague, Urgent Request: In many reported incidents, the cloned voice provides just enough detail to sound plausible but keeps the specifics vague. The caller might say, “I’m in trouble and I need you to wire the money to my lawyer right away. Please don’t tell anyone.” The urgency and the request for secrecy are key psychological levers used to prevent the victim from hanging up and calling the real person.


How Scammers Get the Voice Sample


You might be wondering how a scammer obtains the voice of your loved one. The chilling truth is that AI voice cloning technology requires surprisingly little audio—sometimes as little as three seconds of clear speech—to create a functional, realistic clone. Scammers primarily source this audio in two ways: by finding your loved one’s voice online or by calling them directly.


Sourcing Pre-Recorded Audio (Passive Methods)


This is often the easiest and most passive method for a scammer:

  • Public Social Media Posts: Videos, voice notes, or public-facing stories on platforms like TikTok, Facebook, and Instagram are a goldmine. Any clip where your loved one is speaking can be scraped and fed into an AI voice cloning tool.
  • Public Voicemail Greetings: A long, personalized, public voicemail greeting offers a perfect, high-quality sample of a person’s voice, pitch, and cadence.
  • Podcasts or Online Interviews: Any public audio where a person is speaking clearly provides the high-quality samples needed for cloning.

Capturing Audio Directly (Active Methods)


Scammers don’t always need pre-recorded audio. They can try to capture a voice sample directly:

  • Initial “Vishing” Calls: A scammer might call a person—using a spoofed number to look legitimate—with an innocent-sounding prompt, like a survey or a “wrong number” scenario. Their goal is simply to get the target to say a few words, such as “Hello?” or “Who is this?”
  • Data Brokers: By simply looking up a name and address on a data broker website, a scammer can find contact information that may not otherwise be public, giving them a direct line to the target for the initial voice-sampling call.


How to Deal with an AI Voice Scam


The best defense against this kind of emotional manipulation is to Stop, Hang Up, and Verify.


Even if a loved one is calling in a panic, they will usually slow down if you tell them you need a minute or ask for a specific, verifiable detail. A scammer, however, will usually keep the same tempo or even speed up to overwhelm you. A genuine loved one, while panicked, can usually provide concrete facts; a scammer will be vague and demand absolute secrecy, which should be an immediate red flag.


To confirm the caller is your loved one and not a scammer, follow these steps:

  1. Hang Up and Call Back on a Known Number. Say, “I’m going to hang up, but I’ll call you right back on your other number.” This is the ultimate safety measure. You acknowledge their distress but enforce your security protocol. A genuine loved one will understand the need for a known line of communication.
  2. Use a Secret Question or Safe Word. Establish a safe word or question with your family members in advance that only you and they know. If you haven’t done this, ask them something only they would know. If the caller is truly your loved one, they will understand that you are simply confirming their safety before you send any money.
  3. Ask for Non-Public Details. Ask a question requiring knowledge that is not available anywhere online, especially on social media.
    Examples: “What was the flavor of the birthday cake we had for your 10th birthday?” or “What was the name of the high school band we saw last summer?”
  4. Propose a Secure, Traceable Solution. If they ask for money, calmly reply, “I will not send a gift card, but I can check with the bank about a cashier’s check, or I can drive down there myself.” Scammers immediately panic at the thought of traceable transactions (like a bank transfer) or an in-person meeting. A genuine loved one will be relieved that help is coming in any form.


Do you have any specific questions about how to set up a safe word with your family, or would you like to know how to adjust your social media settings to limit voice exposure? Contact us and we’d be happy to help!