How to Protect Your Voice From AI Scam Calls

SALT LAKE CITY — It’s common to think of voicemail greetings as a simple personal touch — a friendly “Hey, you’ve reached me, leave a message.” But cybersecurity experts say that familiar voice could become a tool for AI scammers.

According to Brian Long, CEO of the Utah-based cybersecurity firm Adaptive Security, scammers can use just a few seconds of recorded audio to clone your voice using artificial intelligence.

“Your voicemail greeting is the only thing an attacker needs to make a deepfake of your voice,” Long warned.

AI Scams Are Getting More Personal

Imposter scams have been around for decades — from fake IRS calls to fraudulent credit card alerts. But artificial intelligence has taken these scams to a new, deeply personal level.

Instead of relying on generic robocalls, scammers can now mimic the voice of someone you know — a friend, relative, or even a child — to trick you into sending money or sharing sensitive information.

“What’s particularly scary is that they can use the voice of a friend or loved one to ask you to do something,” Long explained. “In many cases, the scammers are overseas and use AI tools to hide accents or speech differences.”

These “voice cloning” scams often rely on short audio samples — sometimes from social media posts, YouTube videos, or even voicemail greetings.

How AI Voice Cloning Works

AI deepfake software can now convincingly replicate someone’s voice from less than 10 seconds of recorded speech. The technology, once confined to entertainment and research, is now being exploited by cybercriminals.

Scammers can feed an audio clip into a voice synthesis model that learns tone, cadence, and inflection — producing an eerily convincing version of the victim’s voice.

They then use that fake voice to call or message family members, often claiming to be in trouble, needing help, or facing an emergency.
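To illustrate how low that barrier has become, here is a minimal sketch of the workflow using Coqui TTS, a widely used open-source voice synthesis library, and its publicly documented XTTS v2 voice-cloning model. The file names and spoken text are hypothetical, and this is only one of several such tools; it is shown so you can gauge how exposed a short public clip of your own voice really is.

```python
# Minimal sketch of voice cloning with the open-source Coqui TTS library
# (pip install TTS). The model name is XTTS v2's public identifier; the
# audio file names are hypothetical placeholders.
from TTS.api import TTS

# Load the multilingual XTTS v2 voice-cloning model
# (the weights download automatically on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of reference audio -- for example, your own saved
# voicemail greeting -- is all the model needs to learn the speaker's
# tone, cadence, and inflection.
tts.tts_to_file(
    text="Hi, it's me. Can you call me back when you get this?",
    speaker_wav="voicemail_greeting.wav",  # hypothetical reference clip
    language="en",
    file_path="cloned_voice.wav",
)
```

The takeaway is the effort involved: one short clip and a handful of lines. That asymmetry is exactly why the advice below focuses on limiting where your voice can be collected.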

Simple Steps to Protect Your Voice

The good news: protecting yourself doesn’t require advanced tech — just a few smart changes to your digital habits.

1. Delete your personal voicemail greeting.
“If your voicemail greeting today is your own voice, delete it and use the default robotic system greeting,” Long advised. That small step removes one of the easiest ways for criminals to collect your voice.

2. Limit voice content on social media.
Avoid posting videos or stories with your voice — especially those featuring your children. Criminals scrape public platforms to collect clean audio samples for cloning.

3. Be skeptical of urgent voice requests.
If a caller claiming to be a loved one asks for money or help, hang up and call them back on a verified number.

4. Use multi-factor verification.
Agree on a “safe word” with family members or close contacts. If someone calls claiming to be in trouble, ask for that word before you act; a cloned voice won’t know it.

5. Stay informed.
AI-driven scams are evolving quickly. Follow cybersecurity resources and local alerts to stay up to date on new tactics.

The Bottom Line

AI cloning technology has made it easier than ever for scammers to sound like someone you trust. But with simple precautions — like using an automated voicemail greeting and thinking twice before sharing your voice online — you can stay one step ahead.

“AI gives criminals a powerful new tool,” Long said. “But awareness and prevention are still the best defenses.”
