Your voice can be cloned by AI and 1 in 4 Americans are being targeted — expert shares how to stop them

landline phone
(Image credit: Future/AI)

During the 2000s, email scams became all too commonplace: the promise of receiving a huge sum of money if you first paid a “transfer” fee ran rampant and fooled many innocent victims.

Now, we’ve entered a new age of scamming, one in which scams have become far more sophisticated, driven by the rise of AI and voice cloning. A recent study conducted by McAfee highlights just how dangerous voice cloning scams have become. According to the cybersecurity firm, 1 in 4 adults have now experienced an AI voice cloning scam or know someone who has. The financial losses connected to those scams are alarming as well: the Deloitte Center for Financial Services projects that fraud losses caused by generative AI will grow from $12.3 billion in 2023 to $40 billion by 2027.

James Grifo, an audio security expert and the CEO of Audio Visual Nation, is well-versed in the obvious (and not-so-obvious) signs to look for when you suspect you’re being targeted by an AI voice cloning scam, and in why this type of scam is so harmful.


Here’s what he had to say about how to keep you and your loved ones safe from this growing threat.

Red flags and safety measures

Concerned elderly woman reads payment card while hold phone.

(Image credit: Daisy Daisy/Shutterstock)

Grifo is quick to mention how today’s scammers only need a few recorded seconds of anyone’s voice to pull off a convincing AI voice clone.

“What makes voice cloning particularly predatory is how accessible the technology has become,” he noted. “Scammers can now create a convincing clone of your voice using just 3 seconds of audio pulled from a social media video or voicemail.”

There are several telltale signs to listen for whenever your gut instinct tells you that you’re being targeted. The most important ones include:

  • Random calls from unknown or suspicious numbers
  • A sense of urgency
  • A strange vocal rhythm and unnatural pauses from the person speaking
  • Flat, emotionless voice audio that sounds “un-human”

Grifo offered a list of tactics to employ that’ll keep you and your family safe from AI voice cloning scams:

  • Establish a verbal codeword with loved ones: “This is one of the most effective defenses. A scammer won't know your family's codeword, no matter how convincing the voice sounds.”
  • Always question the source: “Ask yourself: Would my son really call me for $5,000 without any prior warning? Does this situation align with what I know about this person's life?”
  • Use the "Call Back Rule": “Never send money or share sensitive information based on a single phone call. Take the time to verify, even if the caller is pressuring you to act immediately.”
  • Ask personal questions only they would know: “‘What did we have for dinner last night?’ or ‘What was the name of your childhood dog?’ AI can clone a voice, but it can't clone memories. A scammer won't be able to answer personal questions that require intimate knowledge of your relationship.”
  • Think before sending money or sharing personal data: “No legitimate emergency requires you to wire money or share passwords within minutes. If someone is genuinely in trouble, they'll understand you need to verify first.”

Employing these tactics while also using a bit of common sense goes a long way toward avoiding a huge financial loss and leak of sensitive information.

Bottom line

It’s hard enough telling real-world visuals apart from AI-generated ones. It’s even more difficult to discern whether the person speaking to you over the phone is someone you actually know, or an AI being used to mimic that familiar individual.

With this advice and a healthy dose of gut instinct, you’ll be well equipped to steer clear of AI voice-cloning scams.

“What makes these scams so dangerous is the combination of emotional manipulation and technological sophistication,” Grifo went on to say. “Scammers are exploiting our natural instinct to help loved ones in distress, and they're doing it with tools that cost almost nothing and require minimal technical skill.”

And here is Grifo’s final bit of guidance on the matter: “My advice is simple: make skepticism your default response to unexpected calls requesting money or sensitive information, even if the voice sounds exactly like someone you know. And remember, real emergencies can wait the two minutes it takes to verify who you're actually speaking with. That brief pause could save you thousands of dollars and immense emotional trauma.”



Elton Jones
AI Writer

Elton Jones covers AI for Tom’s Guide, and tests all the latest models, from ChatGPT to Gemini to Claude to see which tools perform best — and how they can improve everyday productivity.

He is also an experienced tech writer who has covered video games, mobile devices, headsets, and now artificial intelligence for over a decade. Since 2011, his work has appeared in publications including The Christian Post, Complex, TechRadar, Heavy, and ONE37pm, with a focus on clear, practical analysis.

Today, Elton focuses on making AI more accessible by breaking down complex topics into useful, easy-to-understand insights for a wide range of readers.
