TLDR
One in four Americans has received a phone call using a deepfake voice, and $200 billion+ in global fraud is now attributed to deepfake-enabled scams. ScamVerify™ tracks phone scam patterns across 4 million+ FTC complaints, and one trend is unmistakable: AI voice cloning has fundamentally changed the threat landscape. Modern cloning tools need just 3 seconds of audio to create a convincing replica. In February 2024, the FCC ruled that AI-generated voices in robocalls are illegal under the TCPA, but enforcement has not kept pace with the technology.
How AI Voice Cloning Works
Voice cloning technology has crossed what Fortune called the "indistinguishable threshold," meaning cloned voices are now realistic enough that most people cannot tell them apart from real speech. Here is how the technology works in practice:
- Audio collection: The attacker obtains a sample of the target's voice. Social media videos, voicemail greetings, podcast appearances, conference talks, and even short phone calls provide enough material.
- Model training: AI models analyze the vocal characteristics including pitch, tone, cadence, accent, and speech patterns. Modern tools require as little as 3 seconds of clean audio.
- Real-time synthesis: The attacker types or speaks, and the AI converts their input into the target's voice in real time with minimal latency.
- Delivery: The cloned voice is used in a phone call, often combined with caller ID spoofing to display the target's real phone number.
The cost of entry has plummeted. Commercial voice cloning services are available for $5 to $30 per month. Open-source alternatives are free. The technology that required a research lab five years ago now runs on a consumer laptop.
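To make the synthesis step concrete without reproducing a cloning tool, here is a minimal sketch of the type-text, hear-audio loop using pyttsx3, an offline text-to-speech library that speaks in a generic system voice. It does not clone anyone; cloning services layer a speaker model trained on the stolen audio sample onto this same loop, but the interaction the attacker uses looks much like this.

```python
# Benign illustration of the real-time synthesis loop (step 3 above).
# pyttsx3 speaks in a generic system voice -- it does NOT clone anyone.
# Cloning tools add a speaker model onto this same type-and-speak loop.
import pyttsx3

engine = pyttsx3.init()            # uses the operating system's speech engine
engine.setProperty("rate", 160)    # speaking rate in words per minute

while True:
    line = input("Type a line to speak (blank to quit): ")
    if not line:
        break
    engine.say(line)               # queue the text for synthesis
    engine.runAndWait()            # synthesize and play the audio
```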
The Scale of the Problem
| Metric | Figure | Source |
|---|---|---|
| Americans who received a deepfake call | 1 in 4 (25%) | McAfee Global Survey 2025 |
| Global deepfake fraud losses | $200B+ | Deloitte 2025 |
| Audio needed to clone a voice | 3 seconds | Microsoft VALL-E research |
| Largest single deepfake fraud | $25M | Hong Kong video conference scam (2024) |
| AI scam calls to major retailers | 1,000+/day | Pindrop 2025 State of Fraud |
| Profitability of AI-enabled scams vs. traditional scams | 4.5x more profitable | FBI IC3 2025 |
| Increase in impersonation tactics | 1,400% since 2023 | CrowdStrike 2025 |
How Scammers Use Voice Cloning
The Grandparent Scam (Evolved)
The traditional grandparent scam relied on a caller saying "Hi Grandma, it's me" and hoping the grandparent would fill in a name. AI voice cloning eliminates the guesswork entirely.
The new version works like this:
- Scammer finds a young person's social media profile with video or audio content
- Scammer clones the voice and calls the grandparent
- The call sounds exactly like the grandchild, saying they are in jail, in an accident, or stranded overseas
- A second scammer calls pretending to be a lawyer, police officer, or hospital administrator
- The victim is instructed to send money via wire transfer, gift cards, or cryptocurrency
Victims report that the voice was so realistic they had no doubt it was their grandchild. Some cases have involved cloned voices replicating crying, panic, and emotional distress.
CEO Fraud (Business Email Compromise by Phone)
Business email compromise (BEC) has expanded to voice. In the most notable case, scammers used AI to clone the voice of a Hong Kong company's CFO and conducted a video conference call with a finance employee. The employee transferred $25 million before the fraud was discovered. The scammers had not only cloned the CFO's voice but also created deepfake video of multiple executives on the call.
In less sophisticated versions, an employee receives a phone call from their "CEO" or "CFO" requesting an urgent wire transfer. The voice matches perfectly because it was cloned from earnings calls, conference presentations, or media interviews, all publicly available.
Kidnapping Ransom Calls
The most terrifying application: scammers clone a family member's voice and call claiming the person has been kidnapped. The victim hears their loved one screaming or pleading in the background. The scammer demands immediate payment, typically $5,000 to $50,000 in cryptocurrency or wire transfer.
In reality, the "kidnapped" family member is safe and unaware their voice is being used. The FBI has reported a significant increase in virtual kidnapping schemes using voice cloning technology since 2024.
The FCC Response
In February 2024, the FCC issued a declaratory ruling that AI-generated voices in robocalls are "artificial" under the Telephone Consumer Protection Act (TCPA). This means:
- AI voice robocalls without prior express consent are illegal
- State attorneys general can prosecute AI voice robocall violations
- The TCPA's private right of action applies, allowing victims to sue for $500 per call, or up to $1,500 per call for willful violations
- Carriers can block calls identified as using AI-generated voices
However, enforcement remains challenging. AI voice calls are difficult to distinguish from real voices in automated screening systems, and the callers are often based overseas.
As we explored in our analysis of how AI is making scams harder to spot, the technology is advancing faster than the regulatory response.
How to Protect Yourself
The Family Safe Word
The single most effective defense against AI voice cloning scams is a family safe word. Choose a word or phrase that:
- Is known only to family members
- Is not something you would post on social media
- Is easy to remember under stress
- Is agreed upon in advance, not during a suspicious call
If you receive a call from a family member asking for money, ask for the safe word. A real family member will know it. A scammer using a cloned voice will not.
Verification Steps
- Hang up and call back using the person's real phone number from your contacts, not a number provided during the call
- Ask a personal question that only the real person would know, something not available on social media
- Listen for audio artifacts such as unusual pauses, robotic undertones, or inconsistent background noise
- Be skeptical of urgency; scammers pressure you to act immediately precisely because verification defeats them
- Contact other family members to confirm the story before sending any money
Digital Hygiene
Reduce the amount of voice data available for cloning:
- Set social media profiles to private
- Limit the amount of video and audio content you post publicly
- Be cautious about answering calls from unknown numbers (each call provides audio samples)
- Consider using a different voice or keeping responses short when answering unknown callers
- Check your phone number's exposure with ScamVerify's phone lookup
What to Do If You Are Targeted
- Do not send money. Hang up immediately and verify through a separate channel.
- Call the person directly using a number you already have saved.
- Report to the FBI's IC3 at ic3.gov if money was lost.
- Report to the FTC at ReportFraud.ftc.gov.
- Contact local law enforcement if you received a kidnapping ransom call.
- Report the phone number through ScamVerify's phone lookup to alert other potential victims.
For guidance on what to do if you already engaged with a scam caller, see our guide on what to do if you answered a scam call.
FAQ
Can AI clone my voice from a voicemail greeting?
Yes. A standard voicemail greeting provides 5 to 15 seconds of audio, which is more than enough for modern voice cloning tools. Some security experts recommend keeping voicemail greetings as short as possible or using a default system greeting. However, the most practical defense remains the family safe word, since even a perfect clone cannot produce information it does not have.
How can I tell if a call is using a cloned voice?
Current AI voice cloning has subtle tells: unusual breathing patterns, slight robotic quality during emotional speech, inconsistent background noise, and unnatural pauses when the attacker is typing their next line. However, the technology is improving rapidly and these artifacts are becoming harder to detect. Do not rely on your ability to detect a deepfake. Instead, verify identity through a callback or safe word.
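For readers comfortable with a little code, one of those tells (long, unnatural pauses while the attacker types) can be illustrated programmatically. The toy sketch below uses the open-source librosa audio library on a saved recording; the filename and the 1.5-second threshold are arbitrary choices of ours. It is not a deepfake detector and is no substitute for callback or safe-word verification.

```python
# Toy heuristic: flag long gaps of silence between speech segments in a
# recorded call. Illustrative only -- real deepfake detection is far more
# involved, and legitimate callers pause too. Requires librosa + soundfile.
import librosa

def long_pauses(path, top_db=30, min_pause_s=1.5):
    """Return the lengths (in seconds) of pauses longer than min_pause_s."""
    y, sr = librosa.load(path, sr=None)               # audio at native rate
    speech = librosa.effects.split(y, top_db=top_db)  # non-silent intervals
    gaps = [(nxt[0] - prev[1]) / sr
            for prev, nxt in zip(speech[:-1], speech[1:])]
    return [round(float(g), 2) for g in gaps if g >= min_pause_s]

if __name__ == "__main__":
    pauses = long_pauses("recorded_call.wav")         # hypothetical file
    if pauses:
        print(f"Long pauses detected (seconds): {pauses}")
```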
Are there apps that can detect deepfake voices?
Several companies are developing deepfake voice detection tools, including Pindrop, Resemble AI, and Reality Defender. Some phone carriers are beginning to integrate AI voice detection into their call screening. However, no consumer tool is yet reliable enough to depend on as your sole defense. The safe word and callback verification remain more effective.
Is it illegal to clone someone's voice?
The legality varies by jurisdiction and use case. The FCC's 2024 ruling confirmed that AI-generated voices in robocalls fall under the TCPA, making such calls illegal without prior express consent. Several states have passed or proposed laws specifically addressing voice cloning for fraud. Using a cloned voice to commit fraud is illegal under existing wire fraud, identity theft, and impersonation statutes regardless of specific AI legislation.
How do scammers get my family member's voice?
Social media is the primary source. TikTok videos, Instagram stories, YouTube content, podcast appearances, conference recordings, and even LinkedIn video posts all provide usable audio. Phone calls to the target (even brief ones answered with "Hello?") can also capture enough audio. Public figures and active social media users are at higher risk because more audio samples are available.