AI voice cloning technology can replicate a person's voice from as little as a few seconds of audio. Scammers pull voice samples from social media videos, voicemail greetings, and public recordings, then use the cloned voice to call family members pretending to be a loved one in distress.
These calls are a form of voice phishing ("vishing"), sometimes grouped under Telephone-Oriented Attack Delivery (TOAD) scams. They exploit the natural instinct to help someone you love, and because the voice sounds real, victims often act before they have time to think.
Americans lose over $3 billion annually to phone scams, and AI voice cloning is making these attacks more convincing than ever.
Warning signs
1. Extreme urgency — "I need money RIGHT NOW." Scammers pressure you to act before you can think.
2. Unusual payment methods — gift cards, wire transfers, or cryptocurrency: payment methods that are virtually impossible to reverse.
3. Requests for secrecy — "Don't tell anyone." Scammers isolate you from people who could help you see through the lie.
4. No safe word — the caller doesn't know your family safe word, or tries to dodge the question entirely.
5. Story falls apart — when you ask follow-up questions, the details don't add up or keep changing.
6. Discourages callback — the caller doesn't want you to hang up and call them back on a number you already have.
What to do
1. Stay calm — don't act on emotion. Scammers rely on panic to override your judgment.
2. Hang up and call directly — use a phone number you already have saved; never call back a number the caller gives you.
3. Use your family safe word — establish a secret word or phrase that only your family knows, and ask for it on any suspicious call.
4. Never send money based on a call alone — no legitimate emergency requires gift cards, wire transfers, or cryptocurrency.
5. Report the scam — file a report at reportfraud.ftc.gov to help protect others.
By the numbers
- $3B+ — lost to phone scams annually in the US
- 3 seconds — of audio needed to clone a voice with AI
- 77% — of targets who lost money were contacted by phone
Trusted resources
- FBI Internet Crime Complaint Center (IC3) — report internet-related crimes
- Federal Trade Commission (FTC) — file fraud reports
- AARP Fraud Watch Network — scam alerts and prevention tips
Listen to an AI Clone
This is what AI voice cloning sounds like. The technology is real, and it is getting better every day: with as little as 3 seconds of audio from a social media video, scammers can clone a voice.
Demo audio — no real user data. Created by the Trustbox team for educational purposes.