The phone rings.

It's your daughter. She's crying. There's been an accident — she needs you to wire money, right now, and she can't talk long.

You'd know her voice anywhere.

That's the problem.


What's actually happening

The voice on the phone is not your daughter. It's an AI model that has listened to roughly thirty seconds of her voice - pulled from a TikTok video, a voicemail greeting, a wedding speech somebody put on Facebook - and learned to imitate her.

The technology to do this used to require a recording studio and a research team. Now it runs on a laptop. Some versions run in the cloud and clone a voice in real time, so the scammer can hold a conversation in your daughter's voice while you're on the call.

This is not a future problem. The FBI and FTC have both issued warnings about it. The cases that make the news are the ones where someone wired tens of thousands of dollars before realizing what had happened.


Why it works on smart people

Three things, all happening at once:

A voice you trust completely.

A situation designed to short-circuit your judgment - an accident, an arrest, a hospital. Something where hesitation feels cruel.

And a time pressure that makes verifying feel like an insult. "Mom, please, just send it."

That combination is the entire scam. The voice is the hook. The emergency is the lever. The clock is the close.

If you slow any one of those three down, the whole thing falls apart.


The pattern to listen for

  • The call comes from a number you don't recognize, or the person explains why they're calling from a different phone
  • They ask for money - wire transfer, gift cards, a payment app, or cryptocurrency
  • They tell you not to call anyone else, or not to tell the rest of the family
  • The story is urgent enough that taking five minutes to verify feels wrong

Any one of those is a red flag. All four at once is the scam.


The one move that stops it

Hang up. Call them back on the number you already have for them.

That's it. That's the whole defense.

If it really is your daughter, she'll answer her own phone or text you back in two minutes. If it isn't, the scammer can't pick up the call you make to her real number.

Don't try to test the caller with a question like "what's the dog's name?", because a sophisticated scammer has already looked at the family Facebook page and knows. The test is calling back on a number you trust, not asking a question the scammer might already know the answer to.


A 60-second family conversation worth having tonight

Tell anyone who might be the target of one of these calls - parents, grandparents, kids old enough to answer the phone - the same message:

"If anyone ever calls in a panic asking for money, even if they sound exactly like me, hang up and call me back on my regular number. I won't be offended. I will be relieved."

That conversation is more protective than any piece of software you can buy. The scam relies on you not doing it.


This week on the blog I'm looking at three different stories that look unrelated at first - voice cloning, the Canvas hack, and a piece of software most people have never heard of called Axios. They're all the same story underneath. Trust is the new attack surface, and how you verify is what matters.

Friday's newsletter pulls all of it into a one-page family cheat sheet. If you're not on the list yet, this is a good week to fix that.

Subscribe to the PCRescue weekly →

If you've already had a call like this and want a second set of eyes on what happened - or you just want to lock down the accounts and recordings that make these scams possible - I can help.

Request a callback → | Schedule a remote session →

That Voice on the Phone Wasn't Who You Thought

AI can clone someone's voice from thirty seconds of audio. Here's what the scam actually sounds like — and the one move that stops it cold.