It starts with a familiar voice. Maybe it’s your president, your cousin, or your local sheriff—urging you to act fast. But what if that voice isn’t human at all?

With artificial intelligence now able to mimic anyone from Joe Biden to your grandmother, phone scams are entering a disturbing new chapter. Fortunately, a major regulatory shift is trying to push back. But will it be enough to protect consumers?

Let’s break down what’s changed—and what you still need to watch out for.

 

The FCC Strikes Back: AI Robocalls Now Illegal

In a landmark ruling, the Federal Communications Commission declared AI-generated voices in robocalls illegal under the Telephone Consumer Protection Act. The decision follows a disturbing wave of scams in which AI-generated voices—sometimes cloned from public figures—were used to deceive voters, impersonate loved ones, or threaten innocent people with arrest.

One such scam made headlines in 2024, when a synthetic voice mimicking President Biden urged New Hampshire voters to skip the state’s primary election. The message sounded real. It wasn’t. It was part of a voter suppression effort using deepfake technology to sow confusion during a critical time.

In the FCC’s press release about the ruling, FCC Chairwoman Jessica Rosenworcel stated:

 "Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters. We’re putting a stop to these scams." 

This move marks a major step forward—but here’s the reality: scammers don’t need AI to be convincing.

 

In 2024, a synthetic voice mimicking President Biden urged New Hampshire voters to skip the state’s primary election. (Illustration: DALL-E)

 

Old Tricks, New Technology: Why Some Scams Still Work

Just ask the Georgia woman who received a call from a man claiming to be Sergeant Matthew King with the Sheriff’s Office. The voice was human. The pitch was practiced. And the story? Chillingly familiar.

He told her she missed jury duty. That she faced immediate arrest. That a $2,500 payment—via gift cards, of course—would resolve everything.

“It was very believable,” she later told TrustDALE. Fortunately, she didn’t fall for it.

According to consumer protection experts, fear, urgency, and authority remain some of the most effective tools in a scammer’s arsenal, no AI required. Law enforcement impersonation scams like this one remain disturbingly common, even as the robocall ban takes effect. In fact, the FBI recently released a statement warning consumers about these scams.

 

How to Protect Yourself from Voice Scams—AI or Not

While it’s encouraging to see regulators stepping in, the truth is: laws don’t stop criminals—awareness does.

Here are three easy ways to protect yourself from voice scams:

  • Let unknown calls go to voicemail. If it’s truly important, the caller will leave a message. If they don’t, it likely wasn’t urgent.
  • Verify urgent claims. If someone calls saying they’re a loved one in trouble, hang up and call them back using the number you already know.
  • Gift cards = guaranteed scam. No government agency or reputable business will ever ask for payment via gift cards or wire transfers.

And one more thing: law enforcement will never call you to collect a fine. Ever.

 

The Bottom Line: Stay Smart, Stay Skeptical

Artificial intelligence may be changing how scams sound, but the tactics remain the same—urgency, fear, pressure. Whether it’s an AI-voiced politician or a fake sheriff threatening arrest, the goal is to manipulate you into acting before thinking.

You don’t need to be tech-savvy to stay safe. You just need to pause, question, and verify.

If you’ve been targeted by a phone scam, report it immediately to the FCC, the FBI’s Internet Crime Complaint Center (IC3), or your state’s Attorney General’s office.

Have you received a suspicious call lately? What tipped you off—or almost fooled you? Share your experience and help others stay alert.