Phishing on Steroids: How AI Supercharges Scams

Scam Prevention
Jessica Long | 4 min read | Published Feb 13, 2026

A notification flashes: Your account is being debited for an iPhone.

Seconds later, a call. When you answer, a trembling voice says, “Grandpa, it’s me.” 

The thing is, these days? Both could be fake. 

The threat is so real and so widespread that the U.S. Senate Special Committee on Aging put the warning into a plain-language flyer you can share with family: "Emerging Threat: Artificial Intelligence."

The same committee’s 2025 report, Age of Fraud: Scams Facing Our Nation’s Seniors, flags a jump in sophisticated scams that lean on artificial intelligence, and the FBI’s 2024 IC3 Annual Report tallies $16.6 billion in losses tied to nearly 900,000 complaints.

Family emergency scams lean on voice cloning and deepfakes to impersonate a loved one in danger, pushing you toward an immediate transfer. 

When a Scam Learns to Imitate You

AI is far from magic… but it’s pretty good at imitation. The Aging Committee describes it as technology that lets machines mimic human-like behaviors, such as speech or writing. That “almost human” quality is exactly what scammers rent, like a costume pulled from a crowded closet.

Three tools show up again and again. Chatbots can coax out, store, or manipulate personal data. Voice cloning can copy someone you know from just a small sample. Deepfakes can produce video or images that look authentic enough to trigger trust before your brain catches up. The committee’s advice is blunt for a reason: keep personal and sensitive information away from online chatbots.

The Three Acts: Phishing, Panic, and Romance

Remember the good ol’ days when phishing scams were sloppy? Now AI makes them personalized, fast, and… weirdly polite. The Senate report warns that fraudsters can use AI to build spear-phishing emails that imitate convincing dialogue and slip past traditional spam filters. Sometimes it’s just an email; sometimes it’s a “transaction update” text about a pricey purchase that tries to funnel you to a callback number.

Then comes the emotional hook. Family emergency scams lean on voice cloning and deepfakes to impersonate a loved one in danger, pushing you toward an immediate transfer. Romance scams use fake profiles and chatbots that keep the conversation flowing, day after day, until giving them money feels like “love.” The FTC has warned about this pattern, including its consumer alert on harmful voice cloning.

Romance scams use AI to keep the conversation flowing, day after day, until giving them money feels like “love.” 

Friction Wins: Your Anti-Fraud Routine

Scams thrive on speed. Your best defense is friction. Put a few deliberate speed bumps between the story and your wallet:

  • Stop and breathe. Urgency is a tactic.
  • Verify using a number you already have, not one in a text or email.
  • Consider a safe word that only close contacts know.
  • Do not transfer or send money to unknown locations.
  • Do not buy gift cards for “ransom” (even if, and especially if, it’s “the Law”).
  • Report the scam to authorities and the company being impersonated.

If you think you have been targeted, get help while details are fresh. The Senate Aging Committee runs a toll-free Fraud Hotline (1-855-303-9470, weekdays 9 a.m. to 5 p.m. Eastern).

You can also report scams to the FTC at ReportFraud.ftc.gov and browse current scam guidance in the FTC’s scams library. For money-related scams, the CFPB’s fraud and scams tools can help you recognize and respond.

AI was used to assist our editors in the research of this article.
#deepfake
#phishing scams
#romance scams
#grandparents scam
#elder fraud
#scam reporting
#FTC ReportFraud
#Senate fraud hotline
#CFPB fraud resources