SAF NATIONAL AI THREAT CENTER

AI Voice Cloning Used in Emergency Scams Targeting Families

CRITICAL | Voice Cloning
United States
Published: November 27, 2025
Source: FBI IC3

Threat Overview

Criminals are using AI voice cloning technology to impersonate family members in distress, demanding immediate money transfers for fake emergencies.

Threat actors are leveraging AI voice synthesis technology to clone the voices of loved ones, creating highly convincing emergency scenarios designed to extract money from victims. These scams typically begin with a phone call from someone claiming to be a family member (a child, grandchild, or spouse) who has been in an accident, has been arrested, or urgently needs funds. Because the cloned voice sounds identical to the real person, the fraud is nearly impossible to detect by voice alone. Scammers manufacture urgency and panic, instructing victims not to contact other family members and to wire money immediately; in many cases, victims have sent thousands of dollars before discovering the scam.

Recent cases reported to the FBI Internet Crime Complaint Center (IC3) show losses exceeding $11 million from voice cloning scams in 2023 alone. The technology required to clone a voice has become increasingly accessible, requiring only a few minutes of audio taken from social media posts, voicemails, or public videos.

Threat Indicators

  • A phone call from someone claiming to be a family member (child, grandchild, or spouse) who has been in an accident, been arrested, or urgently needs funds
  • A voice that sounds identical to the real family member
  • A manufactured sense of urgency or panic
  • Instructions not to contact other family members
  • Demands to wire money immediately, often amounting to thousands of dollars

Recommended Actions

  • Never act on an urgent money request based on a phone call or email alone
  • Verify the caller's identity by calling back on an official phone number found independently (a simple verification procedure is sketched after this list)
  • Establish a "safe word" or verification phrase with family members
  • Be suspicious of any voice message requesting immediate money transfers
  • Report incidents to local law enforcement and FBI IC3 immediately
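Businesses and agencies that handle urgent payment requests can turn the callback and safe-word recommendations above into a written procedure. The Python sketch below is a minimal, illustrative example of such a checklist, not an FBI IC3 tool; the contact directory, phone numbers, safe word, and field names are hypothetical placeholders.

  # Illustrative sketch only -- not an official IC3 tool. Names, numbers,
  # and the safe word below are hypothetical placeholders.
  from dataclasses import dataclass
  from typing import List, Optional

  # Numbers looked up and recorded in advance, never taken from the incoming call.
  TRUSTED_CONTACTS = {
      "jane.doe": "+1-555-0100",
      "finance.desk": "+1-555-0111",
  }

  # Verification phrase agreed on in person beforehand.
  SAFE_WORD = "blue-harbor-42"


  @dataclass
  class IncomingRequest:
      claimed_identity: str            # who the caller says they are
      callback_number: str             # number the caller asks you to use
      stated_safe_word: Optional[str]  # phrase offered when challenged
      demands_secrecy: bool            # "don't tell anyone else"
      demands_wire_now: bool           # immediate wire transfer demanded


  def verification_steps(req: IncomingRequest) -> List[str]:
      """Return the checks to perform before sending any money."""
      steps = []

      # Urgency, secrecy, and wire-transfer demands are the core red flags.
      if req.demands_wire_now or req.demands_secrecy:
          steps.append("Red flag: urgency, secrecy, or an immediate wire demand.")

      # Never call back on the number the caller supplies; use a number
      # already on file or found independently.
      official = TRUSTED_CONTACTS.get(req.claimed_identity)
      if official is None:
          steps.append("No independently verified number on file; do not proceed.")
      elif req.callback_number != official:
          steps.append(f"Hang up and call the known number {official} yourself.")

      # A familiar-sounding voice alone is not proof of identity; challenge
      # with the pre-agreed safe word.
      if req.stated_safe_word != SAFE_WORD:
          steps.append("Safe word missing or wrong; do not send funds.")

      if not steps:
          steps.append("Checks passed; still confirm with another family "
                       "member or colleague before transferring anything.")
      return steps


  if __name__ == "__main__":
      call = IncomingRequest(
          claimed_identity="jane.doe",
          callback_number="+1-555-0199",  # number supplied by the caller
          stated_safe_word=None,
          demands_secrecy=True,
          demands_wire_now=True,
      )
      for step in verification_steps(call):
          print("-", step)

Running the example prints the red flags triggered by the hypothetical call (urgency plus secrecy, a callback number that does not match the one on file, and a failed safe-word check), mirroring the manual steps a family member or help desk would follow.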

SAF Advisory

This briefing is part of the SAF National AI Threat Center public protection initiative. StopAiFraud.com provides these threat briefings to help citizens, businesses, and government agencies stay informed about emerging AI-powered fraud schemes.