SAF NATIONAL AI THREAT CENTER

AI-Powered Phone Systems Impersonate Banks to Steal Credentials

Threat Level: ELEVATED
Category: Banking Fraud
United States
Published: November 27, 2025
Source: FDIC

Threat Overview

Sophisticated AI-driven interactive voice response systems are impersonating major banks, tricking customers into revealing account credentials and one-time passwords.

Financial institutions are warning customers about a new generation of phone-based phishing attacks that use AI-powered interactive voice response (IVR) systems to impersonate legitimate bank customer service lines. These systems sound remarkably authentic, complete with appropriate hold music, menu options, and professionally voiced prompts.

Victims typically receive text messages or emails claiming there is suspicious activity on their account and providing a phone number to call. When they call, they reach an AI-powered system that sounds identical to their actual bank's phone system. The system walks them through "security verification" steps that trick them into providing account numbers, PINs, Social Security numbers, and one-time passwords.

What makes these scams particularly dangerous is the sophistication of the AI voice systems. They can handle natural language queries, respond to customer questions, and adapt their responses based on the conversation flow. Some systems even simulate a transfer to a "fraud specialist" (another AI voice or a human scammer) who provides additional false reassurance.

Banking regulators report that losses from these AI-impersonation scams have surged 300% in the past year. Customers are urged never to call phone numbers provided in unsolicited messages, and instead to use official contact numbers from their bank's website or the back of their credit or debit cards.

Threat Indicators

  • Unsolicited text messages or emails warning of suspicious account activity and urging you to call a provided phone number
  • Callback numbers that do not match the number on your bank's website or the back of your card
  • Automated phone systems that closely mimic your bank's real customer service line, including hold music, menu options, and professionally voiced prompts
  • "Security verification" prompts requesting account numbers, PINs, Social Security numbers, or one-time passwords
  • Transfers to a "fraud specialist" (another AI voice or a human scammer) who offers reassurance while pressing you to confirm credentials or codes

Recommended Actions

  • Never share banking credentials, OTPs, or personal information via phone or email
  • Verify all banking communications by contacting your bank directly using official channels
  • Enable multi-factor authentication on all financial accounts
  • Monitor account activity daily for unauthorized transactions
  • Report suspicious banking activity to your financial institution

SAF Advisory

This briefing is part of the SAF National AI Threat Center public protection initiative. StopAiFraud.com provides these threat briefings to help citizens, businesses, and government agencies stay informed about emerging AI-powered fraud schemes.