SAF NATIONAL AI THREAT CENTER

AI-Enhanced Biometric Spoofing Threatens Mobile Banking Security

ELEVATED
Biometric Fraud
Global
Published: November 27, 2025
Source: NIST Cybersecurity

Threat Overview

Advanced AI techniques are being used to spoof fingerprint and facial recognition systems, compromising mobile banking and authentication security.

Cybersecurity firms have documented a rise in AI-enhanced biometric spoofing attacks against mobile banking applications and authentication systems. Criminals are using artificial intelligence to create sophisticated spoofs of fingerprints, facial features, and voice patterns that can bypass security controls.

These attacks begin with gathering biometric data from high-resolution photographs (often available on social media), public videos, or even latent fingerprints left on surfaces. AI algorithms then generate synthetic biometric data or physical spoofs, such as fake fingerprints and 3D-printed face masks, that can fool many consumer-grade biometric scanners.

In documented cases, attackers have used AI-enhanced spoofing to gain unauthorized access to mobile banking apps, cryptocurrency wallets, and secure corporate systems. The attacks are particularly concerning because many users consider biometric authentication more secure than passwords and may not have additional safeguards in place. One financial institution reported that attackers combined AI-generated facial spoofs with stolen phones to access accounts, transferring funds before victims realized their devices were compromised.

The sophistication of these spoofs can defeat many liveness detection systems designed to distinguish real biometrics from fakes. Security experts recommend multi-factor authentication that combines biometrics with other verification methods, such as PIN codes, security tokens, or behavioral analytics. Financial institutions are working to deploy more advanced anti-spoofing technologies, but the attack methods are evolving rapidly.
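The multi-factor recommendation above can be sketched as a simple decision rule: access is granted only when both an independent biometric check and a second factor (here, a standard RFC 6238 time-based one-time password) succeed, so a spoofed biometric alone is not enough. This is a minimal illustration, not a production design; the `authenticate` function, the 0.95 match threshold, and the idea of a numeric biometric score are assumptions for the example.

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, period: int = 30, digits: int = 6, now=None) -> str:
    # RFC 6238 time-based one-time password: HOTP over a time counter.
    counter = int((time.time() if now is None else now) // period)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def authenticate(biometric_score: float, submitted_code: str, secret: bytes,
                 threshold: float = 0.95) -> bool:
    # Grant access only when BOTH factors pass: a spoofed biometric alone
    # (or a stolen one-time code alone) is not sufficient.
    biometric_ok = biometric_score >= threshold
    code_ok = hmac.compare_digest(submitted_code, totp(secret))
    return biometric_ok and code_ok
```

The key design point is the `and`: defeating the biometric scanner with an AI-generated spoof still leaves the attacker without the second factor.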

Threat Indicators

  • Cybersecurity firms have documented a concerning rise in AI-enhanced biometric spoofing attacks targeting mobile banking applications and authentication systems.
  • Criminals are using artificial intelligence to create sophisticated spoofs of fingerprints, facial features, and voice patterns that can bypass security systems.
  • These attacks involve gathering biometric data from high-resolution photographs (available on social media), public videos, or even latent fingerprints left on surfaces.
  • AI algorithms then generate synthetic biometric data or create physical spoofs (fake fingerprints, 3D-printed face masks) that can fool many consumer-grade biometric scanners.
  • In documented cases, attackers have used AI-enhanced spoofing to gain unauthorized access to mobile banking apps, cryptocurrency wallets, and secure corporate systems.
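The spoofing pipeline above succeeds partly because static biometric checks can be replayed. One common countermeasure is a randomized challenge-response liveness test: the system asks for an unpredictable action that a pre-recorded video or pre-generated spoof cannot anticipate. The sketch below is purely illustrative; the challenge set, function names, and five-second window are assumptions, and real systems verify the action via camera analysis rather than a string comparison.

```python
import secrets
import time

# Illustrative challenge set; real systems verify the action from video.
CHALLENGES = ("blink twice", "turn head left", "smile", "read these digits aloud")

def issue_challenge():
    # Pick an unpredictable action so a pre-recorded or AI-generated
    # spoof cannot know in advance what will be requested.
    return secrets.choice(CHALLENGES), time.monotonic()

def verify_response(observed_action: str, expected_action: str,
                    issued_at: float, max_delay: float = 5.0) -> bool:
    # Accept only the requested action, performed within a short window,
    # to rule out replayed or slowly assembled spoof footage.
    on_time = (time.monotonic() - issued_at) <= max_delay
    return observed_action == expected_action and on_time
```

As the briefing notes, sophisticated AI spoofs can defeat some liveness checks, so this should complement, not replace, additional authentication factors.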

Recommended Actions

  • Verify all urgent requests through independent, trusted communication channels
  • Never share sensitive personal or financial information without verification
  • Enable multi-factor authentication on all important accounts, pairing biometrics with a PIN, security token, or behavioral analytics
  • Document suspicious communications and report them to the relevant government agencies

SAF Advisory

This briefing is part of the SAF National AI Threat Center public protection initiative. StopAiFraud.com provides these threat briefings to help citizens, businesses, and government agencies stay informed about emerging AI-powered fraud schemes.