SAF NATIONAL AI THREAT CENTER

Deepfake Video Calls Trick Executives into Authorizing Wire Transfers

Severity: CRITICAL
Category: Deepfake Fraud
Region: Global
Published: November 27, 2025
Source: FBI Cyber Division

Threat Overview

Employees are being tricked by deepfake video calls that convincingly impersonate CEOs and senior management, resulting in fraudulent wire transfers.

A new wave of sophisticated fraud targeting businesses involves the use of deepfake video technology to impersonate company executives during video conference calls. Criminals are using publicly available footage of CEOs and senior executives to create realistic video deepfakes that can fool employees into authorizing fraudulent wire transfers.

In several documented cases, finance department employees received what appeared to be legitimate video calls from their CEO or CFO requesting urgent wire transfers. The video quality, voice, mannerisms, and even background details were convincing enough to bypass standard verification protocols. One multinational company reported a loss of $25 million after a finance manager approved a wire transfer following a video call with what appeared to be the company CEO. The deepfake was so convincing that the employee had no reason to doubt the authenticity of the request.

Security experts warn that the technology for creating these deepfakes is becoming more accessible and sophisticated. Companies are urged to implement multi-factor authentication for all financial transactions and to establish out-of-band verification procedures for any request involving money transfers, regardless of how authentic the communication appears.

Threat Indicators

  • Unsolicited or unusually urgent video calls from a CEO or CFO requesting wire transfers
  • Pressure to expedite a payment or bypass standard verification procedures
  • Requests that arrive through a single channel with no independent confirmation
  • Subtle visual artifacts such as inconsistent lighting, unnatural movements, or lip-sync mismatches
  • Video, voice, and background details that appear authentic but cannot be verified out of band

Recommended Actions

  • Verify video authenticity through multiple independent channels
  • Look for visual artifacts, inconsistent lighting, or unnatural movements
  • Never approve financial transactions based solely on video communication
  • Implement multi-factor authentication for all sensitive operations
  • Freeze accounts and contact financial institutions immediately if compromised
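The out-of-band verification step above can be enforced in software as well as in policy. The following is a minimal, hypothetical sketch of such a gate: a transfer request is never executable on the strength of its originating channel alone, no matter how authentic it appears, and must collect confirmation from at least one independent channel (for example, a callback to a phone number on file). All names, fields, and thresholds are illustrative assumptions, not a real API.

```python
# Hypothetical sketch: out-of-band verification gate for wire transfer
# requests. Illustrative only -- names and thresholds are assumptions.
from dataclasses import dataclass, field


@dataclass
class TransferRequest:
    requester: str                     # claimed identity, e.g. "CEO"
    amount: float                      # requested transfer amount
    origin_channel: str                # how the request arrived, e.g. "video_call"
    confirmations: set = field(default_factory=set)


def confirm_out_of_band(request: TransferRequest, channel: str) -> None:
    """Record a confirmation obtained on a separate, trusted channel.

    Confirmations made on the same channel the request arrived on are
    ignored: a deepfake caller confirming its own request proves nothing.
    """
    if channel != request.origin_channel:
        request.confirmations.add(channel)


def may_execute(request: TransferRequest, min_confirmations: int = 1) -> bool:
    """Approve only when enough independent channels have confirmed."""
    return len(request.confirmations) >= min_confirmations


# Usage: a convincing video call alone never clears the gate.
req = TransferRequest(requester="CEO", amount=250_000.0, origin_channel="video_call")
assert not may_execute(req)                 # video call alone is not enough
confirm_out_of_band(req, "video_call")      # same channel does not count
assert not may_execute(req)
confirm_out_of_band(req, "phone_callback_on_file")
assert may_execute(req)                     # independent channel confirmed
```

In practice the confirmation channels would be contact details already on file (never details supplied during the suspect call itself), and the minimum confirmation count could scale with the transfer amount.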

SAF Advisory

This briefing is part of the SAF National AI Threat Center public protection initiative. StopAiFraud.com provides these threat briefings to help citizens, businesses, and government agencies stay informed about emerging AI-powered fraud schemes.