- Scammers are using deepfakes to impersonate executives and trick employees into transferring company funds. A finance worker at a multinational firm was recently deceived into wiring about $26 million during an elaborate video call scam that used AI to mimic executives.
- According to Hong Kong police, the scammers likely used existing videos of the executives to generate deepfakes with synthesized voices, allowing them to impersonate multiple people simultaneously. Several arrests have been made, but it’s unclear whether they relate to this specific crime.
- To avoid falling victim, verify suspicious meeting invites through official channels and ask detailed questions during meetings to validate identities. Overall, exercise caution even in group video calls, as AI enables realistic impersonation of multiple people.
Scammers are increasingly using deepfakes to impersonate executives and dupe employees into transferring funds, underscoring the need for greater vigilance against AI-enabled deception tactics.
The Scam Unfolds via Video Conference
A finance worker at a major multinational corporation recently fell victim to an elaborate scam using deepfake video and audio of executives. The scammers invited the employee to a video call supposedly to discuss a confidential transaction. Convinced by the realistic AI-generated likenesses of colleagues, the worker made 15 transactions totaling around $26 million to accounts controlled by the criminals.
Hong Kong Police Investigate Sophisticated Operation
According to Hong Kong cyber security officials, the scammers likely downloaded videos of the executives in advance to create deepfakes with artificial voices. The technology allowed them to impersonate numerous individuals simultaneously during the call. Police have made arrests related to similar scams, but it’s unclear if any pertain to this specific crime.
Recommendations to Avoid Falling Victim
To avoid being duped, police advise verifying suspicious meeting invites through official channels rather than responding to the original message. They also recommend asking detailed questions during meetings to validate identities. Above all, exercise caution even in group video calls, as AI enables fraudsters to convincingly impersonate multiple people at once.
Deepfakes Pose Growing Threats
Experts warn that 2024 may be the year of the deepfake. The technology is already being used for identity theft and could wreak havoc during the upcoming US election. As deepfakes grow more advanced and widespread, individuals and organizations must stay alert to their misuse in scams and disinformation campaigns. Proactive verification measures and healthy skepticism will be key to avoiding manipulation and fraud through synthetic media.