Binance Co-Founder CZ Warns Of AI-Powered Deepfake Hacks That Can Fool Video Verification

Attackers use AI deepfakes in video calls to trick victims into installing malware disguised as legitimate updates. Crypto professionals must verify unexpected requests via separate channels and never download software from unofficial links.


Meghna Chowdhury
Meghna is a Journalism graduate with a specialisation in Print Journalism. She is currently pursuing a Master's Degree in journalism and mass communication. With over 3.5 years of experience in the Web3 and cryptocurrency space, she works as a Senior Crypto Journalist for UnoCrypto. She is dedicated to delivering quality journalism and informative insights in her field. Apart from business and finance articles, horror is her favourite genre.

Changpeng Zhao, the former chief of Binance, has sounded the alarm on a new wave of deepfake hacking powered by artificial intelligence. He warned that even video call verification will soon be ineffective. 

Zhao urged users not to install software from unofficial links, even if they appear to come from friends. He pointed out that those “friends” have often had their accounts hijacked by attackers using sophisticated AI tricks.

Influencer Falls Victim

Earlier this week, Japanese crypto influencer Mai Fujimoto shared her own experience. On June 14, her main X (Twitter) account was compromised. Fujimoto said she fell for a clever deepfake while on a Zoom call.

The fake host’s face looked real, but the audio kept cutting out. When she clicked an “update” link to fix the sound, her computer was infected. Despite reporting the hack to the platform, her account remains active under the attacker’s control, and she is still trying to regain access.

Sophisticated Zoom Deception

Fujimoto explained that the attacker used her acquaintance’s stolen Telegram account to send the meeting link. Once she joined, the video feed showed the familiar face, and all seemed normal. 

She only noticed something was wrong when she could not hear the person speaking. The link she clicked to adjust audio settings secretly installed malware. In just ten minutes, her system was taken over without her realising the danger.

North Korean–Linked Deepfake Campaign

This kind of attack is not isolated. Security researchers report that BlueNoroff, a group tied to North Korean intelligence, carried out a similar deepfake scheme. 

In that case, an employee at a crypto foundation spent weeks in video meetings with AI‑generated versions of the company's own executives. After a microphone glitch, the victim was prompted to download a browser extension.

That extension installed a keylogger, a screen recorder and an information stealer designed to harvest crypto credentials. These campaigns show how threat actors are targeting remote workers in the digital asset sector.

Also Read: Crypto Investigators Uncover Phishing Attacks Cloning Zoom Software Aiming To Target Crypto Projects

Growing Risks for Remote Workers

Deepfake attacks combine social engineering with cutting‑edge AI. Attackers first gain trust by mimicking known faces. Then they push malware disguised as genuine software updates. 

Once inside a system, the malware can capture passwords, private keys and other sensitive data. Workers in the crypto space are prime targets because a single stolen key can mean millions of dollars in losses. Even tech‑savvy professionals can be duped when they see a familiar face on screen.

Protecting Against Deepfakes

To stay safe, experts say never to click links from unexpected messages, even if they seem to come from friends or colleagues. Always verify by other means, such as a quick phone call or a separate messaging channel. 

Use official websites and trusted app stores for software updates. Enable multi‑factor authentication on all accounts to add an extra layer of defence. Keeping operating systems and security tools up to date can also block known malware.
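One practical complement to downloading updates only from official sources is checking a file's integrity before installing it. The sketch below is illustrative and not drawn from the article: it shows how a SHA‑256 digest of a downloaded file can be compared against the checksum a vendor publishes on its official site (the filename and payload here are stand-ins).

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: write a stand-in for a downloaded installer, then verify it.
with open("update.bin", "wb") as f:
    f.write(b"example installer payload")

# In practice, the expected digest must come from the vendor's official
# website over HTTPS - never from a link or message received in chat.
expected = hashlib.sha256(b"example installer payload").hexdigest()
assert sha256_of("update.bin") == expected  # only install when this holds
```

A mismatch means the file was corrupted or tampered with and should not be run; this check would not have helped with the fake Zoom "update" link above only if the victim skipped verification entirely, which is exactly what such attacks count on.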

The rise of AI‑powered deepfake attacks marks a new chapter in cybercrime. As Changpeng Zhao warned, video calls alone can no longer confirm someone’s identity. Crypto holders and remote workers must remain vigilant and question any unexpected prompts, no matter how convincing.

Also Read: Binance Co-Founder Changpeng Zhao Shares Critical Security Guidelines Aimed at Safeguarding Crypto Users
