Most dangerous crypto scams: deepfakes, social engineering, and modern Ponzi schemes


Deepfake impersonation, social engineering scams, and modern Ponzi schemes are ranked as the most dangerous types of crypto scam in a new anti-scam research report.

The report, co-authored by the crypto exchange Bitget, security specialist SlowMist, and blockchain analysis company Elliptic, noted that scams now exploit trust and psychology as much as technology.

For example, deepfakes use synthetic videos to promote fake investments; social engineers rely on trojan offers, phishing bots, and fake staking (offers to earn yield on crypto assets); and modern Ponzi schemes are dressed up in decentralized finance (DeFi), non-fungible token (NFT), and blockchain-gaming branding.


According to the report, almost 40% of high-value frauds in 2024 involved deepfake technology. Typical scenarios for AI-generated deepfakes include celebrity videos promoting fraudulent investments, bypassing KYC (know your customer) verification, virtual-identity investment scams, and Zoom phishing.


For example, a deepfaked "Elon Musk," Tesla's CEO, frequently appears in fake investment giveaway schemes. Meanwhile, to bypass KYC verification, scammers use AI to forge facial videos, combining deepfake techniques with victims' photos to create dynamic images that can even respond to voice commands, the report said.

Scammers also impersonate Zoom, sending fake meeting invitations with malicious links that trick people into downloading trojan-infected "meeting software." They likewise use deepfake videos to impersonate executives or technical experts, building trust before coaxing victims into sending funds.

"Deepfake technology is becoming a critical component in the AI-driven scam ecosystem. In the AI era, the credibility of visual and auditory content has significantly declined," the report said, urging users to verify any "authoritative information" related to asset operations through multiple channels.

At the same time, crypto teams need to establish a single trusted channel for information distribution, or use on-chain signature broadcasts for identity verification, to help fight AI-enabled crime.
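As a rough sketch of what on-chain signature broadcasting could look like in practice, the example below uses Python with the eth_account library (the report does not name a specific tool); the project key, announcement text, and publishing channel are hypothetical. The team signs an announcement with its publicly known address, and anyone can recover the signer from the signature and compare it against that address.

```python
# Minimal sketch, assuming an Ethereum-style keypair and the eth_account library.
from eth_account import Account
from eth_account.messages import encode_defunct

# Hypothetical project key -- in practice this lives in the team's signer,
# and only the corresponding address is published.
project_account = Account.create()

announcement = "No contract migration is scheduled. Ignore any claims to the contrary."
message = encode_defunct(text=announcement)

# The team signs the announcement once and posts text + signature on its trusted channel.
signed = Account.sign_message(message, private_key=project_account.key)

# Any user can recover the signer's address and compare it to the published one.
recovered = Account.recover_message(message, signature=signed.signature)
assert recovered == project_account.address
print("Announcement verified as signed by", recovered)
```

Because the check compares a recovered address against one the project has already published, a deepfake video or cloned channel cannot forge such an announcement without the project's private key.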


Moreover, the AI hype itself has become a lure, with scammers convincing victims that they are being offered ChatGPT-generated trading solutions.


In one video, for example, a scammer claims that an arbitrage bot's code was generated with ChatGPT and can run profitable crypto strategies. Once the victim deploys the code, which contains backdoor logic, they are told to inject startup funds into the smart contract address, with the implication that "the more you invest, the higher the returns." As soon as the funds are deposited, they are transferred to the scammers.
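To make the backdoor mechanic concrete, here is a simplified, hypothetical Python model of the flow the report describes; real scams of this kind ship as smart contracts, and the names and amounts below are illustrative only.

```python
# Hypothetical model of a backdoored "arbitrage bot" contract.
# This Python class only illustrates the control flow described in the report.

class FakeArbitrageBot:
    def __init__(self, scammer_address: str):
        # The backdoor: a hard-coded address buried in the "ChatGPT-generated" code.
        self._hidden_owner = scammer_address
        self.balances: dict[str, float] = {}

    def deposit(self, user: str, amount: float) -> None:
        # Looks like startup capital for a trading strategy...
        self.balances[user] = self.balances.get(user, 0.0) + amount

    def run_strategy(self) -> None:
        # ...but no arbitrage ever happens; this call does nothing for the victim.
        pass

    def drain(self) -> float:
        # Backdoor: every deposit is swept to the hidden owner.
        total = sum(self.balances.values())
        self.balances.clear()
        return total  # forwarded to self._hidden_owner in the real contract


bot = FakeArbitrageBot(scammer_address="0xScammer")
bot.deposit("victim", 10.0)
print(f"Funds swept to {bot._hidden_owner}: {bot.drain()} ETH")
```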

"Five years ago, avoiding scams meant 'don’t click suspicious links.' Today, it’s 'don’t trust your own eyes,'" the report concluded, stressing the need for cooperation and "collective defense" in the crypto industry.