Impersonation: how scammers diverted $17 billion in cryptocurrencies in 2025
Criminals used sophisticated identity theft techniques and artificial intelligence to industrialize crypto fraud on an unprecedented scale. According to the latest Chainalysis report, published on January 13, 2026, approximately $17 billion in cryptocurrency was lost to scams and fraud in 2025, marking a significant shift in the nature of digital crime.
The Rise of Impersonation: 1,400% Growth in Crypto Crimes
Chainalysis data reveal a dramatic change in criminal tactics: identity theft grew 1,400% year over year, gradually displacing traditional cyberattacks as the industry's primary vector for fund loss.
Most concerning, scammers have abandoned the “scatter and wait” approach in favor of fewer but significantly more lucrative targets. The average payment extracted through impersonation has risen considerably, reflecting an increasingly professionalized fraud industry focused on victims with greater financial capacity.
Artificial Intelligence: The Tool That Multiplies Fraud Profitability by 4.5
Artificial intelligence has become the ultimate accelerant for these criminal operations. AI-driven scams are 4.5 times more profitable than conventional scams, mainly because deepfakes and automated tools allow for mass-scale creation of convincing deception components: fake support agents, fraudulent government notifications, and “trusted informants” that appear completely legitimate.
This economic advantage explains why criminals have invested in automating and refining their schemes. These are no longer poorly written generic messages but highly personalized, contextually relevant operations that can deceive even diligent and cautious users.
From Vulnerable Code to False Trust: Why Social Engineering Wins
For years, the crypto industry emphasized technical security: smart contract audits, multi-signature wallets, validation protocols. However, Chainalysis has identified a fundamental shift: while hacks remain a threat ($2.2 billion was stolen this way in 2024), the vulnerability criminals now exploit is inherently human.
Social engineering surpasses any technical exploit because it targets trust, a factor no security patch can fully correct. Sophisticated schemes exploit the “fear” and “panic” experienced by users, especially in volatile markets, to manipulate decisions under pressure.
Testimonials: How Scammers Operate in Practice
The case of a man in the UK illustrates the severity of these attacks: he lost nearly $2.5 million in a Bitcoin scam in 2025, which North Wales Police described as “a new worrying trend” in digital crime. To put that figure in context: between 2020 and late 2023, nearly 100,000 Britons fell victim to investment scams totaling £2.6 billion (approximately $3.5 billion), or about £13 million per week. Although these statistics include only reported cases, the true figures are likely much higher.
Lior Aizik, co-founder and COO of the exchange platform XBO, has watched these tactics evolve firsthand: “Across the crypto industry, impersonation is increasing and becoming more sophisticated.” Aizik has himself been impersonated multiple times, with criminals using his name and fake profile images to contact people in the sector, requesting money while pretending to represent his company.
What is crucial in these attacks is not technological sophistication but the exploitation of urgency and trust. “These attacks rely on urgency and trust, not technology,” explains Aizik, emphasizing that criminals build crisis narratives (urgent transactions, wallet issues, surprise audits) to bypass users’ natural skepticism.
Protection: Understanding the Threat Is the First Step
Chainalysis’s findings represent a conceptual turning point for the industry. Cryptocurrency crime is no longer defined solely by code breaches or smart contract exploits but increasingly by scams that appear sufficiently real to overcome even the most diligent users’ psychological defenses.
Experts like Aizik offer practical recommendations: never share sensitive data, even if you believe you are talking to legitimate support staff; never transfer cryptocurrency to third parties on the basis of unsolicited requests; and treat any message that feels urgent or secretive with extreme suspicion. If a communication applies pressure or asks for discretion, it is usually a warning sign of fraud.
Impersonation represents the new front in crypto crime because it reveals an uncomfortable truth: wallets and exchanges can implement every recommended security measure, yet no technical barrier fully protects against the manipulation of human trust. The industry continues to adapt to this reality.