Chainalysis Reports Surge in Crypto Scams Driven by Impersonation and AI-Enabled Fraud
Chainalysis reported that cryptocurrency scams and fraud generated an estimated $17B in victim losses in 2025, the largest annual total in its tracking, with at least $14B observed on-chain and totals expected to rise as additional illicit addresses are identified. The report attributes the increase to the continued industrialization of scam operations and infrastructure, including phishing-as-a-service, AI-generated deepfakes, and professional money-laundering networks, alongside major scam categories such as pig-butchering/romance scams and HYIP-style schemes. Chainalysis also assessed that scam efficiency rose materially, citing a 253% YoY increase in the average scam payment (from $782 in 2024 to $2,764 in 2025) and noting that AI-enabled scams can be significantly more profitable than traditional approaches.
A key driver highlighted was the rapid growth of impersonation scams, which Chainalysis said rose roughly 1,400% YoY, with average payments to those clusters up more than 600%. One example cited was an E‑ZPass-themed smishing campaign that used fake toll-payment texts and lookalike sites to deceive victims; Chainalysis linked this activity to the Chinese-speaking group “Darcula” / “Smishing Triad,” and referenced reporting and legal action describing tooling and templates used to scale these lures. Separately, reporting on AI deepfake impersonation shows similar social-engineering dynamics outside of “crypto-only” contexts, including deepfakes impersonating religious figures to solicit donations and promote fraudulent crypto-related offers, reinforcing the report’s broader finding that AI-assisted impersonation is increasing the reach and credibility of scams.
AI-Enabled Fraud Scams Industrialized by Transnational Criminal Networks
**Transnational criminal networks** are increasingly industrializing online fraud with **AI-enabled social engineering**, according to reporting on scam compounds in Southeast Asia, an Interpol assessment, and policy commentary tied to a new US executive order. Fraud operations linked to *pig-butchering* and romance scams are using generative AI to improve language quality, deepfakes to impersonate trusted people, and low-cost "deepfake-as-a-service" offerings to scale deception. Interpol said AI-assisted fraud is **4.5 times more profitable** than non-AI schemes, while broader reporting describes these operations as structured, multinational enterprises that function like businesses and increasingly rely on automation, synthetic identities, and persuasive impersonation at scale. Reporting from Cambodia and the wider region shows scam operators are now recruiting "**AI face models**" to appear on high-volume deepfake video calls, including applicants from multiple countries seeking work in compounds associated with trafficking-linked fraud operations. The same ecosystem has been described as part of a broader organized-crime model involving forced labor, cryptocurrency investment scams, romance fraud, and impersonation schemes targeting victims globally.
Today
Surge in Deepfake-Driven Fraud and Synthetic Identity Threats
Artificial intelligence-powered scams, particularly those leveraging deepfakes and synthetic identities, escalated significantly in 2025. Experts warn that the quality and volume of deepfakes have reached a level where they are nearly indistinguishable from authentic media for most people, enabling fraudsters to deceive victims on a global scale. Voice cloning and visual deepfakes have been used to facilitate large-scale scams, while the emergence of synthetic entities has further blurred the line between real and fake identities, complicating fraud detection for financial institutions. The misuse of stablecoins and lax cryptocurrency oversight have created new avenues for cross-border fraud, with experts predicting these trends will intensify in 2026. Industry leaders emphasize the urgent need for improved data, reporting, and regulatory measures to counteract these evolving threats. The rapid proliferation of generative AI tools has enabled "pig butchering" scams and other fraud operations to target vast populations, underscoring the growing risk posed by synthetic media and AI-driven deception in the financial sector and beyond.
2 months ago
AI-Enabled Romance Scams Using Deepfakes and Fake Cryptocurrency Lures
A surge in **romance scams** is leveraging **AI-enabled impersonation** to make fraud harder to detect, combining manufactured intimacy with financial theft. Australian police warned more than 5,000 people they may have been targeted in a large-scale operation linked to overseas syndicates, where scammers used mainstream dating apps to initiate relationships and then steered victims into purchasing **fake cryptocurrency**. The playbook described includes rapidly escalating emotional commitment, isolating targets, and pushing conversations off-platform to apps like *WhatsApp* or *Telegram*, reducing victims’ access to in-app safety controls and reporting mechanisms. The fraud techniques increasingly rely on **deepfakes** and automated “AI personas,” undermining traditional verification methods such as requesting custom photos or relying on video calls as proof of identity. Reported tactics include real-time face-swapping and AI voice synthesis during video calls, long-running bot-driven conversations that build trust over months, and “celebrity” impersonation to intensify emotional leverage and extract larger payments. Despite the technology shift, the core mechanism remains psychological manipulation—using scripted narratives and social engineering to move victims from online rapport to off-platform communication and ultimately to financial transfers.
1 month ago