Mallory

AI-Enabled Romance Scams Using Deepfakes and Fake Cryptocurrency Lures

romance scams, cryptocurrency fraud, fake cryptocurrency, deepfakes, face-swapping, financial theft, AI impersonation, dating apps, social engineering, bot conversations, psychological manipulation, celebrity impersonation, overseas syndicates, voice synthesis, video calls
Updated February 11, 2026 at 06:03 PM · 2 sources


A surge in romance scams is leveraging AI-enabled impersonation to make fraud harder to detect, combining manufactured intimacy with financial theft. Australian police warned more than 5,000 people that they may have been targeted in a large-scale operation linked to overseas syndicates, in which scammers used mainstream dating apps to initiate relationships and then steered victims into purchasing fake cryptocurrency. The described playbook includes rapidly escalating emotional commitment, isolating targets, and pushing conversations off-platform to apps like WhatsApp or Telegram, cutting victims off from in-app safety controls and reporting mechanisms.

The fraud techniques increasingly rely on deepfakes and automated “AI personas,” undermining traditional verification methods such as requesting custom photos or treating video calls as proof of identity. Reported tactics include real-time face-swapping and AI voice synthesis during video calls, long-running bot-driven conversations that build trust over months, and “celebrity” impersonation to intensify emotional leverage and extract larger payments. Despite the technology shift, the core mechanism remains psychological manipulation: scripted narratives and social engineering that move victims from online rapport to off-platform communication and ultimately to financial transfers.

Related Stories

Emergence of AI-Driven Romance Scams and Crypto Phishing Threats


New research has revealed that romance scams are increasingly being automated through the use of large language models (LLMs), allowing cybercriminals to scale their operations and make scam interactions more convincing. These scams typically follow a three-stage process: initial contact, relationship building, and financial extraction, with LLMs now handling much of the repetitive conversation and persona management. Insiders from scam operations report daily use of AI tools to draft and translate messages, making it easier to maintain multiple simultaneous conversations and deceive victims into fraudulent cryptocurrency investments. In parallel, the threat landscape for cryptocurrency users has intensified, with phishing attacks targeting digital wallets and decentralized applications (dApps) on the rise. According to a 2025 Kaspersky report, crypto-related phishing detections surged by over 80% compared to 2023, with social engineering scams accounting for the largest share of incidents. Attackers employ tactics such as fake wallet sites, approval phishing, and payload-based transaction phishing, resulting in hundreds of millions of dollars in losses. These developments underscore the growing sophistication and automation of social engineering attacks in the cryptocurrency ecosystem, driven by advances in AI and the expanding use of digital assets.

2 months ago
AI-Enabled Fraud Scams Industrialized by Transnational Criminal Networks


**Transnational criminal networks** are increasingly industrializing online fraud with **AI-enabled social engineering**, according to reporting on scam compounds in Southeast Asia, an Interpol assessment, and policy commentary tied to a new US executive order. Fraud operations linked to *pig-butchering* and romance scams are using generative AI to improve language quality, deepfakes to impersonate trusted people, and low-cost "deepfake-as-a-service" offerings to scale deception. Interpol said AI-assisted fraud is **4.5 times more profitable** than non-AI schemes, while broader reporting describes these operations as structured, multinational enterprises that function like businesses and increasingly rely on automation, synthetic identities, and persuasive impersonation at scale. Reporting from Cambodia and the wider region shows scam operators are now recruiting "**AI face models**" to appear on high-volume deepfake video calls, including applicants from multiple countries seeking work in compounds associated with trafficking-linked fraud operations. The same ecosystem has been described as part of a broader organized-crime model involving forced labor, cryptocurrency investment scams, romance fraud, and impersonation schemes targeting victims globally. One reference on calculating AI ROI in enterprise cybersecurity is **not about this fraud campaign ecosystem**, and an EU sanctions announcement concerns separate state-linked cyber incidents rather than financially motivated AI-enabled fraud.

Today
Chainalysis Reports Surge in Crypto Scams Driven by Impersonation and AI-Enabled Fraud


Chainalysis reported that **cryptocurrency scams and fraud generated an estimated $17B in victim losses in 2025**, making it the largest year on record in its tracking, with at least **$14B observed on-chain** and expectations that totals will rise as additional illicit addresses are identified. The report attributes the increase to the continued industrialization of scam operations and infrastructure, including *phishing-as-a-service*, AI-generated deepfakes, and professional money-laundering networks, alongside major scam categories such as **pig butchering/romance scams** and HYIP-style schemes. Chainalysis also assessed that scam efficiency increased materially, citing a **253% YoY rise in average scam payment** (from **$782 in 2024** to **$2,764 in 2025**) and noting that **AI-enabled scams** can be significantly more profitable than traditional approaches. A key driver highlighted was the rapid growth of **impersonation scams**, which Chainalysis said rose roughly **1,400% YoY**, with average payments to those clusters up more than **600%**. One example cited was an **E‑ZPass-themed smishing campaign** that used fake toll-payment texts and lookalike sites to deceive victims; Chainalysis linked this activity to the Chinese-speaking group **“Darcula” / “Smishing Triad,”** and referenced reporting and legal action describing tooling and templates used to scale these lures. Separately, reporting on **AI deepfake impersonation** shows similar social-engineering dynamics outside of “crypto-only” contexts, including deepfakes impersonating religious figures to solicit donations and promote fraudulent crypto-related offers, reinforcing the report’s broader finding that **AI-assisted impersonation** is increasing the reach and credibility of scams.
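The payment-size jump quoted above can be checked directly from the two averages in the report summary. A minimal sketch (the helper function name is illustrative, not from Chainalysis):

```python
# Sanity-check the Chainalysis year-over-year figures quoted above.
# The dollar amounts ($782 in 2024, $2,764 in 2025) come from the
# report summary; everything else here is illustrative.

def yoy_increase_pct(old: float, new: float) -> float:
    """Percentage increase from old to new."""
    return (new - old) / old * 100

avg_2024 = 782.0   # average scam payment, 2024
avg_2025 = 2764.0  # average scam payment, 2025

# (2764 - 782) / 782 * 100 ≈ 253.5, matching the reported ~253% YoY rise
print(round(yoy_increase_pct(avg_2024, avg_2025)))
```

The same formula reproduces the report's other multipliers (e.g. a 1,400% rise corresponds to roughly a 15x increase in volume).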

2 months ago
