
As artificial intelligence technologies advance, the crypto industry faces new threats. AI gives scammers vast opportunities for deception and manipulation, as a recent report from Elliptic confirms. In this article, we explore the main typologies of criminal AI use, real fraud cases, and recommendations for countering them.

Main AI-based techniques used by scammers

Experts identify five main methods of AI use by crypto scammers:

  • Generation of Deepfakes and Scam Advertising Materials: AI is used to create realistic deepfakes of famous personalities, such as Elon Musk and Brad Garlinghouse, to promote fraudulent cryptocurrency airdrops.
  • Market Manipulation: Includes launching AI tokens, investment platforms, Ponzi schemes, and romance scams that lure victims into crypto investment schemes through prolonged online communication.
  • Using Large Language Models for Hacking: AI helps detect code vulnerabilities and develop exploits, enabling scammers to access crypto platforms and wallets.
  • Scaling Crypto Scams and Disinformation: AI tools are used to create fake websites, marketing materials, and fake publications in well-known media.
  • Creating Fake Documents on Darknet Markets: Services like OnlyFake Document Generator 3.0 offer fake documents for passing KYC procedures on crypto platforms.
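As a purely illustrative sketch of the defensive side, the fake-website typology above is often countered with simple lookalike-domain heuristics. The brand and domain lists below are hypothetical examples, not taken from the Elliptic report:

```python
# Hypothetical rule-based filter for lookalike domains of the kind used in
# fake airdrop and scam advertising pages. Brand/domain lists are examples.
KNOWN_BRANDS = {"binance", "okx", "coinbase", "ripple"}
OFFICIAL_DOMAINS = {"binance.com", "okx.com", "coinbase.com", "ripple.com"}

def looks_like_phishing(domain: str) -> bool:
    """Flag a domain that mentions a known brand but is not the official site."""
    domain = domain.lower().strip()
    if domain in OFFICIAL_DOMAINS:
        return False  # exact match with an official domain
    # A brand name embedded in an unfamiliar domain is a classic scam signal
    return any(brand in domain for brand in KNOWN_BRANDS)

print(looks_like_phishing("binance-airdrop-claim.net"))  # True (suspicious)
print(looks_like_phishing("binance.com"))                # False (official)
```

Real anti-phishing systems combine many more signals (domain age, TLS certificate data, visual similarity), but substring checks like this catch a surprising share of mass-produced scam sites.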

Examples of real fraud cases

Examples of real cases confirm experts’ concerns:

  1. Account Hack on OKX: An OKX user lost over $2 million due to a data leak and the use of a deepfake to change security settings. Attackers compromised his Telegram account, gained access to his email, and used a deepfake to alter the account settings on the exchange.
  2. Romance Scams: In one scheme, scammers used AI to create attractive personas to lure victims into crypto investment schemes. Victims were drawn into long-term online communication, after which they were persuaded to invest in fake projects.
  3. Fake KYC Documents: Fake documents are sold on darknet markets, allowing users to pass KYC procedures on crypto platforms. For example, the OnlyFake Document Generator 3.0 service offers packages of fake IDs.
  4. Case with Binance: A Binance client lost $1 million due to a data leak and the subsequent use of deepfakes. Hackers created a deepfake of the client to pass KYC and gain access to his funds.

Experts recommend that developers consider the potential criminal applications of AI and build security into their solutions, and that law enforcement agencies use AI and blockchain analytics to detect fraud. Users should stay alert to warning signs.
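To make the detection recommendation concrete, here is a minimal sketch of rule-based transaction risk scoring. The signal names and thresholds are hypothetical illustrations, not the methodology of Elliptic or any exchange:

```python
# Hypothetical on-chain risk signals a compliance team might score.
# All field names and thresholds are illustrative assumptions.
def risk_score(tx: dict) -> int:
    score = 0
    if tx.get("counterparty_is_new"):      # fresh wallet with no history
        score += 2
    if tx.get("amount_usd", 0) > 100_000:  # unusually large transfer
        score += 2
    if tx.get("mixer_exposure"):           # funds touched a mixing service
        score += 3
    return score

tx = {"counterparty_is_new": True, "amount_usd": 250_000, "mixer_exposure": False}
print(risk_score(tx))  # 4 -> above a review threshold of, say, 3
```

Production systems typically replace hand-tuned rules like these with trained models, but a transparent score makes it easy to explain why a transaction was flagged.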

Users should safeguard the security of their funds and learn from the negative experiences of others.

20.07.2024, 00:22