
AI is speeding up crypto website fraud and influencer scams

AI’s early integration into crypto presents opportunities for efficiency, but also new openings for fraud. Per an Elliptic report, AI is helping scammers speed up and scale their cyclical crypto scams.

We’ve all heard of AI by now, and of its many use cases. Some are legitimate and helpful for crypto ventures, while others let bad actors and scammers prey on victims. AI can help legitimate crypto companies generate text, images, websites, videos, and other content, but it also opens new avenues for illicit activity.

According to a recent Elliptic report, AI can speed up the deployment of crypto scams and keep them running longer, posing a substantial threat to the credibility of both artificial intelligence and the crypto industry.

Scams involving social media influencers and crypto websites

Crypto influencers who leverage their following and personality to promote specific crypto products can wield that influence to manipulate prices in several ways. A common tactic is the “pump and dump,” in which an influencer promotes a particular cryptocurrency to inflate its price artificially. Followers are urged to invest, driving the price up; the influencer then sells, causing the price to plummet and profiting at the followers’ expense.

Influencers can also use their platforms to spread exaggerated or false claims, triggering fear and panic selling in the market, and then buy the affected cryptocurrencies at lower prices.

Like influencers, crypto scammers operate cyclically. They create scam investment, airdrop, or giveaway sites that are widely promoted on social media and messaging apps. Once these sites gain enough traction, the scammers execute a “rug pull,” disappearing with victims’ funds and leaving them empty-handed. Because the process is cyclical, after the rug pull they often start again with a new token, a different site, and new marketing strategies.

Creating all these scam assets, from social media accounts to websites and user interfaces, is time-consuming, resource-intensive, and costly. AI tools can now significantly streamline this process.

For instance, per the Elliptic report, AI can generate fake employee images and other marketing materials, making scams appear more legitimate and professional. This not only saves time but also enhances the sophistication of the scams, making them harder to detect.

To make as much money as possible, scammers need to set up their operations and reach a large number of people. Traditionally, they have used social media bots to spread fake marketing and messages at scale. With AI, they can automate and streamline post generation and set up the infrastructure needed for efficient distribution, taking this approach to the next level.

Scam examples

Per one Reddit user, a persona named Jessica, presumably an AI-powered bot, claimed to offer huge returns through AI-powered crypto trading after the user clicked on an investment ad. After a series of requests for increasing amounts of Ethereum (ETH) under the guise of fees and investments, the user sent Jessica over $1,300. The bot promised returns of over $6,000 but never paid out, and Jessica disappeared after receiving a final payment.

Another example involves NovaDrainer, a crypto affiliate platform supposedly based in Canada and the UK that offers scams as a service, creating crypto investment sites for affiliates and splitting the proceeds. It claims to use AI to process tokens and generate SEO-optimized website designs. Although it presents itself as promoting legitimate projects, it is openly marketed for phishing and draining victims’ crypto.

Over the past year, the platform has received 2,400 variants of crypto tokens from more than 10,000 wallets, likely belonging to scam victims. Elliptic’s analysis reveals that NovaDrainer employs a complex cross-chain obfuscation strategy involving decentralized exchanges, cross-chain bridges, and coin swap services to manage the stolen funds. The transactions themselves, however, are not automated through AI.

Prevention measures and regulations

Addressing these challenges requires a proactive approach. The crypto industry must invest in advanced security measures and collaborate with AI developers to create technologies capable of detecting and countering scams. Additionally, raising awareness among crypto users about the potential risks and educating them on identifying and avoiding scams is crucial.

The Elliptic report recommends the DECODE (detect, educate, cooperate, defend, enforce) framework for mitigating emerging crime trends.

Detect

  1. Use blockchain analytics to identify payments to AI-related illicit services (see the sketch after this framework)
  2. Use AI-enhanced blockchain analytics to detect crime

Educate

  1. Raise awareness among users of crypto and AI on both existing and recent red-flag indicators of scams
  2. Educate users and employees on methods to identify deepfakes

Cooperate

  1. Share data to expand the capabilities of relevant stakeholders to mitigate AI-enhanced crypto crime
  2. Share best practices across stakeholders

Defend

  1. Ensure that new AI and crypto technologies are crime-proofed during development
  2. Equip compliance teams

Enforce

  1. Prioritize interventions against illicit services experimenting with AI
  2. Ensure that new and fast-paced innovations in AI are integrated with capacity building and training
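
To make the first “Detect” recommendation concrete, here is a minimal Python sketch of how a compliance team might screen transfers against a watchlist of addresses attributed to illicit services. Everything in it, the addresses, the Transfer structure, the screen_transfers helper and its threshold, is a hypothetical illustration rather than anything taken from the Elliptic report; real analytics pipelines rely on provider-maintained address labels and multi-hop tracing rather than a simple direct-counterparty check.

    # Hypothetical sketch: flag transfers that pay into watchlisted addresses.
    # All addresses and sample data below are made up for illustration.
    from dataclasses import dataclass

    @dataclass
    class Transfer:
        tx_hash: str
        sender: str
        recipient: str
        amount_eth: float

    # Placeholder watchlist; in practice these labels would come from a
    # blockchain-analytics provider, not a hard-coded set.
    ILLICIT_SERVICE_ADDRESSES = {
        "0xdrainer00000000000000000000000000000001",
        "0xdrainer00000000000000000000000000000002",
    }

    def screen_transfers(transfers, watchlist, min_amount_eth=0.01):
        """Return transfers sent to watchlisted addresses above a minimum size.

        Only direct counterparties are checked; a real pipeline would also
        trace funds through bridges, DEXs, and coin swap services.
        """
        return [
            t for t in transfers
            if t.recipient in watchlist and t.amount_eth >= min_amount_eth
        ]

    if __name__ == "__main__":
        sample = [
            Transfer("0xaaa1", "0xvictim_wallet", "0xdrainer00000000000000000000000000000001", 0.8),
            Transfer("0xbbb2", "0xordinary_user", "0xexchange_deposit", 1.5),
        ]
        for alert in screen_transfers(sample, ILLICIT_SERVICE_ADDRESSES):
            print(f"ALERT: {alert.tx_hash} sent {alert.amount_eth} ETH to a watchlisted address")

The second “Detect” item would layer machine-assisted scoring on top of this kind of rule, for example by flagging clusters of wallets whose transaction patterns resemble known drainer campaigns.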

Source
