Crypto scams are increasing – The continuous advancement of artificial intelligence (AI) technology has sparked growing concern that scammers will exploit it for fraudulent purposes.
Social media platforms have become a prominent avenue for scammers to deploy AI. With AI-powered tools, scammers can extend their reach and fabricate a seemingly devoted following of numerous fake accounts.
These fabricated accounts and interactions create an illusion of credibility and popularity around their fraudulent endeavors.
In some instances, scammers even use AI-driven chatbots or virtual assistants to engage with victims, offering investment advice, promoting fake tokens and initial coin offerings, or pitching high-yield investment opportunities.
AI also challenges the notion of “social proof-of-work,” the assumption that crypto projects with larger and more devoted online followings must be legitimate.
Given the increased ease with which AI enables projects to deceive people, it is crucial for users to exercise caution and conduct thorough research before investing in any project.
One example of how scammers exploit AI is the “pig butchering” scam: AI-powered chatbots can spend considerable time building a relationship with a target, often a vulnerable or elderly person, before eventually defrauding them.
Crypto Scams – AI Enables Automated and Expanded Illicit Activities
The progress made in AI technologies has empowered scammers to automate and expand their illicit activities, potentially preying on vulnerable individuals within the realm of cryptocurrency.
Additionally, by leveraging social media platforms and AI-generated content, scammers can orchestrate intricate pump-and-dump schemes, artificially inflating the value of tokens and subsequently profiting from their holdings, leaving numerous investors at a loss.
Investors have long been warned about the risks of deepfake crypto scams, which use AI to alter faces in videos and photos or manipulate audio, producing highly realistic content that appears to show endorsements from influencers or other well-known personalities.
The Federal Bureau of Investigation (FBI) recently issued a stern warning regarding the escalating threat of “deepfakes” being utilized in cyber extortion.
According to reports, malicious actors employ deepfakes to manipulate images or videos, often sourced from social media accounts or the public internet, to create deceptive sexually explicit content.
In a similar vein, Twitter recently suspended the account of a popular meme coin-associated AI bot named “Explain This Bob” after Elon Musk denounced it as a “scam.”
The automated Twitter account “Explain This Bob” utilized OpenAI’s latest large multimodal model, GPT-4, to comprehend and respond to tweets from users who tagged the account.
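The bot’s source code has not been published, so the following is only a minimal sketch of how a mention-reply bot of this kind could be wired together, assuming Twitter API v2 access through the tweepy library and OpenAI’s chat completions endpoint; the account ID, credentials, and prompt shown here are placeholders, not details of the actual bot.

```python
# Illustrative sketch only: the real "Explain This Bob" code is not public.
# Assumes Twitter API v2 access via tweepy and the OpenAI chat completions API.
import os

import tweepy
from openai import OpenAI

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
twitter = tweepy.Client(
    bearer_token=os.environ["TW_BEARER_TOKEN"],
    consumer_key=os.environ["TW_CONSUMER_KEY"],
    consumer_secret=os.environ["TW_CONSUMER_SECRET"],
    access_token=os.environ["TW_ACCESS_TOKEN"],
    access_token_secret=os.environ["TW_ACCESS_TOKEN_SECRET"],
)

BOT_USER_ID = 123456789  # placeholder: the bot account's numeric user ID


def reply_to_mentions() -> None:
    """Fetch recent mentions and post a GPT-4-generated reply to each one."""
    mentions = twitter.get_users_mentions(id=BOT_USER_ID, max_results=10)
    for tweet in mentions.data or []:
        completion = openai_client.chat.completions.create(
            model="gpt-4",
            messages=[
                {"role": "system", "content": "Explain the tweet you are given in plain language."},
                {"role": "user", "content": tweet.text},
            ],
        )
        reply_text = completion.choices[0].message.content[:280]  # stay within the tweet limit
        twitter.create_tweet(text=reply_text, in_reply_to_tweet_id=tweet.id)


if __name__ == "__main__":
    reply_to_mentions()
```

In practice, a bot like this would poll for mentions on a schedule or listen to a stream rather than run once, but the basic loop – read the tagged tweet, generate a reply with the model, post it back – is the same.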
While AI can offer positive applications in the cryptocurrency industry, such as streamlining tedious aspects of crypto development, users must remain vigilant and exercise caution when considering investments in new projects.
“Cybercriminals employ AI in crypto scams, deploying sophisticated bots impersonating family members, raising concerns about the security of the crypto industry and fostering a skeptical mindset,” highlighted a Twitter user, GarageID.