Criminal versions of the AI large language model ChatGPT have been doing a brisk trade on the dark web, and as you might expect, it’s a murky business.
KrakenLabs, cybersecurity firm Outpost24’s threat intelligence team, has spent recent months prowling the illicit corners of the internet to see what's on sale — and who's doing the selling.
Chief among the sellers noted by KrakenLabs during its investigation were the reductively named “Last,” sometimes spelled Laste, and “CanadianKingpin12,” formerly known as CanadianSmoker12.
Meanwhile, the foremost perversion of ChatGPT spotted doing the rounds appears to have been the lugubriously named WormGPT, a no-holds-barred deviation from the original that will obligingly perform tasks its ‘good’ sibling would normally refuse to carry out.
What’s in a name?
WormGPT appears to have been an early illicit adaptation of ChatGPT and was launched by Last on July 22nd. But it has by no means been the only one. FraudGPT followed in its wake, promoted days later by CanadianKingpin12, and DarkGPT, DarkBERT, and DarkBARD were also spotted in the wilds of the dark web around the same time.
If one CanadianKingpin12 advertisement last month — promoting a “new and exclusive bot designed for fraudsters, hackers, spammers, [and] like-minded individuals” — is anything to go by, dark versions of ChatGPT will obligingly write malicious code and scam pages, create hacking tools, phishing pages, and undetectable malware, and find system leaks and vulnerabilities.
The prices charged for such services vary widely, as they tend to do on illicit digital black market forums, with Last offering a month’s ‘subscription’ to WormGPT for $100, annual usage for $550, and a “private setup” for $5,000.
FraudGPT cost less for one month, with CanadianKingpin12 charging just $90 for the privilege, but a year’s subscription was costlier than WormGPT’s, priced at $700.
Meanwhile, DarkBERT, DarkBARD, and DarkGPT were spotted by KrakenLabs offering lifetime membership for $1,250, $1,000, and $200 respectively.
“Threat actors are constantly looking for new ways or paths to achieve their goals, and the use of artificial intelligence is one of these novelties that could drastically change the underground ecosystem,” said Outpost24, commenting on the findings of its research team. “The cybercrime community will see this new technology either as a business model (developers and sellers) or as products to perpetrate their attacks (buyers).”
Too much, too soon?
Curiously enough, however, Last appears to have been a victim of its own success. On August 9th, it announced the end of WormGPT via its dedicated channel on Telegram, a platform favored by cybercriminals and others who wish to avoid having their communications scrutinized — although that doesn’t appear to have deterred KrakenLabs.
Citing too much publicity as a reason for hanging up the black hat, Last declared: “From the beginning, we never thought we would gain this level of visibility, and our intention was never to create something of this magnitude.”
KrakenLabs puts a slightly different slant on it. Said visibility led to the group being outed by noted cybersecurity analyst Brian Krebs, who “published an article covering WormGPT and revealing the real name and nationality of the individual behind Last.” Perhaps in this case the cybercriminals simply decided to quit while they were ahead.
Whatever the truth, KrakenLabs notes that Telegram channels controlled by Last and DarkStux, a linked entity, closed down the day the announcement was made.
“A tool quickly achieving popularity is not always good, as with a quick rise also comes an increased chance of something going wrong,” said Outpost24. “It could either be that the infrastructure used becomes the target of a DDoS attack or even that researchers focus their investigations on the users behind its creation [who] end up being doxed and exposed.”
No honor among thieves
One other side-scam noted by KrakenLabs during its probe was a much simpler affair: that of placing bogus adverts offering AI-enabled illegal digital tools, taking payment, and never delivering the promised articles.
“The hype surrounding AI tools has also attracted many scammers who have set up websites and Telegram channels to deceive people into purchasing nonexistent access to those tools,” said Outpost24. “The rise in these scams is even more indicative of these emergent crime-enabling AIs’ popularity and interest from the underground community.”
In point of fact, even Last itself admitted in its farewell message that “anyone could reproduce what WormGPT did,” as Outpost24 put it.
“At the end of the day, WormGPT is nothing more than an unrestricted ChatGPT,” said Last. “Anyone on the internet can employ a well-known jailbreak technique and achieve the same, if not better, results by using jailbroken versions of ChatGPT. In fact, being aware that we utilize GPT-J 6B as the language model, anyone can utilize the same uncensored model and achieve similar outcomes to those of WormGPT.”
So, who’s the sucker now?