Muah.ai, a build-your-own-artificial intelligence (AI) girlfriend website, has suffered a data breach, exposing over 1.9 million records along with users' twisted fantasies.
One hacker was looking to engage in some adult activity, or, as he put it, to “jerk off,” when he stumbled across a site that looked vulnerable.
After tinkering with things a bit, the hacker found that the site was insecure, stumbling across a goldmine of personal information and some extremely disturbing data.
But where did he find this data? On a porn site or some dodgy forum? Well, he found it on a not-safe-for-work chatbot site, where people can make their own AI girlfriends.
Muah.ai is a chatbot that works like your own personalized AI girlfriend. It sounds good at first.
Finally, you’ve found a platform that can satisfy your sexual fantasies, a site that can also be interactive and personal.
But there’s a problem. Nothing is impenetrable, and all data, particularly sensitive data such as this, can be stolen.
That’s exactly what happened to Muah.ai when a hacker found a database full of 1.9 million records, including email addresses and prompts used to generate AI images, according to Have I Been Pwned.
“I went to the site to jerk off (to an *adult* scenario, to be clear) and noticed that [Muah.ai] looked like it was put together pretty poorly,” the hacker told 404 Media.
The exposed prompts included run-of-the-mill sexual fantasies, but some were far more twisted.
According to 404 Media, which first reported the story and reviewed the data taken from Muah.ai, some of the prompts directed chatbots to roleplay sexual fantasies involving children.
The outlet reported that one individual prompted the chatbot to create an incestuous infant orgy featuring newborn babies and young children. Whether Muah.ai fulfilled this request is unknown as 404 Media and the hacker couldn’t confirm whether the request went through. Other requests involved incest and the abuse of young children.
Those who prompted the illicit sexual fantasies may have a difficult time after this hack, as identifiable email addresses linked to real people were found.
One of Muah.ai's administrators, Havard Han, told 404 Media that the attack was supposedly orchestrated by competitors in the uncensored AI chatbot industry.