
The websites used AI to generate nude images of women and teens and, in some instances, children.
The San Francisco City Attorney's office has sued 16 websites that create and distribute non-consensual AI-generated pornography in a first-of-its-kind lawsuit.
The unnamed websites allowed users to upload photos of real people, which AI then "undressed" to create pornographic images, San Francisco City Attorney David Chiu explained at a press conference on Thursday.
One website marketed its services by saying, “Imagine wasting time taking her out on dates when you can use website x to get her nudes.”
“Images are created without the consent of the people depicted in them and are often indistinguishable from real photos. Some sites create images of women, while others also create images of children,” Chiu added in a media briefing.
Collectively, the websites were visited more than 200 million times during the first six months of this year. Their victims range from celebrities such as Taylor Swift to schoolchildren nationwide.
For example, 16 such images of students from a Beverly Hills middle school circulated among teenagers this year. Chiu said similar incidents have occurred elsewhere in California, as well as in Washington and New Jersey.
Representatives of the San Francisco City Attorney's office emphasized that such images are often used to bully, humiliate, and threaten women and girls. The images have also fueled several extortion schemes that, in some instances, drove victims to suicidal thoughts.
The City Attorney's office alleges that the operators of these websites violated several laws banning deepfake pornography, revenge pornography, and child pornography.
Three open-source AI models were used to generate the images, including older versions of Stable Diffusion.