Non-consensual AI deepfake child porn not explicit in law, senator says


The local police department has launched an investigation into AI-generated deepfake pornography depicting 20 girls from Lancaster Country Day School.

Several teenage girls who attend the school have been victims of deepfake pornography after an unknown actor used the teens’ likenesses to create sexually explicit images.

In a news report from WGAL, a news station based in Lancaster, Pennsylvania, Lancaster County District Attorney Heather Adams declined to comment on specific details of the case. Adams did, however, say that the AI-generated images have been causing the girls distress.


During the interview, Adams noted that “under our current child pornography statute, our courts have interpreted the definition of a child to mean an actual child,” suggesting that whoever generated the nude images of the teenagers may not have broken the law.

The case raises questions about how traditional child pornography laws apply in an age of rapidly advancing technology.

“We have to look at our current existing laws and how the facts of a particular investigation fit into those current laws,” said Adams.

In cases like these, prosecutors are sometimes unable to bring charges related to non-consensual deepfake pornography. Adams said that whether a person can be charged depends on the individual case.

“It depends so much on the investigation and what we’re able to gather and then what we’re able to prove.”

This can be deeply frustrating for victims of non-consensual deepfake pornography, as the lack of consequences can erode their faith in the systems meant to protect them.

One anonymous victim spoke out about the case, saying that they don’t feel protected by the government or their school.

Despite this, there is hope: lawmakers are working to make non-consensual AI-generated pornography illegal. One official pushing for change is Senator Tracy Pennycuick, who has introduced Senate Bill 1213.


The bill aims to criminalize the creation and distribution of non-consensual AI deepfake images and to change how child pornography is defined in law.

“They’re (law enforcement) in a tough spot because it’s (AI-generated child porn) not explicit in the law,” said Senator Tracy Pennycuick.

If the legal term is changed from child pornography to child sexual abuse material (CSAM), the definition would automatically cover the use of AI to generate such imagery.

Using AI to generate non-consensual pornography is becoming increasingly common, and a growing number of cases involve children.

One US soldier was arrested for using AI chatbots to craft child sexual abuse material and faces a maximum of 20 years in prison.

Seth Herrera, 34, allegedly transported, received, and possessed media of children in violent sexual situations. Herrera also allegedly used AI chatbots to create images of children he knew.

Another case involved a Wisconsin man who was arrested for CSAM created using generative AI.

Steven Anderegg, 42, of Holmen, allegedly used the popular text-to-image model Stable Diffusion to generate sexually explicit images of minors.

A repeat sex offender was sentenced to almost 15 years in prison for possession of deepfake CSAM depicting child celebrities.

James Smelko, 57, “possessed and accessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts,” the Justice Department said.

Law enforcement discovered the pictures after searching Smelko’s computer, and he was charged with possession of CSAM.
