A recidivist sex offender has been sentenced to almost 15 years in prison for possession of deepfake child sexual abuse material (CSAM) depicting child celebrities.
The evidence shows that James Smelko, 57, “possessed and accessed pictures that digitally superimposed the faces of child actors onto nude bodies and bodies engaged in sex acts,” the Justice Department said.
Law enforcement discovered the pictures after searching Smelko’s computer, and he was charged with possession of CSAM.
While awaiting trial, Smelko violated the conditions of his pretrial release by again accessing CSAM: “incriminating web searches and images were detected by court-mandated monitoring software installed on his cell phone.”
In November 2023, Smelko was convicted of one count of possessing child pornography and one count of accessing child pornography with intent to view.
The conviction comes at a time when organizations are scrambling to manage deepfake CSAM.
In 2023, CSAM was discovered in LAION-5B, a major dataset used to train AI image generators. The Stanford Internet Observatory revealed thousands of child sexual abuse images in the dataset, which underpins many different AI models.
According to the report, AI models such as Stable Diffusion and Google’s Imagen “were trained on billions of scraped images in the LAION-5B dataset,” which is said to have been created through “unguided crawling that includes a significant amount of explicit material.”
These images have enabled AI systems to produce realistic, explicit images of fictional children and to turn photos of clothed individuals into nude images.
In certain parts of the world, adults remain unaware that AI-generated CSAM is illegal. A recent survey revealed that 40% of respondents believed AI-generated child sexual abuse material was legal in the UK, a shocking finding that demands immediate action from governments and independent organizations.