Taylor Swift has become the subject of non-consensual deepfake pornography as sexually explicit images of the star flood social media platforms.
Deepfake pornography has reached epidemic proportions, with celebrities and ordinary people alike appearing in sexually explicit content made without their consent.
On Wednesday, January 24th, sexually explicit AI-generated images of artist Taylor Swift flooded X.
One image, shared by a user on X, was viewed a staggering 47 million times before the platform suspended the account, according to The New York Times.
Despite best efforts to remove the explicit images, they managed to proliferate across different social media platforms.
Swift’s fans banded together to protest the images, flooding the platform with the phrase “Protect Taylor Swift.”
Recently, a deepfake of actor Jennifer Aniston was seen promoting a YouTube scam.
While many agreed that the video had plenty of tell-tale signs of being fake – from bizarre facial expressions to lip movements not entirely matching what was being said – others voiced apprehension at the increasing quality of such scams.
In 2023, almost 280,000 synthetic, non-consensual exploitative videos were found on the clearnet, according to ‘The State of Deepfakes and the exponential rise of nonconsensual synthetic adult content in 2023.’
The combined runtime of these videos totaled 1,249 days, and they amassed over 4.2 billion views, the report states.
Furthermore, the total hours of video hosted on leading deepfake NCEI websites jumped from 9,300 to nearly 14,000.
The creation of non-consensual deepfake pornography can have profound effects on victims.
“Many victims have experienced depression, thoughts of suicide, PTSD, and sexual trauma after their images were falsified,” said researcher Sophie Maddocks.
Even if the images are known to be fake, this type of content sets out to “sexually shame and humiliate victims who may struggle to reach a place of psychological safety,” as their images cannot be erased from the internet.
This harmful trend has already had far-reaching adverse effects on celebrities and private individuals alike, and it is likely to become even more problematic as the technology improves.