Google and Bing placing nonconsensual explicit deepfake content at top of search results – media


Nonconsensual deepfake pornography is regularly found at the top of search engine results on Google and Microsoft’s Bing, NBC News reports.

Nonconsensual deepfake pornography uses someone's likeness without their consent, digitally manipulating images or video to make it appear as if that person is engaged in a sexual act.

Often, the faces of celebrities will be swapped with those of porn actors engaging in sexual activities.

NBC News found that deepfake pornographic images, which featured the likenesses of female celebrities, were among the first images to surface on popular search engines such as Google and Bing.

Typing in the names of female celebrities and keywords such as “deepfake,” “deepfake porn,” or “fake nudes” produced various results for the news outlet.

NBC News combined the word “deepfakes” with 36 popular female celebrity names on Google and Bing.

Google returned nonconsensual deepfake videos and images for 34 of those searches, while Bing did so for 35.

Furthermore, Googling “fake nudes” surfaced various applications and programs for creating and viewing nonconsensual deepfake pornography among the first six results, NBC News reports.

Searching “fake nudes” on Bing likewise produced various nonconsensual deepfake tools and websites.

NBC News found that these tools and websites, where users could view and create such pornographic content themselves, appeared before any news results explaining the harmful nature of nonconsensual deepfakes.

Google and Bing both offer a form through which victims of deepfake porn can request that the content be removed.

However, NBC News highlights that neither Google nor Microsoft’s Bing appears to be proactively policing its search results for this kind of abuse.

The pressure on social media platforms to combat nonconsensual deepfake adult content has been mounting, given the recent surge in such imagery. Twitch found itself at the center of attention after its streamer, Brandon “Atrioc” Ewing, accidentally revealed that he was watching sexually explicit deepfakes of his fellow streamers during a live stream. While Twitch insisted that the problem doesn't plague the platform itself, it introduced policy changes to address the issue.

Evidently, deepfake pornography has grown into something of an epidemic. According to Maddocks’ report, titled 'The State of Deepfakes and the Exponential Rise of Nonconsensual Synthetic Adult Content in 2023,' there were almost 280,000 synthetic, nonconsensual exploitative videos on the clearnet in 2023.

The hours of video posted on leading deepfake nonconsensual explicit image (NCEI) websites have jumped from 9,300 to nearly 14,000.




