Newly passed Take It Down Act will make posting deepfake revenge porn a federal crime


The US Senate has unanimously passed the Take It Down Act, a new bill aimed at protecting victims of AI-generated revenge porn, making it a federal crime to publish non-consensual intimate deepfakes online.

The bipartisan bill, introduced and co-written by Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN) in June, will give social media companies and other websites just 48 hours to remove deepfake imagery when reported by the victim.

The Take It Down legislation (S. 4569) “will give innocent victims – many of whom are teenage girls – the opportunity to seek justice against deviants who publish these abusive images,” Cruz said about the bill.

“It will also hold big tech accountable by making sure websites remove these disgusting fake videos and pictures immediately,” he added.

Although the bill has moved through the Senate, Cruz points out that Take It Down still needs to pass the House to become law in the US.

Specifically, S. 4569 would criminalize the publication of all non-consensual intimate imagery (NCII), including imagery generated by artificial intelligence.

Additionally, the law would require social media platforms and other websites to develop official procedures for removing the content within 48 hours of notice from a victim.

The bill further clarifies that even if a victim consents to the creation of a pornographic image of themselves, it does not mean that the victim has consented to its publication.

No federal law on the books

Currently, each of the fifty US states has its own specific legislation explicitly covering deepfake NCII, rather than one overarching federal law.

This can create problems for victims, as each state’s laws vary in how they classify the crime and its penalties, leading to uneven criminal prosecution, Cruz’s brief states.

Victims have also struggled to have the unsavory fake images removed from websites, “increasing the likelihood the images are continuously spread and victims are retraumatized.”

Commonly referred to as “deepfake revenge pornography,” the content can include any illicit images, videos, or audio recordings manipulated with AI.

“The results can appear incredibly realistic. Many people cannot tell which parts of the manipulated videos or photos are real and which are fake,” according to Security.org’s 2024 Deepfakes Guide and Statistics.

What’s more, besides reputational and emotional harm, these pornographic deepfakes can be used for nefarious purposes such as disinformation, blackmail, harassment, and financial fraud, the guide notes.

Statistics show that deepfake fraud incidents have “increased tenfold between 2022 and 2023.”

In 2022, Congress passed legislation allowing victims to file civil suits against perpetrators, yet winning such a suit can be difficult due to the lengthy process, differing jurisdictions, the expense involved, and the potential anonymity of those responsible for the crime.

Big tech companies, dating apps, and nonprofits supporting the bill include Google, Microsoft, Meta, TikTok, IBM, Bumble, Match Group, RAINN (Rape, Abuse & Incest National Network), the National Center for Missing and Exploited Children (NCMEC), and the Entertainment Software Association, which represents the video game industry.

The bill was received by the House on December 4th, the day after it passed in the Senate, and is currently being “held at the desk,” which means it is now available for immediate consideration.

“For young victims and their parents, these deepfakes are a matter requiring urgent attention and protection in law. I will continue to work with my colleagues in Washington to move this common-sense bipartisan legislation quickly through the House and to the President’s desk so it can be signed into law,” Cruz said.