First lady pushes Congress to pass Take it Down Act to make deepfake revenge porn illegal


America’s first lady, Melania Trump, appeared before House lawmakers on Monday to urge final passage of the Take it Down Act, which would make posting intimate deepfake imagery without the depicted person’s consent a federal crime.

In her first solo appearance since President Donald Trump was sworn in on January 20th, the first lady held a roundtable discussion with lawmakers, online safety advocates, and several victims of the heinous practice.

Commonly referred to as “deepfake revenge pornography,” the content can include any illicit images, videos, or audio recordings manipulated with AI.


“BE BEST: On my way to The Hill to advocate for the Take It Down Act bill. I urge Congress to pass this important legislation to safeguard our youth,” the first lady posted on X.

Support for the Take it Down Act ties into the revival of Mrs. Trump’s “Be Best” online safety initiative for children, which was launched during Trump’s first presidency without much fanfare.

Passed unanimously by the US Senate this past December, the Take it Down Act requires social media platforms and similar websites to remove deepfake imagery within 48 hours of a victim’s report.

"In today’s AI-driven world, the threat of privacy breaches is alarmingly high, and the misuses of personal information escalates,” Mrs. Trump said, speaking from the Mansfield Room on Capitol Hill.

“It is imperative we safeguard children from mean-spirited, hurtful, online behavior. Ensuring their protection is not just a responsibility, but a vital step in nurturing tomorrow’s leaders,” she said.

The first lady then spoke of “brave” 15-year-old Elliston Berry, whose schoolmates superimposed her face onto pornographic images and posted the deepfakes across social media, where they ultimately spread worldwide.

Calling it “heartbreaking,” “overwhelming,” and “damaging” for Berry and all young girls, the first lady explained that the family had contacted the social media companies for help, “but their efforts fell on deaf ears.”

First lady and Take it Down Act victim
US First Lady Melania Trump speaks during a roundtable discussion on the "Take It Down Act" in Washington, D.C. on March 03, 2025. The "Take It Down Act" expands protections for victims of non-consensual sharing of sexual images, covering AI-generated content including deepfake pornography. Kayla Bartkowski/Getty Images

What's in the bill

Currently, deepfake non-consensual intimate imagery (NCII) is addressed through a patchwork of individual state laws rather than one overarching federal law.

Specifically, S. 4569 would criminalize the publication of all NCII, including imagery generated by artificial intelligence.

Additionally, the federal law would require social media platforms and other websites to establish official procedures for removing the content within 48 hours of notice from a victim.

The bill further clarifies that even if a victim consents to the creation of a pornographic image of themselves, it does not mean that the victim has consented to its publication.

Besides reputational and emotional harm, pornographic deepfakes can be used for nefarious purposes such as disinformation, blackmail, harassment, and financial fraud, according to Security.org’s 2024 Deepfakes Guide and Statistics.

Meta Platforms, the owner of Facebook and Instagram and a supporter of the bill, even helped create a Take it Down portal for users under 18, in partnership with the National Center for Missing and Exploited Children (NCMEC), to help combat the rise of teen sextortion scams.

Security.org statistics show that deepfake fraud incidents increased tenfold between 2022 and 2023.


“The results can appear incredibly realistic. Many people cannot tell which parts of the manipulated videos or photos are real and which are fake,” the website said.

When victims struggle or fail to have their deepfake images removed from websites, it “increases the likelihood the images are continuously spread and victims are retraumatized,” the bill’s co-sponsor, US Senator Ted Cruz (R-TX), said in December.

Cruz, Rep. Maria Salazar (R-FL), and Speaker of the House Mike Johnson (R-LA) also took part in the roundtable discussion.