House passes bipartisan bill to combat explicit deepfakes, sending it to Trump to sign into law


The House passed a bipartisan bill Monday aimed at combating deepfake pornography, tackling a sensitive issue that has become a growing problem amid advances in artificial intelligence.

President Donald Trump is expected to sign the measure, which sailed through the House in a 409-2 vote, into law. 

The “Take it Down Act” would criminalize publishing nonconsensual, sexually explicit images and videos — including those generated by AI — and require platforms to remove the content within 48 hours of notice. The Senate passed the legislation by unanimous consent earlier this year.

Passing the bill is a rare legislative feat for Congress, which has been notoriously slow to keep up with the pace of technology.

And the effort attracted broad bipartisan support: Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., are the lead sponsors of the bill in the Senate, while first lady Melania Trump has used her platform to help champion it. President Trump touted the bill in a speech before Congress last month.

“Today’s bipartisan passage of the Take It Down Act is a powerful statement that we stand united in protecting the dignity, privacy, and safety of our children,” Melania Trump said in a statement. “I am thankful to the Members of Congress — both in the House and Senate — who voted to protect the well-being of our youth. Through this critical legislation and our continued focus with ‘Be Best,’ we are building a future where every child can thrive and achieve their full potential.”

Still, it has drawn some opponents. Digital rights groups have raised concerns that the measure as currently drafted could threaten free speech and privacy rights. 

Rep. Thomas Massie, R-Ky., cast one of the two “no” votes against the bill Monday, calling it “a slippery slope, ripe for abuse, with unintended consequences.”

Deepfake pornography and harassment have become more prevalent, including in schools, as AI has grown more advanced. But no federal law has explicitly banned the practice, making it harder for victims to get images removed or to hold the people who create and distribute them accountable.

One of the inspirations for the legislation was Texas teenager Elliston Berry, who woke up one morning during her freshman year of high school to learn that a male classmate had created a fake nude image of her and circulated it on social media.  

“I was completely shocked,” Berry, who was 14 at the time of the incident, said in an interview. “It wasn’t considered child pornography, although it completely is. So we were in a gray zone. No one knew what to do. School didn’t know what to do. The local sheriff didn’t know what to do.” 

The images were eventually taken down, but Berry and her mother felt there were still glaring holes in the law and found that accountability was difficult to achieve. So they connected with Cruz, one of their home-state senators, to work on a legislative fix aimed at preventing the same thing from happening to other victims. Berry was also Melania Trump’s guest for the president’s March address.

“The last thing I wanted to do was talk about it, but it’s been super healing and super encouraging knowing that I’m able to have these opportunities to speak about this and to protect so many people and to be that voice,” Berry said.
