By a vote of 409–2, the House passed the Take It Down Act, a bill targeting nonconsensual intimate imagery, including AI-generated deepfake pornography. The legislation makes it a federal crime to knowingly publish explicit images or videos of a person, whether authentic or AI-generated, without their consent.
Under the law, online platforms must remove flagged content within 48 hours of a valid request from the victim. Platforms that ignore takedown requests face enforcement by the Federal Trade Commission, which can treat noncompliance as an unfair or deceptive practice. Lawmakers argue the measure is long overdue given how rapidly AI-generated imagery is advancing.
The bill has drawn praise as a historic step to protect digital privacy and human dignity. It enjoys bipartisan support and has backing from President Trump, an unusual coalition in today's political climate.
Advocates note that women, children, and public figures are disproportionately targeted by deepfake pornography, and that victims frequently suffer harassment, reputational damage, and lasting psychological harm.
Supporters framed the bill as a matter of drawing a clear line: nobody, they argued, should wake up to find their face on a fake pornographic video circulating online without their consent.
Only two lawmakers opposed the bill, citing concerns over free speech and potential government overreach. Supporters counter that the measure strikes a careful balance between holding platforms accountable and preserving free expression.
The Senate passed the Take It Down Act unanimously in February, so the bill now heads to President Trump's desk, and he has pledged to sign it.
If enacted, this law would fundamentally change how the U.S. combats digital exploitation and the misuse of artificial intelligence, offering stronger protections for victims and clearer responsibilities for online platforms.