In a rare show of bipartisan agreement, President Donald Trump signed the Take It Down Act into law on Monday, marking the first time the federal government will directly regulate the distribution of nonconsensual explicit imagery—including deepfakes and revenge porn—online.
The new legislation imposes criminal penalties for sharing sexually explicit photos or videos without the subject’s consent, regardless of whether the content is real or AI-generated. Violators can face fines, jail time, and mandatory restitution to victims.
Tighter Rules for Social Media Platforms
Under the Take It Down Act, social media companies and internet platforms are now required to remove flagged content within 48 hours of receiving a victim’s request. The law also mandates that platforms actively work to remove any duplicate or derivative versions of the offending material.
While many states have already passed laws banning revenge porn and AI-generated sexually explicit content, this is the first time the federal government has taken formal action. It signals a major shift in how online content is governed—and how platforms are held accountable.
President Trump, speaking during a signing ceremony at the White House, said the law will help stop online abuse before it ruins lives. “This will be the first-ever federal law to combat the distribution of explicit imagery posted without subjects’ consent,” he said. “We will not tolerate online sexual exploitation.”
Behind the Bipartisan Push
The bill gained support from both parties. First Lady Melania Trump publicly lobbied for its passage, while Senators Ted Cruz (R-Texas) and Amy Klobuchar (D-Minn.) co-sponsored the legislation.
Cruz said the issue became personal after learning that Snapchat took nearly a year to remove an AI-generated deepfake of a 14-year-old girl. That inaction by tech companies, he said, exposed a gap in the legal system that Congress needed to address.
While the law has been praised by victims’ advocacy groups and anti-exploitation organizations, some digital rights advocates are raising red flags. Critics argue the legislation is overly broad and could be used to censor legal adult content or suppress free speech under vague standards.
The American Civil Liberties Union (ACLU) and other digital rights groups say clearer definitions and safeguards are needed to prevent misuse and unintended consequences—especially in cases where content might be educational, satirical, or consensual but misreported.
Still, for now, the Take It Down Act represents a milestone in regulating harmful content powered by AI and deepfake technologies—and in setting new standards for how tech platforms respond to digital exploitation.