
President Donald Trump has signed the Take It Down Act into law, criminalising the distribution of nonconsensual intimate images, including those generated with artificial intelligence. The legislation requires online platforms to remove such content within 48 hours of a victim’s request, with enforcement overseen by the Federal Trade Commission.
The Act, formally titled the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, received overwhelming bipartisan support, passing the House 409-2 and clearing the Senate by unanimous consent. First Lady Melania Trump, who has been an advocate for online safety, added her signature to the bill at the White House ceremony, a largely symbolic and highly unusual step for a first lady.
The law imposes penalties of up to three years in prison and fines of up to $50,000 for those who knowingly publish or threaten to publish intimate images without consent. It also requires platforms to make reasonable efforts to remove identical copies of the offending content.
Supporters of the legislation include major technology companies such as Meta, Google, and Microsoft, as well as advocacy groups focused on combating online abuse. Senators Ted Cruz and Amy Klobuchar, who introduced the bill, cited cases such as that of Elliston Berry, a Texas teenager whose AI-generated explicit images were circulated online, as an impetus for the law.
However, the Act has faced criticism from digital rights organisations, including the Electronic Frontier Foundation and the Center for Democracy and Technology. These groups have raised concerns about potential overreach, the vagueness of the law’s language, and the possibility that the takedown mechanism could be misused to suppress legitimate content or chill free speech. Critics also point to the absence of an appeals process and of safeguards against false takedown requests.
The legislation’s effectiveness may be limited by its focus on public-facing platforms, potentially leaving private forums and encrypted networks unaddressed. Additionally, the reactive nature of the takedown mechanism means that harmful content could still spread widely before removal.