The TAKE IT DOWN Act Becomes Law: What It Means for Survivors of Domestic Violence
Following overwhelming bipartisan support in both the U.S. House of Representatives and Senate, President Donald Trump today signed into law the TAKE IT DOWN Act, a landmark federal measure aimed at addressing the nonconsensual distribution of intimate imagery (NDII), often referred to in public discourse as “nonconsensual pornography” or “revenge porn.” The law represents a long-awaited step forward in establishing nationwide protections for victims of image-based sexual abuse.
For survivors – especially those navigating the intersecting harms of domestic violence and digital exploitation – this moment is both deeply significant and fraught with questions. While the law provides powerful new tools to pursue justice and seek the removal of nonconsensual intimate images, it may also introduce challenges for survivor privacy and protected expression.
As a leading national voice for survivors of domestic violence and technology-facilitated abuse, the National Network to End Domestic Violence (NNEDV) strongly supports the law’s intent to combat this devastating form of abuse. At the same time, we believe that its implementation must be survivor-centered, privacy-protective, and constitutionally sound.
Why This Legislation Matters So Deeply to Survivors
Victims and survivors of domestic violence have long recognized that the threat image-based sexual abuse poses is not hypothetical: it is a reality with profound impacts on their lives. For years, NNEDV’s Safety Net Project has provided technical assistance and guidance to support victims whose abusive partners and ex-partners were using intimate images as tools of coercion, sextortion, and retaliation. While survivors of domestic violence have indeed borne much of this harm, they are far from alone.
In recent years, the rapid rise of generative artificial intelligence (AI) and particularly deepfake technology (AI-generated synthetic media that can often look authentic) has dramatically expanded the reach of this abuse. Today, even people who have never taken or shared an intimate image can find themselves targeted. AI tools now make it possible to fabricate hyper-realistic sexual images – what the Cyber Civil Rights Initiative (CCRI) terms “digital forgeries” – of anyone. From celebrities like Taylor Swift to women politicians to middle school students, no one is immune to this abuse or its long-lasting, devastating impacts.
As AI-generated nonconsensual intimate images became more common, public perception underwent a necessary shift. The victim-blaming and “slut-shaming” that had long impeded legal reform lost much of their rhetorical force. NDII could no longer be misperceived as the consequence of a victim’s personal failing; it had to be recognized for what it is: a widespread vulnerability with insufficient legal protections to mitigate its harms, and a form of abuse that was never about what victims or survivors had done, but about what perpetrators could do to them.
In that moment, the harm became not only more visible, but universally imaginable. The realization that now anyone could be a victim of nonconsensual intimate imagery transformed NDII into a broadly recognized civil and human rights concern, and a systemic threat. Victims, survivors, and advocates had long known this truth; now, the rest of the world was finally beginning to catch up.
Delays in removal amplify the harms of these images: harassment, emotional and psychological distress, reputational damage, and threats to employment, housing, and safety. Despite the gravity of these impacts, many survivors currently face a long and frequently fruitless process when trying to have these images removed, often across multiple platforms. As organizations like CCRI and NNEDV have long advocated, swift and reliable mechanisms for takedown and accountability are essential. Importantly, a federal law that creates this process protects survivors from having to navigate an uneven patchwork of state laws to seek recourse.
The TAKE IT DOWN Act is the first federal law to meaningfully tackle this issue, and it does so through a two-pronged approach. The Act makes it a federal crime to publish nonconsensual intimate imagery using an interactive computer service – any website, app, or platform that lets people share, post, or respond to content from others. It draws heavily from the language of the SHIELD Act and expands its reach by applying not just to authentic nonconsensual intimate images, but to certain AI-generated, or synthetic, NDII content as well. The Act also establishes a rapid takedown process, requiring certain platforms that host user-generated content to remove reported material within 48 hours of receiving a complaint. Failure to do so could be considered an unfair or deceptive practice under the jurisdiction of the Federal Trade Commission (FTC).
A Crucial Step Forward, But Not Without Risk
There is no question that the TAKE IT DOWN Act represents a historic step forward toward recognition of the harms caused by image-based abuse and a meaningful attempt to address them. For years, victims, survivors, and advocates have called for both enhanced accountability and an effective process by which to remove nonconsensual intimate images from platforms. In that respect, this law is responsive to a longstanding demand.
However, it also raises serious implementation concerns. As written, TAKE IT DOWN’s broad mandates and lack of built-in safeguards could unintentionally harm the very people it aims to protect. Survivors of domestic violence already navigate circumstances with elevated levels of risk – physical, financial, reputational, emotional – and novel legislative protections must avoid introducing new vulnerabilities into their lives.
Numerous stakeholders have expressed concerns about how TAKE IT DOWN’s provisions will function in execution. Some of these key challenges, which NNEDV is monitoring closely as this law goes into effect, include:
Exploitable Takedown System Could Silence Lawful Expression: Because the law requires platforms to remove reported NDII within 48 hours, the tight timeline may push them to prioritize speed over accuracy, leading to automated or rushed takedowns regardless of whether the reported content is unlawful. As a result, bad-faith actors may exploit the system to suppress legal, consensual content that they simply dislike under the guise of NDII reporting, while platforms may broadly take down all reported content to minimize their own legal risk.
Loopholes for the Worst Actors: The definition of “covered platforms” applies only to sites that host user-generated content. As such, it exempts those that “curate” or pre-select content, including many known revenge porn and deepfake sites that host and distribute abusive content by design. This creates a gap in coverage that could render the law toothless against some of the worst offenders.
Privacy Risks for Survivors: Because the law lacks clarifying language to indicate otherwise, TAKE IT DOWN could pressure platforms to apply takedown compliance mechanisms to private communications, such as direct messaging and cloud storage, by scanning them with hash-matching or content-filtering technologies. These technologies are imperfect (see the sketch after this list), and their use in settings where users have a reasonable expectation of privacy may undermine user trust and ultimately compromise survivor privacy.
Lack of Procedural Safeguards: The law does not require complainants to provide basic information, such as the URL of the offending content, nor does it impose penalties for false or malicious reports, even in cases of repeated bad-faith reporting. The result could be a flood of vague or abusive complaints that strain moderation systems and harm legitimate users.
Ambiguity Around Deepfakes: The law defines harmful deepfakes using a “reasonable person” standard, which may not cover manipulated content that is obviously inauthentic but still deeply harmful, especially when used to humiliate survivors.
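To make the hash-matching concern above concrete, below is a minimal, hypothetical sketch of how this kind of scanning typically works. This is not any platform’s actual system: the 8×8 “average hash,” the blocklist, and the MATCH_THRESHOLD cutoff are illustrative assumptions, and real deployments (such as PhotoDNA-style perceptual hashing) are considerably more sophisticated. The sketch shows the underlying trade-off: exact hashes miss trivially edited copies, while perceptual hashes can flag unrelated images.

```python
# A minimal, illustrative sketch of hash-matching content scanning.
# Assumptions: Pillow (PIL) is installed; "blocklist" is a hypothetical
# set of perceptual hashes of known NDII. Not any platform's real system.

import hashlib
from PIL import Image


def exact_hash(path: str) -> str:
    """Cryptographic hash: matches only byte-identical files.
    Re-encoding, resizing, or cropping an image defeats it entirely."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def average_hash(path: str, size: int = 8) -> int:
    """Simple 64-bit perceptual 'average hash': survives re-encoding and
    mild edits, but visually different images can produce similar hashes."""
    img = Image.open(path).convert("L").resize((size, size))  # grayscale 8x8
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:  # one bit per pixel: brighter or darker than average
        bits = (bits << 1) | (p > avg)
    return bits


def hamming_distance(h1: int, h2: int) -> int:
    """Number of differing bits between two perceptual hashes."""
    return bin(h1 ^ h2).count("1")


# Hypothetical cutoff a platform would have to choose. Too strict, and
# lightly edited copies slip through (false negatives); too loose, and
# lawful, unrelated images get flagged and removed (false positives).
MATCH_THRESHOLD = 5


def matches_blocklist(path: str, blocklist: set[int]) -> bool:
    """Flag an image if its perceptual hash is 'close' to a known hash."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= MATCH_THRESHOLD
               for known in blocklist)
```

Whatever threshold a platform chooses, it trades false negatives against false positives. Applied at scale to private messages or cloud storage, even a small false-positive rate means lawful, private content being scanned and potentially removed, which is precisely the privacy concern raised above.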
Moving Forward: Survivor-Centered Implementation Is Key
The passage of the TAKE IT DOWN Act is a moment of both hope and urgency. Victims and survivors of nonconsensual image abuse deserve legal protections that address the serious harms of nonconsensual intimate images while mitigating the risk of undermining any survivor’s free expression or privacy rights.
As always, NNEDV will continue working to ensure that implementation processes center survivors’ needs. We urge policymakers, agencies, and platforms to:
Develop clear, transparent reporting mechanisms that are trauma-informed and accessible;
Establish robust safeguards to protect against false reporting and ensure due process; and
Close regulatory loopholes to ensure all exploitative platforms, not just mainstream social media, are held accountable.
As this law is implemented, it is incumbent on us to ensure that survivors and the coalitions supporting them are both heard and protected. We will continue monitoring developments, sharing guidance, and advocating for survivor-centered processes.
If you or someone you know has been impacted by image-based sexual abuse, we encourage you to visit the Cyber Civil Rights Initiative’s Safety Center for a step-by-step guide on how to request the removal of harmful content. For questions, technical assistance, or additional support – especially in the context of domestic violence – please don’t hesitate to contact us.