Citizen Portal

House hearing spotlights Take It Down Act as tool to remove nonconsensual intimate images of minors

2778690 · March 26, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Witnesses and lawmakers urged Congress to pass the Take It Down Act to create notice-and-removal obligations for platforms, fill investigative gaps around non‑CSAM exploitative images, and give victims new civil and criminal remedies.

Take It Down Act supporters told the House Energy and Commerce subcommittee that existing law leaves a gap allowing exploitative but not legally defined child sexual abuse material (CSAM) and AI‑generated intimate images of minors to circulate without effective law‑enforcement or civil remedies.

Why it matters: Witnesses said the bill would let victims and their advocates seek removal and would criminalize certain exploitative publications that fall short of current CSAM definitions, while establishing reporting standards so law enforcement receives usable information.

National Center for Missing and Exploited Children Chief Legal Officer Yiota Souras told members that platforms often submit cyber‑tip reports lacking basic investigative details and that “Take It Down closes this gap by criminalizing the knowing publication of these images, whether real or created by Nudify apps or AI technology.” She urged a notice‑and‑removal mechanism and stronger reporting information requirements.

Dawn Hawkins of the National Center on Sexual Exploitation described survivors’ primary need as removal: “For many survivors, it’s not about prosecution; it’s about making sure those images do not continue to circulate.” Hawkins and others said voluntary takedown processes are inconsistent and that codifying a process would provide “teeth” to compel action by companies that currently ignore removal requests.

Lawmakers from both parties recounted constituent tragedies to underscore the bill’s urgency. Subcommittee members and the full committee’s leaders repeatedly called for rapid passage of the Take It Down Act alongside other bills addressing children’s online safety.

What the bill would do: Witnesses and some members described three primary effects—(1) create a new criminal pathway for exploitative images that do not meet current CSAM thresholds, (2) require platforms to remove nonconsensual intimate images of minors after receiving notice, and (3) standardize cyber‑tip reporting to include victim/offender identifiers and content evidence so law enforcement can act.

Open questions and limits: Panelists discussed implementation details, including how to define timeframes for takedown and how to avoid impinging on lawful speech. Members asked witnesses about platform responsiveness; Souras said NCMEC publishes a list of companies that are “unacceptably slow” or nonresponsive and recommended statutory minimums for the information included in cyber‑tip reports.

The hearing closed with lawmakers pledging to move the bill quickly and to coordinate with Senate counterparts on final text. No formal action or vote occurred in the hearing.

The bottom line: Advocates said the bill would give victims a practical remedy—removal—while lawmakers signaled bipartisan support for advancing the measure through committee and to the House floor.