
Senate subcommittee hears bipartisan push to strengthen law on child sexual abuse material, platform reporting and takedown

March 11, 2025


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Senator Josh Hawley, chairman of the Senate Judiciary Subcommittee on Crime and Counterterrorism, opened the hearing by urging Congress to give child victims greater legal recourse and platform accountability. "It is time to allow victims to have their day in court," Hawley said, describing legislation he and others support to expand victims' rights, strengthen reporting to the National Center for Missing and Exploited Children (NCMEC) and create a process to take down and prevent republishing of CSAM.

The hearing presented witness data and survivor testimony to explain why lawmakers say new rules are needed. Michelle DeLon, president and chief executive officer of the National Center for Missing and Exploited Children, told the subcommittee that reporting to NCMEC's CyberTipline rose to about 36.2 million reports in 2023 and fell to 20.5 million in 2024; when adjusted for report "bundling," the organization estimates about 29.2 million incidents in 2024, a net decrease of roughly 7 million reports for the year. DeLon said platforms' inconsistent reporting, along with changes such as end-to-end encryption on some services, contributes to gaps, and she called for statutory minimum reporting details and transparency requirements.

Witnesses described proposed provisions in the Stop CSAM Act and related bills. DeLon summarized the bill as expanding victim protections in court, broadening restitution, strengthening the CyberTipline, establishing a report-and-remove program for survivors, and creating a child online protection board to adjudicate takedown requests. John Pizzaro, CEO and co-founder of Raven and a former New Jersey State Police commander, told senators the bill would help protect victims' privacy in discovery and preserve remedies for survivors who come forward years after their abuse.

International Justice Mission executive director John Tinago said live‑streamed child sexual exploitation and “pay‑per‑view” CSAM are major sources of new material and emphasized that the problem crosses borders: IJM’s research estimates hundreds of thousands of children are victimized through live streaming in the Philippines, with a significant share of cases involving U.S. offenders. "This is pay per view CSAM," Tinago said, adding that companies can detect many live‑streaming markers but often do not.

Greg Shiller, chief executive officer of the Child Rescue Coalition and a former federal prosecutor, and Raven’s Pizzaro described investigative and prosecutorial limits that the bills aim to address: funding and statutory authority for guardian ad litem appointments, appointment of trustees to hold restitution for minors and foreign victims, and tighter requirements for the information platforms must submit when they report to NCMEC so law enforcement can act promptly.

Survivor Taylor Saenz described being targeted as a teenager, the rapid spread of her intimate images, and the mixed responses she received when seeking help. Saenz credited a cybersecurity detective and her prosecutor with supporting her criminal case, and she urged lawmakers to require platforms to remove non-consensual intimate images quickly. "This happened to me, not because of me, and it does not define me," Saenz said.

Members pressed witnesses on specific concerns. Senators asked whether platform transparency rules, a statutory "duty of care," or age-verification requirements should be adopted; witnesses urged a mix of prevention, design changes, clearer reporting standards, and stronger civil and privacy protections for victims. Multiple witnesses warned that generative artificial intelligence is already amplifying CSAM: DeLon said NCMEC has seen a sharp increase in AI-generated child sexual imagery and asked for statutory authority to share data with technology firms developing detection tools. Pizzaro said he has seen examples of AI being used to create abusive images and estimated a sharp year-over-year increase in AI-created material.

Senators from both parties described complementary bills related to takedown and prevention; testimony referenced the REPORT Act (described as recently enacted), the Kids Online Safety Act, the Take It Down Act and the SHIELD Act. Witnesses urged that some reporting and takedown standards now found in guidance be codified into statute to ensure uniformity across providers.

The subcommittee did not take formal votes. Senators and witnesses repeatedly returned to two themes: that platforms currently supply uneven and sometimes inadequate information to NCMEC and law enforcement; and that survivors need stronger, consistent privacy protections, easier takedown remedies and legal paths to civil relief and restitution.

The hearing closed with an appeal from members and witnesses for Congress to act on multiple fronts—platform transparency and liability, stronger reporting, funding for guardians ad litem and restitution trustees, and measures to address the growing threat from generative AI—so law enforcement and survivors can better prevent and respond to online child sexual exploitation.