Citizen Portal


Senate Judiciary hearing urges Section 230 reform, broad federal action to protect children online

2439980 · February 19, 2025


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Witnesses, victims and senators told the Senate Judiciary Committee that voluntary tech measures have failed and urged Congress to revise Section 230 of the Communications Decency Act and pass multiple bills to give parents, states and law enforcement new tools to prevent child exploitation online.

The Senate Committee on the Judiciary heard emotional testimony and bipartisan calls for legislative action on online child safety at a hearing convened by Chairman Chuck Grassley and Ranking Member Dick Durbin.

The hearing, featuring family members of victims, lawyers and child-safety experts, pressed for reforms to Section 230 of the Communications Decency Act and for passage of several bills—some previously reported out of committee or passed by the Senate—to expand enforcement, civil remedies and technical requirements for platforms.

Why it matters: witnesses and senators said current industry self-regulation and the legal framework embodied in Section 230 have left victims without a realistic path to court and have diminished public and law-enforcement oversight. "We cannot wait," Ranking Member Dick Durbin said, noting the committee had previously reported multiple bills and that Congress must give families and states more tools. Representative Brandon Guffey, who lost his son to an online predator, told the committee that platforms' actions amount to "profiting at the expense of our children's privacy, our children's safety, and our children's health," and urged lawmakers to hold platforms accountable.

Most important facts: committee members and witnesses repeatedly cited statistics from the National Center for Missing and Exploited Children (NCMEC) and law-enforcement groups: about 36.2 million cyber tips were submitted to NCMEC in 2023; NCMEC reported roughly 61,000 instances of AI-generated CSAM in 2024; witnesses said the cyber tipline receives roughly 100,000 reports per day. Law-enforcement witnesses and advocates described an explosion in suspected child sexual-abuse material (CSAM) and in platforms' use by traffickers and sellers of illicit drugs such as fentanyl.

Testimony and proposals: Representative Brandon Guffey described his son's death and the gaps he said remain in platforms' responses, saying platforms removed some offending accounts but left others active. "Either get in line or get offline," Guffey told the committee. Attorney Carrie Goldberg described lawsuits she has brought against multiple platforms and said Section 230 has regularly been treated by courts as an immunity that ends cases at the motion-to-dismiss stage, preventing discovery that might reveal platforms' internal practices. Professor Mary Leary, a former federal prosecutor, advised preserving the "good Samaritan" element of Section 230 while removing the Section 230(c)(1) protections that have been interpreted as near-absolute immunity, and argued for allowing state attorneys general to enforce state law.

Lawmakers and witnesses discussed a slate of measures that have been proposed or previously advanced out of committee: the Report Act (which the committee said was later signed into law and strengthened NCMEC's cyber tipline), the Kids Online Safety Act (said to have passed the Senate 91–3 but stalled in the House), the Stop CSAM Act (to be reintroduced), the Shield Act, the Take It Down Act, the No Fakes Act targeting AI-generated likenesses, and other bills addressing age verification, app-store accountability and civil remedies. Senator Dick Durbin and others urged a recklessness standard (rather than a knowing standard) when imposing duties on platforms, while Professor Leary characterized recklessness as an appropriately demanding legal standard that would still allow meritorious cases to proceed.

Technical and enforcement problems: witnesses described technological challenges, including AI tools that can create or alter images to appear to show children, the difficulty of distinguishing AI-generated material from authentic images without forensic tools, inconsistent reporting to NCMEC by platform providers, inadequate voluntary cooperation with law enforcement, and underfunding of Internet Crimes Against Children (ICAC) task forces (witnesses said ICAC is authorized at $60 million but that only about $31.9 million has been appropriated). John Pizarro, CEO of Raven, said many IP addresses trading CSAM are not being investigated and urged reauthorization and funding of law-enforcement task forces.

Bipartisan emphasis and remaining hurdles: senators from both parties expressed urgency but acknowledged political and procedural obstacles. Committee members said prior bipartisan committee votes show the issue can unite the panel, but witnesses and senators described a powerful industry lobbying presence that has impeded enactment in the House and slowed broader reform.

What did not happen at this hearing: no formal committee votes were taken on new bills at this session. Several sponsors said they plan to refile or reintroduce measures and requested that the chair schedule markups.

Ending note: senators closed by leaving the record open for further statements and questions for the record; multiple senators asked for additional written follow-up on AI-generated CSAM, platform parental controls, and harmonized safety tools across devices.