
Senate Commerce Committee debates Section 230’s future as families, experts urge new safeguards for children

Senate Committee on Commerce, Science, and Transportation · March 18, 2026


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Lawmakers and witnesses at a Senate Commerce Committee hearing traded sharply different views on whether the 1996 liability shield known as Section 230 should be preserved, limited, or conditioned on transparency, privacy, and interoperability, while parents and plaintiffs’ attorneys urged clearer legal avenues for families seeking redress for harms to children online.

The Senate Committee on Commerce, Science, and Transportation examined whether the 1996 statute commonly called Section 230 still serves its original purpose and how Congress should respond to harms attributed to modern platforms.

Chairman Cruz opened the hearing by saying Section 230 protected early online speech but has enabled large platforms to act as “speech police,” at times suppressing dissent and cooperating with government pressure. He and other supporters of reform said targeted legislation — such as the Take It Down Act and the Terms Act mentioned in testimony — could protect children and preserve robust public debate without broadly eliminating liability protections.

Witnesses disagreed about the risks and the remedies. Daphne Keller, identified in her testimony as director of platform regulation at Stanford Law School’s program in law, science and technology, warned that repealing Section 230 or broadly stripping immunity would likely reduce online speech, raise legal uncertainty, and entrench incumbent firms by imposing litigation costs that only large companies could withstand. Keller said courts are already sorting questions such as when platform design counts as a platform’s own conduct rather than third‑party speech, and she urged Congress to prioritize reforms that empower users (for example, interoperability and middleware) and strengthen privacy rather than wholesale repeal.

Nadine Farid Johnson, policy director at the Knight First Amendment Institute, told senators that Section 230 can be preserved while conditioning its protections on demonstrable transparency, limited data practices, and researcher access. She outlined three priorities — researcher safe harbors, data‑privacy rules, and interoperability mandates — that she said could both protect users and allow smaller platforms and experimental approaches to thrive.

Matthew Bergman, founding attorney at the Social Media Victims Law Center, described lawsuits his firm has brought and the stories of three families whose children died after being targeted or harmed on social platforms. He urged Congress to clarify that deliberate design features — "infinite scroll," notifications, and other engagement mechanics — can give rise to product‑liability or negligence claims and should not be immune under Section 230.

Brad Carson, president and cofounder of Americans for Responsible Innovation, emphasized that generative AI raises different questions from third‑party content because companies design and deploy models whose outputs are produced by the firms themselves; he urged Congress to clarify liability for AI outputs and cautioned against preemptive federal immunity that could replicate the effects critics attribute to Section 230.

Lawmakers pressed witnesses on several themes: whether courts should resolve design‑liability questions or whether Congress should write clearer statutory boundaries; whether transparency mandates and narrowly drawn disclosures about moderation and algorithms could survive constitutional scrutiny; and whether conditioning Section 230 protections on privacy, interoperability, and researcher access could preserve smaller competitors while imposing stronger public protections. Senators on both sides repeatedly invoked the need to protect children and flagged bipartisan bills already under consideration, including COPPA 2.0 and measures to promote interoperability.

No formal votes were taken at the hearing. Chairman Cruz set a schedule for follow‑up written questions and adjourned the committee after thanking witnesses and the parents who testified.

Why it matters: The hearing brought into relief two consistent tensions — how to address documented harms to children and other vulnerable users while protecting free expression and preserving competitive entry — and sketched three distinct policy tracks legislators could pursue: (1) rely on courts and product‑liability law to handle design‑based harms; (2) condition Section 230 protections on demonstrable transparency, privacy, and interoperability; or (3) pursue narrower statutory remedies such as nonconsensual intimate‑image takedown rules already advanced in the Take It Down Act.

What’s next: Committee members have deadlines to submit questions for the record, and witnesses were asked to respond in writing. Several senators said they will continue drafting or refining bills that attempt to pair liability rules with transparency, data‑privacy, and interoperability requirements.