Parents and lawyers urge Congress and the courts to let claims over harms from platform design proceed
Summary
Attorney Matthew Bergman and family members described three cases of teenagers harmed after interactions on social platforms and argued that certain platform design features should not be insulated by Section 230; the committee debated whether courts or new laws should decide liability for those design choices.
Families who have lost children appeared at the hearing as part of testimony brought by attorney Matthew Bergman. Bergman summarized written testimony and video clips about three young people who, he said, were targeted or materially harmed through platform interactions or content algorithms. He argued those harms stemmed from what he described as deliberate design decisions — including recommendation algorithms, notifications, and features like "streaks" and infinite scroll — that create addictive patterns in adolescents and can lead to real‑world injury.
Bergman said his litigation strategy focuses on distinguishing platform content moderation from defective product design. He cited cases such as Lemmon v. Snap (the speed filter case) and Barnes v. Yahoo as legal touchpoints that courts have used to separate content immunity from platform conduct that can create liability. He urged Congress and the courts to permit claims focused on product design and negligent features to proceed rather than allowing Section 230 to preclude them entirely.
Witnesses and many senators agreed the harms described are grave. Several senators asked whether modifying Section 230 or clarifying its scope would promote platform safety; others cautioned that broad repeal could chill speech and disadvantage small platforms. Witnesses suggested several possible legislative responses: (1) condition Section 230 protections on compliance with transparency, privacy and interoperability requirements; (2) create clear protections for academic and independent researchers; and (3) allow product‑liability or negligence claims to proceed when they concern the platform’s own design choices rather than third‑party content.
The committee heard detailed descriptions of product features alleged to contribute to harm, but it did not take any legislative action during the session. Several senators signaled interest in pairing liability clarification with privacy and interoperability measures to protect consumers while preserving avenues for speech.

