Senators press Big Tech on AI, deepfakes, youth safety and the fate of local news
Summary
Committee members used the hearing to probe how AI products and platform policies affect misinformation, deepfakes, teen safety, and the economic viability of local journalism. Google acknowledged LLM "hallucination" risks in Gemini; Meta defended policy changes and said it intends to improve provenance and labeling for AI-generated content.
Senators used the Commerce Committee hearing to press platform witnesses about artificial intelligence, deepfakes, the online safety of teenagers, and the erosion of local journalism.
Senator Amy Klobuchar and others focused on the proliferation of digitally altered political videos and the need for provenance and labeling. Klobuchar noted pending legislation that would require labeling of certain AI-generated media and asked Meta and Google whether they supported measures to mark altered content. Neil Potts said Meta directionally supports more transparency for AI-generated posts and pointed to industry efforts such as the Coalition for Content Provenance and Authenticity (C2PA), which include markers for AI content.
Senator Marsha Blackburn pressed Google on Gemini, citing a constituent example she said involved fabricated accusations produced by the model. Erickson acknowledged that "LLMs will hallucinate. It's a known issue," adding that Google has teams working to mitigate hallucinations and that the company trains its models on publicly available information.
Senators also raised youth safety. Blackburn's questioning included allegations about platforms and teen exposure to harmful content; Meta said it is investing in safety and privacy measures and would follow up with committee offices on its research and methodologies. Witnesses described differences in how platforms approach safety and said they are continuing to refine controls and enforcement.
Several senators linked AI and platform concentration to threats to local journalism and the "seed corn" of reporting. Senator Cantwell and others warned that closures of local news outlets, combined with opaque AI training and recommendation systems, risk hollowing out the information ecosystem. Google said it directs users to publishers and supports news ecosystems; Meta described grants and partnerships it has provided to some journalism organizations.
The exchange included broader questions about algorithms and engagement. Senators asked whether algorithms designed to maximize time on site narrow the range of information users encounter, and whether Section 230 immunity should apply when platforms actively amplify or demote content through recommendation systems. Witnesses urged caution about overbroad regulatory steps but expressed openness to discussion.
Lawmakers signaled they will press these issues further in future hearings and follow-up questions for the record.
