Citizen Portal

Senators hear whistleblower say Meta targeted teens’ emotional states to sell ads

April 9, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Sarah Wynn-Williams told a Senate subcommittee that Facebook/Meta tracked signals from 13- to 17-year-old users, such as deleted selfies and signs of low self-worth, and provided those signals to advertisers. Senators cited the Kids Online Safety Act and earlier whistleblowers as context while pressing for Meta's accountability.

Former Facebook policy director Sarah Wynn-Williams told the Senate Judiciary Subcommittee that Meta used behavioral signals from teenagers to target advertising and optimize engagement, a practice senators said prioritized profit over child safety.

Wynn-Williams described internal research and product-design decisions that identified signals such as when a 13- to 17-year-old “deleted a selfie” and used those signals to serve targeted ads; she said advertisers were given cues to reach users when they were feeling “worthless or helpless.”

Senator Marsha Blackburn, a co-sponsor of the Kids Online Safety Act, told Wynn-Williams the targeting approach was “completely disgusting” and said Meta’s conduct placed revenue ahead of child welfare. Senator Amy Klobuchar and others cited earlier committee bills and past whistleblowers as evidence that platforms have repeatedly prioritized engagement metrics over safety.

Why it matters: Committee members said the testimony underscores regulatory gaps and reinforces previous bipartisan efforts such as the Kids Online Safety Act, which passed the Senate in 2024 but did not become law.

Details cited at the hearing included Wynn-Williams’s account that:

- Advertisers received signals indicating a teen’s emotional state and could serve ads tailored to those moments (example given: deleted selfies used to serve beauty-product ads).
- Internal discussions referenced targeting “13 to 17 year olds” by surfacing indicators such as body-confidence issues and emotional vulnerability.
- Meta executives were aware of the harms and sometimes restricted their own children’s access to company products, while continuing business practices that monetized teen engagement.

Senators said the testimony supported ongoing legislative efforts. Multiple members argued that if firms can rapidly remove or suppress content when motivated to do so (for example, for compliance or to appease foreign governments), they possess the engineering capability to address nonconsensual intimate-image abuse, fentanyl sales, and other online harms when those problems are prioritized.

No formal votes took place during the hearing. Senators requested additional documents and pledged to press for legislative remedies addressing child safety and platform accountability.