Advocates call for state and federal action to protect children online, citing whistleblower testimony and rising harms

October 09, 2025 | National Eagle Forum, Utah Lobbyist / NGO, Utah Legislative Branch, Utah


This article was created by AI, summarizing key points discussed. AI makes mistakes, so for full details and context, please refer to the video of the full meeting. Please report any errors so we can fix them.

A presenter representing a nonprofit organization warned that children face increasing harms online and urged a mix of federal and state measures to reduce risk, citing whistleblower testimony, rising CyberTipline reports and recent state actions.

The presenter said evidence shows a sharp rise in online enticement and sextortion and described recent Senate hearings in which two whistleblowers from Meta (Facebook/Instagram) and other platforms testified. “We’re not making this up. We’re not alarmists,” the presenter said, adding that researchers and public-private groups such as the National Center for Missing and Exploited Children (NCMEC) document increases in harmful online contacts.

The presenter cited several statistics and legal developments during the talk: a reported 323% increase in online enticement of children between 2021 and 2023 (attributed to CyberTipline data); research finding that “more than 1 in 3 minors reported having had an online [sexual] interaction,” with 28% believing the other party was an adult; and a recent appellate decision (NetChoice v. Paxton) that the presenter said recognized a government interest in protecting children online. The presenter also described high-profile legal and legislative responses, including the Take It Down Act and the Kids Online Safety Act, which was said to have passed the Senate by a 91–3 margin but had not been taken up by the House at the time cited.

Discussion in the presentation stressed limits to platform accountability tied to Section 230 of the Communications Decency Act. The presenter called Section 230’s broad liability shield a “blanket” that, in their view, reduces incentives for platforms to test products for safety and to mitigate harms when they arise. “This lack of liability . . . has caused this utter irresponsibility,” the presenter said.

State-level solutions highlighted in the presentation included age-verification laws and device-safety requirements. The presenter credited lawmakers and advocates in Utah and Alabama with passing measures that (as described in the talk) require phone content filters to be on by default and restrict app downloads or in-app purchases by users under 18 without parental consent. The presenter said Apple planned to ship an iPhone update with filters on by default and credited state action for influencing that decision.

The presentation also raised concerns about AI “companion” chatbots and virtual-reality environments, describing testimony that children can form human-like attachments to bots and that interactions in VR can feel physically real to a child. The presenter said whistleblowers testified that platforms did not sufficiently test products prior to launch and that lawyers sometimes advised engineers to avoid documenting known harms because of legal risk.

The presenter closed by urging bipartisan legislative remedies that would create a duty of care for platforms, require safety-by-design testing, and strengthen takedown and human-support requirements for nonconsensual explicit material. They offered to provide fact sheets and model language to state legislators and advocates.

View full meeting

Sponsors

Proudly supported by sponsors who keep Utah articles free in 2025

Excel Chiropractic
Scribe from Workplace AI