Experts and crisis-line officials urge Oregon to tighten rules on AI "companions" for youth
Summary
Researchers, PTA advocates and crisis-line leaders told the Senate Committee on Early Childhood Behavioral Health that AI companion chatbots can form unhealthy attachments with children, fail to respond appropriately in crisis, and exploit personal data; witnesses urged disclosures, crisis-referral protocols, anti-manipulation rules and legal accountability in SB 1546.
PORTLAND, Ore. — Researchers, consumer advocates and suicide-prevention leaders told the Senate Committee on Early Childhood Behavioral Health on Feb. 3 that artificial-intelligence "companion" chatbots pose new and urgent risks to children and adolescents and urged lawmakers to approve guardrails in Senate Bill 1546.
At an informational meeting called by Chair Lisa Reynolds, witnesses summarized scientific and operational evidence and recommended provisions the bill would incorporate: mandatory disclosure when users are conversing with bots, crisis-referral protocols, restrictions on anthropomorphic or therapeutic claims, limits on manipulative engagement designs and legal accountability to enforce those protections.
Why it matters: Multiple experts said companion-style chatbots not only compete for attention but can engage young people's attachment systems — the developmental pathways through which children learn empathy, frustration tolerance and emotion regulation. Those effects, they said, can produce lasting harm, amplify loneliness and, in some tragic cases, contribute to suicidal behavior.
Dr. Mitch Prinstein, senior science advisor at the American…