Senate Approves Bill to Regulate AI 'Companion' Chatbots for Minors, Adds Emergency Intervention Requirements
Summary
SB 796 requires AI operators to disclose non‑human status, provide emergency interventions when minors are at imminent risk, and implement parental consent modes; the Senate passed the bill nearly unanimously (Ayes 39, Noes 1).
The Senate passed SB 796 on Feb. 17, 2026, establishing new restrictions and consumer‑protection requirements for AI "companion" chatbots when interacting with minors. The sponsor, the senator from Stafford, explained the bill’s provisions after a negotiated floor amendment.
Key features include a rule that chatbots may not materially represent themselves as human, a 24‑hour emergency intervention requirement when an operator has actual knowledge a minor is at imminent risk, and limited‑access modes and parental‑consent mechanisms for child users. The sponsor said the bill addresses documented cases in which minors have formed unhealthy emotional dependencies on chatbots.
The senator from Western Prince William supported the intent but cautioned against vague statutory language that could chill legitimate content. "We do not want chat bots trying to encourage kids to kill themselves," the senator said; the sponsor and colleagues worked to tighten definitions to avoid unintended censorship of music or artistic discussion.
The Senate recorded a final vote of Ayes 39, Noes 1. Sponsors said the bill balances child safety with free expression while establishing enforcement and training duties for relevant state agencies.
Next steps: The bill moves to the House; implementing agencies will develop enforcement and technical guidance if the legislation is enacted.