Witnesses urge Vermont to enshrine neural-data protections and require AI disclosure in mental‑health tools

House Committee on Health Care · February 20, 2026

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Two expert witnesses told the House Committee on Health Care that wearable devices and AI could reveal sensitive mental-state information and called for statutory protections, written consent rules and mandated disclosure when generative AI or chatbots are used in patient communications.

Lawmakers heard testimony Tuesday on a bill that would create a statutory framework for "neurological rights" and restrict how companies collect, use and share neural data. Witnesses told the House Committee on Health Care that consumer devices and generative AI used in mental‑health tools can capture uniquely sensitive brain signals and that Vermont could lead national policy.

"The recognition that each individual has rights to mental and neural data privacy and protection from unauthorized neurotechnological manipulation places Vermont at the forefront," said Sean Posowski, a practicing neurologist and medical director at the NeuroRise Foundation, who spoke in support of the bill and offered suggested statutory language. Posowski emphasized the need for written informed consent and called the bill’s ban on "consciousness bypass" — obtaining consent in a way that undermines an individual’s decision‑making capacity — a "profound and necessary safeguard."

Posowski described clinical uses for longitudinal neural data: remote seizure monitoring for people with epilepsy, earlier detection of cognitive decline and potential signals that could indicate suicidal risk in severe depression. He said these advances can "save lives" but warned that, unlike clinical data protected by HIPAA, consumer device data currently faces few guardrails.

Ashley Collins, a human‑rights lawyer and legal advisor to the Neurorights Foundation, told the committee that consumer neurotechnologies are largely unregulated and that a 2024 review her organization prepared found that 29 of 30 consumer neurotechnology companies failed to meet basic privacy benchmarks. "Neural data is capable of revealing intimate information about consumers, including information about individual mental states," Collins said, arguing that state legislation often defines neural data as sensitive personal information to extend privacy protections to device users.

Both witnesses pointed to recent international and U.S. precedents. Collins cited Chile’s constitutional amendment protecting mental integrity and state action in California, Colorado, Montana and Connecticut that has extended consumer privacy protections to neural data. She referenced the UN Human Rights Council’s 2022 resolution and advisory work as additional impetus for state‑level protections.

Committee members asked for concrete examples and next steps. In response, Posowski listed consumer products such as Muse headbands and earbuds with neural sensors, and pointed to prototypes and patents from major technology firms that could bring neural sensing to mass markets. He offered to provide written amendments and his 2024 report for the committee record; Collins also agreed to share the foundation’s review and to recommend additional witnesses.

Lawmakers pressed witnesses on safeguards against corporate misuse, possible discrimination by insurers, and how nontechnical legislators can evaluate complex neurotechnology policy. Collins advised relying on expert testimony, distilled models from other jurisdictions and clear, implementable statutory language.

The committee took no formal vote. Witnesses committed to submit suggested language and relevant reports to committee staff for further drafting and follow‑up.