Citizen Portal

Connecticut House advances broad AI responsibility and safety bill after hours-long debate

Connecticut House of Representatives · May 1, 2026
AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

The Connecticut House debated and passed a far-reaching Artificial Intelligence Responsibility and Transparency Act that sets consumer disclosure requirements, whistleblower protections at high-capacity developers, rules for AI companions interacting with minors, and a regulatory sandbox. Supporters said it balances safety and innovation; critics urged clearer detail on costs and enforcement.

Representative Lamar, the bill's floor sponsor, told lawmakers the measure "creates some common-sense rules for artificial intelligence" to protect consumers while allowing innovation. The House's extended floor debate centered on how the state should regulate large AI developers, guard minors, and adapt workforce training.

Proponents framed the bill as a mix of safeguards and economic development. Representative Turco, the bill's vice chair, described provisions that would give the attorney general authority to enforce disclosure and safety standards and require large platforms to embed provenance data in synthetic content. He told the chamber the law would protect children and vulnerable users, require transparency when AI is used in hiring, and create a regulatory sandbox and a Connecticut AI Academy to support training and research.

Opponents pressed for details about enforcement, a private right of action, and the long-term budget impact. Representative Retigliano asked whether employer and small-business burdens had been addressed and sought clarity on how age verification and compliance would be enforced; sponsors said the bill includes a 60-day cure period for small businesses and gives the attorney general primary enforcement authority while preserving limited private rights of action for harm to minors.

Members also debated sections on technical procurement targets and pilot programs for independent verification of algorithms. Supporters argued the bill fills a gap while federal rules remain in flux; critics warned the state should avoid overreach that could slow innovation or impose costs without clear benefits.

The measure includes whistleblower protections for employees of "frontier" AI developers, disclosure rules for subscription AI products, rules requiring chatbots that act like "companions" to identify themselves to users (with more frequent notices for minors), and requirements that large providers maintain anonymous internal reporting channels for potential catastrophic risks. It also tasks agencies with building a regulatory sandbox and funding an AI Academy through higher-education partners.

The House adopted the bill after debate and amendments. Sponsors said agencies and regulators will write implementation rules, with the attorney general handling enforcement actions. Supporters called the law a starting framework that can be updated as technology and markets evolve; opponents said they will watch rulemaking and budgetary impacts closely.