Witnesses and experts urge limits on deepfakes and disclosure rules for election media
Summary
Supporters told the Government Administration and Elections Committee HB 5342 would deter deceptive synthetic media that can mislead voters and target election workers; opponents warned of First Amendment and broadcaster-liability risks, and lawmakers pressed witnesses on who should enforce violations.
Secretary of the State Stephanie Thomas told the committee she supports measures in HB 5342 to address deceptive synthetic media in an election context, saying she “continues to support” policies to protect voters and election workers and pointing to examples from other states where fabricated video and audio falsely depicted officials.
Supporters including the League of Women Voters and Public Citizen described how AI-generated audio and video can be persuasive and spread faster than fact-checkers can respond. Jennifer Dayton of the League said synthetic media “is already shaping how voters encounter information,” urging mandatory disclaimers so people know when content is inauthentic. Ilana Beller of Public Citizen told the committee that deepfakes “fabricat[e] content” that can impersonate officials, and argued distributors who knowingly spread such materials should be liable while ordinary re-posters should not.
Committee members focused their questions on enforcement and scope. Senator Sampson asked whether state agencies, the State Elections Enforcement Commission, or the courts should adjudicate complaints; Thomas said her office would not investigate such cases, suggesting that remedies lie with the courts and that questions of policy design be referred to committee leadership. Public commenters and several witnesses said most state laws penalize distributors who knowingly circulate material intended to affect an election, because creators are often hard to identify.
Broadcasters and free-speech advocates urged narrower drafting. Stephanie Pearl of the Connecticut Broadcasters Association said stations “do not and cannot take cognizance of authenticity” and urged that liability hinge on advertiser or producer disclosures rather than falling on distributors that simply air paid content. John Coleman of the Foundation for Individual Rights and Expression warned that similar laws have raised First Amendment concerns in other states, citing court rulings in California and Hawaii.
The hearing highlighted the tradeoffs lawmakers must weigh: supporters emphasized deterrence and protection for voters and election workers, while industry groups and free-speech advocates flagged enforcement mechanics and constitutional risks. The committee posed detailed drafting and enforcement questions, and witnesses offered to provide model language and lists of state examples for lawmakers to use as they refine HB 5342.
