MONTPELIER — The Senate Government Operations Committee on Thursday reviewed S.23, legislation that would compel disclosure of AI-generated “synthetic” media used to depict political candidates or influence voters in the period before an election.
Representative Shay Waters Evans, the ranking House sponsor, told the Senate panel that the House edits were designed to clarify — not weaken — the bill and to make enforcement more straightforward for the attorney general’s office. “We don’t wanna stifle free speech. We just wanna make sure that people are aware of what they’re consuming,” Waters Evans said as she summarized changes that narrow the definition to media depicting political candidates and add distribution to the bill’s scope.
The committee spent much of its discussion on three core tensions: how broadly to define “synthetic media,” whether liability should require actual knowledge or a reasonable-person “should have known” standard, and how the bill should treat distributors and sharers. Rick Seigel of the Office of Legislative Counsel cautioned that the Senate penalty language used a “knowing” standard, and that widening the mental-state requirement to a “should have known” reasonable-person test could make the statute broader and more vulnerable to First Amendment challenges. “It’s also a stronger First Amendment argument as well that you knew what you were doing, and you violated that,” Seigel said.
Committee members who worked on the bill said they aimed to keep the measure narrowly focused on election integrity — for example, to prevent last-minute deepfakes that purport to show a candidate saying or doing something they did not. Waters Evans and others described a disclosure requirement that would run during the pre-election window; committee members recalled that the earlier committee language set a 90‑day disclosure window rather than 30 days.
The House revisions also removed the prior prescriptive font-size language for visual disclaimers and replaced it with a requirement that disclosures be “easily readable by the average viewer.” Some senators raised accessibility concerns, urging explicit universal‑design standards and noting that audio disclosures should be considered for people with vision or hearing disabilities.
The bill adds the word “distributes” to the prohibition so that people who pass along deceptive synthetic media could fall within the statute’s scope. Counsel and several senators flagged definitional questions about when sharing becomes distribution and whether liability could sweep in ordinary social‑media users; sponsors said the change was intended to close a loophole where an original producer could dodge responsibility by giving content to a third party.
The House draft also adds civil‑investigation authority for the attorney general modeled on existing campaign‑finance language. The committee walked through the civil penalty structure in the House language: first violations capped at the amounts set in the bill, with higher caps for intentional incidents that cause violence and for repeat offenses. Members said the caps were unchanged from the House draft and urged staff to confirm the exact figures in the statutory text.
Witnesses representing industry and advocacy groups spoke in favor of and against the narrower definition. Dylan Zwicky of Leonine, speaking for the New England Connectivity and Telecommunications Association, said the association remained supportive of the bill as amended and welcomed clarifications that would aid enforcement. “We would support the bill as it came out of this committee and the senate and remain supportive with changes in the house,” Zwicky said. Broadcasters said they supported the House changes; consumer and voter‑advocacy witnesses expressed concern that narrowing the definition to political candidates could leave issue campaigns, surrogates or fabricated endorsements outside the statute’s reach.
Several senators illustrated the gap with hypotheticals — a celebrity endorsement created by AI or a synthetic video of a noncandidate public figure falsely praising a candidate — and asked whether the bill, as narrowed, would cover those tactics. Sponsors argued the narrower approach concentrates on two defensible government interests — protecting candidates from targeted defamation and protecting election integrity — and is more likely to survive judicial review than a broader content‑based restriction.
The committee did not take a final vote. Members agreed to reconvene at 3 p.m. with counsel and the House sponsor to attempt to reconcile the language on the definition, distribution, accessibility and the mental‑state standard before moving the bill toward the floor.
The committee went on a 10‑minute break after the S.23 discussion; no formal concurrence or amendment was reported at the meeting’s close.