Committee hears bill to require provenance tools, disclosures for certain generative AI systems
Summary
Staff told the House Appropriations Committee that second-substitute HB 1170 would require covered generative-AI providers to offer provenance-detection tools and to include latent or manifest disclosures in AI-generated images, audio, and video. The Office of the Attorney General would enforce the measure under the Consumer Protection Act, with estimated enforcement costs ranging from tens to hundreds of thousands of dollars in the early years.
Staff briefed the House Appropriations Committee on second-substitute House Bill 1170, a measure aimed at increasing transparency for certain generative artificial-intelligence systems. Emily Poole, staff to the Technology, Economic Development and Veterans Committee, said the bill would require covered providers — defined in the bill by computing power and revenue thresholds — to make a provenance-detection tool available to users and to include both latent disclosures and optional manifest disclosures in AI-generated output.
"The tool must allow a user to assess whether content was created or altered by the covered provider system," Poole said during the staff briefing. Exemptions in the bill would exclude video games and other interactive experiences from the disclosure rules.
Jessica Van Horn, committee staff, outlined the fiscal and enforcement picture: the bill delegates enforcement to the Office of the Attorney General under the Consumer Protection Act and staff provided a range of estimated costs. "A reasonable range of costs for the bill would be between $66,000 to $311,000 in fiscal year 2028 and $130,000 to $623,000 in fiscal year 2029," Van Horn said, noting an earlier estimate for intensive early enforcement of about $623,000 per fiscal year if industry noncompliance were high.
Representative Penner asked whether the measure would require disclosures to be embedded in AI-generated code. Staff clarified that the bill "does not apply to text; it applies to image, video, or audio content."
Industry testimony during the public hearing raised concerns about the bill's enforcement language and definitions. Amy Harris, director of government affairs for the Washington Technology Industry Association, said WTIA "appreciate[s] the intent of this bill and share[s] the goal of improving trust and accountability, but we do have a few priority areas," including returning to prior enforcement frameworks and reverting to earlier, more widely used AI definitions to reduce compliance uncertainty.
The committee took no action on HB 1170 during the hearing. After the public hearing concluded, the committee moved on to other bills. The staff fiscal notes and the committee record are available in the electronic bill book for members seeking more detail.
