Arizona committee backs tougher law to cover AI-generated child sexual images

March 5, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

The Senate Judiciary and Elections Committee voted 7-0 for House Bill 2678 after prosecutors, psychologists and law enforcement officers testified that AI-generated images indistinguishable from those of real minors are being used to exploit children and groom victims; the ACLU raised free-speech and overbreadth concerns.

House Bill 2678, which would expand Arizona's child exploitation statute to criminalize computer-generated or computer-modified sexual images that are "indistinguishable" from an actual minor, received a do-pass recommendation from the Senate Judiciary and Elections Committee on Feb. 19, 2025. The committee vote was 7-0 in favor.

Proponents told the committee the bill responds to advances in artificial intelligence that can produce photorealistic images and videos of children, which are then used to extort or groom victims. "These tools are being used in a variety of ways to exploit children," Rebecca Baker, legislative liaison for the Maricopa County Attorney's Office, told the committee. She also submitted testimony from the National Center for Missing and Exploited Children into the record. Dr. Tina Garvey, a forensic and clinical psychologist who treats victims and studies offenders, described cases in which AI images were created from online photos of real children and then used for sextortion and grooming.

Pinal County investigators described operational impacts. Jim Hurd of the Pinal County Sheriff’s Office said detectives sometimes spend significant investigatory time trying to locate a child who appears in an image only to discover the image was AI‑generated. Retired detective Randall Snyder cited FBI and nonprofit analyses showing steep increases in AI‑linked sextortion and child sexual exploitation reports.

Opponents urged narrowing the bill's language to avoid criminalizing purely fictional or artistic material. Marilyn Rodriguez of Creosote Partners, speaking for the ACLU of Arizona, said HB 2678's current wording could sweep in depictions that do not portray any real person, and pointed to existing Arizona statutes that already prohibit unlawful disclosure of identifiable images of a person. She recommended language that explicitly limits the offense to depictions of an actual, identifiable minor.

The committee chair moved HB 2678 for a do-pass recommendation; the roll call produced seven ayes and no nays. Several senators used the explanation-of-vote time to note constitutional concerns and the potential for litigation, and asked for continued stakeholder work before the bill reaches the floor.

Why it matters: Witnesses told the committee that AI tools are enabling new forms of child sexual exploitation that are already producing harm to minors, from sextortion and grooming to reputational and psychological damage. Supporters said the statute must be updated to give prosecutors tools to pursue cases where the image appears to show a real child even if it was produced or altered by software.

What's next: HB 2678 now moves toward floor consideration; committee members and testifiers said they expect additional drafting and stakeholder meetings before final votes.

Speakers quoted or referenced in this article are those who addressed HB 2678 at the committee hearing and appear in the record.