
Senate Criminal Justice Committee advances five‑bill package tightening laws on AI‑generated child sexual abuse material and deepfakes

Committee on Criminal Justice · March 4, 2024


AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

The Committee on Criminal Justice advanced five related bills that expand criminal liability for AI‑generated depictions of minors, create civil remedies for nonconsensual AI intimate material, narrow affirmative‑defense carveouts for harmful materials shown to minors, and clarify penalties; invited prosecutors and law‑enforcement witnesses urged drafting changes and additional tools for enforcement.

The Senate Committee on Criminal Justice on March 4, 2024, advanced a package of five bills aimed at addressing the use of artificial intelligence to create sexualized images and videos depicting children and nonconsensual deepfakes. Committee members voted to report each bill favorably to the full Senate after invited testimony from prosecutors and law‑enforcement officials and a lengthy public comment period.

Chairman Flores opened the hearing by saying its purpose was “to take up and consider items related to AI child … abuse material, deep fakes, and affirmative defense,” and described the package as an effort to “protect children from predators, producers, promoters, and enablers of this offensive material.”

The measures include:

- Senate Bill 20 (Flores): creates a new offense (added as proposed Section 43.235) that makes it a state‑jail felony to knowingly possess, access with intent to view, or promote obscene visual material that appears to depict a child under 18, including images generated by AI, cartoons, or animations. The bill includes enhancements for prior convictions that can elevate penalties to third‑ or second‑degree felonies.
- Senate Bill 1621 (Huffman): rewrites portions of the child‑protection statute to explicitly cover computer‑generated depictions and creates definitions for a depiction of a child and a computer‑generated child; distinguishes penalties for real‑child depictions and AI‑generated depictions and provides enhancements for victims under age 10 or for large quantities of material.
- Senate Bill 441 (Hinojosa): a civil‑liability bill that permits victims to sue individuals or entities that produce, solicit, disclose, or promote artificial intimate visual material without the depicted person’s consent; defines terms such as “artificial intimate visual” and “notification application,” requires website operators and payment processors to remove material within 72 hours of a removal request or face liability, and allows confidential filings with a 10‑year statute of limitations measured from discovery or from the victim’s 18th birthday.
- Senate Bill 442 (Hinojosa): the criminal companion, which criminalizes producing or distributing deepfake intimate media without explicit, informed, written consent; classifies nonconsensual production or distribution as a Class A misdemeanor with enhanced penalties for repeat offenders and removes protections based solely on disclaimers of authenticity. The bill includes exceptions for law‑enforcement operations, medical treatment, reporting unlawful activity, and legal proceedings.
- Committee substitute for Senate Bill 412 (Middleton): narrows affirmative defenses that currently permit the sale, distribution, or exhibition of harmful material to minors in certain contexts; the substitute limits affirmative‑defense carveouts primarily to judicial or law‑enforcement officers performing official duties, with the stated aim of closing the broad educational, medical, and scientific defenses that supporters say have been used to justify distribution to minors.

Law‑enforcement officials and prosecutors backed the package but raised drafting and enforcement concerns. Captain Steven Stone of the Texas Department of Public Safety’s cybercrime and digital‑forensics unit told the committee the bills are “a good start” and that the revised definitions let law enforcement act on AI‑generated material without having to identify an actual child: “If a reasonable person looks at that image and says, ‘Yep, that’s a child,’ then we’re able to take actions.”

Prosecutors and district‑attorney representatives urged preserving prosecutorial discretion to charge overlapping offenses. Shannon Edmonds of the Texas District and County Attorneys Association asked the committee to avoid drafting that would force prosecutors to choose “one or the other, but not both,” and requested language allowing multiple theories of prosecution when applicable.

Committee members and witnesses repeatedly raised enforcement and technical‑capacity questions. Several speakers warned that tools to detect whether imagery is AI‑generated are limited and expensive; sheriffs and prosecutors noted rural counties often lack subscription access to sophisticated forensic software and called for resource support. HPD Sergeant Heidi Ruiz described her frontline experience: she said she has worked hundreds of child‑exploitation cases and expressed support for the bills as practical tools for investigators.

The bills also prompted debate about constitutional and educational implications. Opponents and some witnesses cautioned that the proposed narrowing of affirmative defenses could sweep too broadly, potentially chilling legitimate educational, medical or scientific uses of material. Sarah Worbelot testified that obscenity and child‑sexual‑abuse statutes are distinct and that the Miller obscenity test complicates bright‑line rules for classroom or therapeutic contexts. Supporters contended the current carveouts are too broad and have been used to expose minors to harmful material.

Several specific policy details emerged during questioning: Senator Hinojosa cited research that “96% of deepfake videos online are pornographic” and a separate claim that searches for deepfake sexual imagery have increased dramatically in recent years; sponsors noted they would consider drafting fixes and technical clarifications (for example, adding digital‑file language to older statutory references that still use “videotape or film”). The civil bill (SB 441) contains a 72‑hour removal requirement after a removal request and a 10‑year statute of limitations measured from discovery or from the victim’s 18th birthday, and the criminal bills include penalty‑stacking and enhancement language.

Votes at a glance:

- SB 20 (Flores): motion to report favorably carried; roll call recorded 6 ayes, 1 present/not voting; reported to the full Senate with a favorable recommendation.
- SB 1621 (Huffman): motion to report favorably carried; reported to the full Senate with a favorable recommendation.
- Committee substitute for SB 412 (Middleton): committee substitute adopted and reported favorably.
- SB 441 (Hinojosa) and SB 442 (Hinojosa): each reported favorably to the full Senate.

What’s next: committee members said they expect to continue refining statutory language and consider floor amendments addressing drafting concerns raised by prosecutors and constitutional questions raised by public witnesses. The committee stood recessed subject to the call of the chair after recording votes favorable to moving the bills to the full Senate.

Sources and attributions in this article come from committee remarks, witness testimony, and roll‑call votes recorded during the Committee on Criminal Justice hearing as reflected in the official transcript. Direct quotes are attributed to the speakers who made them during the hearing.