Committee hears bill to limit AI-driven discipline and school surveillance

House Education Committee · February 18, 2026

Summary

Substitute SB 5956 would bar automated decision systems from serving as the sole basis for student-discipline actions, prohibit many biometric inferences, and restrict certain facial-recognition uses; sponsors and advocates framed it as protecting students and ensuring human oversight.

A bill seeking new guardrails for artificial intelligence in K-12 schools drew extensive testimony in the House Education Committee. Substitute SB 5956 would prohibit school districts, charter schools, and state-tribal education compact schools from using an automated decision system as the sole or determinative basis for student-discipline decisions, and would restrict biometric inferences and many uses of facial-recognition services.

Megan Wargacki, counsel to the committee, summarized the bill's provisions: automated systems could not be the lone basis for emergency removal, suspension, expulsion, law-enforcement referrals, or assignment to alternative settings; biometric data could not be used to infer emotional or mental states or other sensitive characteristics; and OSPI would update guidance on human-centered AI and automated decision systems.

Sen. T'wina Nobles, the prime sponsor, said the bill responds to incidents nationwide in which technology misidentified objects as weapons and led to law-enforcement involvement. Nobles said the bill aims to protect students, prevent discriminatory outcomes for students of color and students with disabilities, and keep credentialed adults responsible for disciplinary decisions.

Student and advocacy testimony backed the measure. Elias Ng, a high school senior, described an essay that an AI detector flagged as 30% AI-generated and said such tools can harm nonnative English speakers; Derek Harris of the Black Education Strategy Roundtable urged rules that keep accountability with law and human review. Industry witnesses, including the Security Industry Association, said they did not oppose limiting AI for discipline but urged narrowly written exceptions so schools retain limited, clearly defined safety uses, such as locating missing students.

Committee members asked staff whether any Washington districts currently use these technologies for discipline; staff said they had not investigated deployments in detail. Members discussed the bill's scope, including whether it would apply to grading tools or other classroom uses. Staff clarified that the bill is targeted at student-discipline contexts such as suspension, expulsion, and emergency removal, but may also apply to risk scores tied to misconduct.

What happens next: Committee members signaled a need to tighten language around narrow safety exceptions; no formal vote occurred during the hearing.