Citizen Portal

District pilots show practical gains; witnesses push states and feds to fund teacher training and vetting

April 1, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

Superintendents and technologists told a House subcommittee that local pilots—some using district‑hosted AI models—have improved grading efficiency and individualized plans, but scaling requires professional development, cybersecurity support and vetted tools.

School leaders testifying at a House Education and Labor subcommittee hearing described district pilots that used AI agents and local hosting to protect student data while improving educator efficiency and individualized supports.

Chris Chisholm, superintendent of the Pearl Public School District in Mississippi, described creating a local AI agent trained on the district’s writing rubric that helped a veteran English teacher reduce nightly grading time and stay in the classroom. "We created an agent for her. We uploaded the state writing rubric ... and we created an agent for her to use in the classroom," Chisholm said, adding that the district hosts its model on an internal server to avoid FERPA concerns.

Witnesses stressed that teacher readiness and professional development are central to realizing AI’s benefits. Erin Moe described the EdSafe AI Alliance’s SAFE framework—safety, accountability, fairness and transparency—and urged federal and state support for sustained professional development. Dr. Sid Dobrin emphasized AI literacy that explains how models work so teachers and students can apply critical thinking and keep “humans in the loop.”

Panelists discussed costs and implementation models. Chisholm said a district‑hosted server is expensive but can be more cost‑effective than paying per‑use fees for commercial APIs at scale; other witnesses encouraged public funding to help under‑resourced districts. Several members and witnesses recommended state‑level assurance labs to vet vendors and share research about which modalities and deployments (text tutors, immersive experiences, localized models) are most effective.

Witnesses also highlighted special‑education use cases: Chisholm and Dr. Rafalvaire described using assessment data to generate individualized plans and screeners; early research (cited from county pilots) suggested AI‑assisted tutoring benefited tutors at multiple skill levels, with disproportionate gains for lower‑ and middle‑performing tutors.

Committee members asked practical questions about privacy safeguards, the need to scrub personally identifiable information before using commercial models, and how districts can prioritize classroom activities that are less vulnerable to off‑site AI misuse (for example, moving more assessment and project work into supervised class time). No formal policy was decided at the hearing; witnesses and members urged funding for teacher training, interoperability and cybersecurity supports to scale promising pilots equitably.
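The PII‑scrubbing step witnesses described can be sketched in a few lines. This is a minimal, illustrative example, not any district's actual pipeline: the pattern set, labels, and function name are assumptions, and a production scrubber for FERPA‑covered data would need far broader coverage (names, student IDs, addresses) than a handful of regexes.

```python
import re

# Illustrative patterns only -- a real scrubber would need many more
# (names, student IDs, addresses) and careful review before use.
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def scrub_pii(text: str) -> str:
    """Replace matched PII with bracketed placeholders before the text
    is sent to an external model API."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# Example: scrub_pii("Reach Jane at jane.doe@school.org or 601-555-0142.")
# -> "Reach Jane at [EMAIL] or [PHONE]."
```

District‑hosted models like the one Chisholm described sidestep this step entirely, since student data never leaves the internal server.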