Dexter schools describe locally hosted AI chatbot, opt‑out option and classroom safeguards

Dexter Community School District Board of Education · November 5, 2025

Get AI-powered insights, summaries, and transcripts

Subscribe
AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video. so we can fix them.

Summary

District staff presented an in‑house AI chatbot (DD/Didi) hosted on school servers, explained monitoring and disclosure practices, said families can opt students out, and described teacher exemplars and tools to detect AI use before submission.

District technology and instructional staff presented an update on the Dexter Community School District’s internally hosted AI chatbot (branded in presentation as "DD" or "Didi") and described the district’s approach to student safety, monitoring and classroom use.

Presenters said the district deliberately built a locally hosted model so student inputs do not leave district servers. "We host our own AI tool on our server that's not connected to the Internet," a staff presenter said, noting the district runs the model on local Mac minis at Creekside. The team explained that the internal tool does not train on student inputs, does not link to district databases, and is designed to avoid data leaks.

Staff described safeguards and complementary tools. The district uses Copyleaks to analyze submissions for AI‑generated content and is testing tools (Prompt Boot and Phoenix Arise) to stress‑test prompts and anonymously track query patterns. Presenters said those back‑end tools help staff identify hallucinations and adjust the model before pushing updates. "We're really following it every day to make sure that it's providing some information that's helpful for students, but not doing all of their writing for them," the presenter said.

The rollout and opt‑out policy were emphasized. District staff said DD is available to students in fifth through twelfth grades and that families may opt their child out of access. Presenters said teachers will specify expected AI use levels on assignments (ranging from no AI to AI‑assisted) and that disclosure statements will be used to teach academic practice consistent with higher‑education expectations.

Staff showed classroom examples and teacher prompts that limit output and promote student agency: the tool often asks follow‑up questions instead of providing finished answers, and teachers can provide scaffolds or teacher‑curated prompts for lessons. Presenters noted energy and cost savings from hosting the models locally, and they discussed model updates: the district moved from earlier Llama models to larger open models (cited as GBT variants) while stress‑testing each model before classroom use.

Dr. Liam Furze, an internal contributor the presenters named during the update, was credited with work on a responsible‑use rubric that staff plan to share with teachers; presenters described the rubric as a way to define acceptable AI levels for assessments.

Staff invited ongoing feedback from teachers, students and families and said the district will continue to refine guardrails, run prompt testing, and monitor usage patterns anonymously. The presentation emphasized teaching students how to evaluate outputs and disclose AI assistance rather than simply locking students out of AI tools.