A participant with policy research experience briefed the board on artificial intelligence tools and the privacy risks they pose for schools, urging caution as the district evaluates classroom AI products. He explained that some companies market education versions of AI software with promises not to use submitted data to train their public models, but that such assurances do not mean the data are private or encrypted. Engineers at large‑model companies can access data that users send into these systems, he said, and school communities should not assume that text entered into educational chatbots is protected the way end‑to‑end encrypted messaging is. Board members and staff discussed the need to review vendor contracts and to seek tools with appropriate data‑use guarantees for students and teachers. Staff said the AI topic will remain under review, flagging the need to consider model selection, data handling, teacher training, and subscription‑level differences that may affect access to higher‑capacity tools, and added that they will continue to evaluate AI products and their privacy guarantees before recommending classroom adoption.