Board members raise concerns over how state allocated $19 million for summer reading camps and which assessments were used
Summary
The department released school-level allocations for nearly $19 million in summer reading camp funding; several board members and district leaders asked for clearer explanation of which screeners and cut scores were used to calculate each allocation.
The State Department of Education told the State Board it has allocated nearly $19 million in state funds for summer reading camps tied to the Literacy Act and that allocations were calculated from district-reported screeners.
Elizabeth, the department staff member who prepared the summer-reading memo, said the legislature appropriated about $18.05 million (referred to in the meeting as “almost 19,000,000”) for summer reading, and that the department used the fall screening data districts uploaded into Caveon, the statewide data entry tool, to determine how the money would be distributed. “And it expended every, you know, all almost 19,000,000,” Elizabeth said during the briefing.
Why it matters: districts rely on those allocations to plan summer staffing and programming. Multiple board members told the department they had already fielded calls from superintendents and principals seeking clarity on methodology and asked the department to publish a clearer, school-by-school breakdown and to open a channel for questions.
Points of dispute and requested clarifications:
- Uniformity of screeners and cut scores: Several board members and district representatives said they believed different districts may have used different approved screeners or different cut scores when reporting non‑proficient counts, producing allocations that local leaders found inconsistent with local expectations. One board member requested that the state require a single assessment timeframe and common cut scores in future rounds to avoid cross-district variation.
- Which data were used: The department said it used the fall (first) screener that districts uploaded, because it was the most recent common data set available; the department acknowledged that some districts also reported January or mid‑year data in prior years but said the summer allocations were based on the fall numbers entered for K–3.
- Minimum staffing assumptions: The department said allocations assumed a 1:14 teacher/student ratio for camp planning and that it budgeted funding for at least four teacher slots in small schools to cover K–3, even if a school reported small total counts.
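The staffing assumptions described above imply a simple per-school slot count. A minimal sketch of that arithmetic, assuming the 1:14 ratio and four-teacher minimum from the briefing apply per school (the function name and inputs are illustrative, not the department's actual formula):

```python
import math

def camp_teacher_slots(non_proficient_k3: int) -> int:
    """Estimate teacher slots for a school's summer reading camp.

    Hypothetical sketch: applies the 1:14 teacher/student ratio and
    the four-teacher minimum for small schools described in the
    briefing. The real allocation method may differ.
    """
    return max(4, math.ceil(non_proficient_k3 / 14))

# A small school reporting 10 non-proficient K-3 students still gets
# the four-teacher minimum; a school reporting 90 needs ceil(90/14) = 7.
print(camp_teacher_slots(10))  # 4
print(camp_teacher_slots(90))  # 7
```

Under these assumptions, the four-slot floor, not the ratio, drives funding for any school reporting 56 or fewer non-proficient students.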
Department response and next steps: Staff said they would send individual school and system allocation sheets to districts, provide a one‑page explanation of the calculation method, and invite LEAs to contact the department with concerns about how each allocation was calculated. Staff also said the department is working on a per‑pupil supplemental allocation that districts could apply to curricula, local reading coaches, or professional development.
What's next: Board members asked the department to follow up with a clarifying memo, a hotline or contact point for disputes, and an annual roll‑out plan tied to the PowerSchool migration, so future allocations can be calculated directly from the state's student information system rather than from manual uploads.
