Board members press for clarity on midyear benchmarks; district outlines use of DIBELS and i‑Ready for targeted instruction

2378989 · February 18, 2025

AI-Generated Content: All content on this page was generated by AI to highlight key points from the meeting. For complete details and context, we recommend watching the full video.

Summary

In a lengthy discussion, district leaders explained how DIBELS and i‑Ready benchmarks and progress monitoring guide interventions, teacher collaboration, and individual student plans; board members sought clarification about different test levels appearing within the same grade.

Acadia Parish School Board members on Feb. 25 pressed district staff for clarity about midyear benchmark assessment results and how the district uses DIBELS and i‑Ready data to guide instruction and interventions.

Board member Mr. Seymour asked whether all students take the “same test” at midyear, having observed that third‑grade students in some classes appeared to be answering different items. District staff explained that the assessments begin with identical items for all students, and that i‑Ready (used in grades 4–8 for ELA and K–8 for math) is an adaptive, computer‑based diagnostic that adjusts item difficulty up or down as a student responds. Staff said DIBELS is required by the State Department of Education for K–3 literacy screening and is also used for literacy measures in grades 4–5.

District staff walked the board through how the assessments are used: the benchmark is administered three times a year, and teachers also conduct frequent progress monitoring (shorter probes or “exit tickets”) every few weeks to measure response to instruction. Staff described how schools use i‑Ready’s correlation to LEAP to set individualized goals; one example cited was Armstrong Middle School, which staff said saw significant LEAP growth after using i‑Ready data to set student goals and instructional plans.

A board member asked how to interpret the percentages reported (for example, a school’s beginning‑of‑year 8% at benchmark rising to 34% at midyear). Staff explained that the percentages reflect the share of students at or above the benchmark level, based on national norms and the test’s adaptive scoring, and that higher‑achieving students can influence school‑level percentages. Staff also described how schools create differentiated groups during RTI (response to intervention) time — with some students receiving acceleration, some receiving core instruction, and others receiving targeted remediation — and how lead teachers, program facilitators, and supervisors provide in‑class coaching and modeling.

Board members commended the level of “boots‑on‑the‑ground” support in classrooms for struggling teachers and students. Staff emphasized that the district relies on progress monitoring, teacher collaboration, and targeted interventions to reduce end‑of‑year surprises and to avoid promoting students who are not ready for the next grade level.

District staff also noted that the Department of Education is developing a numeracy screener that will become required for K–3, at which point the district’s use of i‑Ready for that grade band may change. Staff said principals may request other assessments if i‑Ready is not capturing the data they need.