Maryland Legal Services Corporation shifts grantee reporting to row-level case data to improve analysis
Summary
Maryland Legal Services Corporation told grantees and stakeholders it has moved from aggregate reporting to row-level case data to reveal uneven outcomes, support equity analysis, and improve funder and grantee decision-making. The rollout involved pilot projects, technology grants, Tableau dashboards, and an Azure data warehouse.
Maryland Legal Services Corporation announced a multi‑year change in how it collects reporting from legal aid grantees, moving from aggregate tables to row‑level case data intended to preserve relationships between fields and allow more detailed geographic and demographic analysis. The change, staff said, is meant to help the funder fulfill statutory responsibilities and to give grantees better information for their own program decisions.
MLSC staff and contractors said the shift took months of planning and began with pilot projects. A February 2024 notice to grantees was followed by pilot testing, the release of a standardized data template and data-mapping guidance, and technology grants to help organizations update their case management systems. By Jan. 28, 2026, presenters reported receiving row‑level submissions from roughly 48 grantees, comprising thousands of case records.
Presenters described a two‑part submission: a data dump (Excel or CSV exports from a grantee’s case management system) plus a data map that translates that vendor’s header names and field codes into MLSC’s standardized values. That mapping approach is intended to minimize repeated effort for grantees: once a map is created, subsequent exports can be translated and ingested automatically.
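The once-per-vendor mapping idea can be sketched in a few lines. This is a hypothetical illustration, not MLSC's actual template or field names: a header map renames a vendor's columns to standardized ones, and a value map translates that vendor's field codes into standardized values, so each new export can be run through the same map.

```python
import csv
import io

# Hypothetical data map for one vendor's case management system.
# All field names and codes here are illustrative assumptions.
HEADER_MAP = {
    "CaseNum": "case_id",       # vendor column -> standardized column
    "CloseCode": "outcome_code",
    "Cnty": "county",
}
VALUE_MAP = {
    # standardized column -> {vendor code: standardized value}
    "outcome_code": {"A": "advice", "B": "brief_service", "R": "representation"},
}

def translate_export(csv_text: str) -> list[dict]:
    """Apply the vendor's data map to a raw CSV export, row by row."""
    rows = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        out = {}
        for vendor_field, value in row.items():
            std_field = HEADER_MAP.get(vendor_field, vendor_field)
            std_value = VALUE_MAP.get(std_field, {}).get(value, value)
            out[std_field] = std_value
        rows.append(out)
    return rows
```

Because the map lives apart from the data, a grantee builds it once and every later export is translated automatically, which is the repeated-effort savings the presenters described.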
The technical flow uses Tableau Cloud for processing and dashboards and will write standardized outputs to an Azure‑hosted data warehouse, presenters said. The public demo used dummy data; the dashboards support county and jurisdictional drilldowns, show primary KPIs (case counts, service level, major benefits), and include an error‑tracking view that highlights unmapped or inconsistent values so grantees can fix them.
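The error-tracking idea reduces to a vocabulary check: scan standardized rows for values that fall outside the expected set for a field and report where they occur. A minimal sketch, with an assumed field name and vocabulary (not MLSC's actual standard):

```python
# Hypothetical expected vocabulary per standardized field.
ALLOWED = {
    "service_level": {"advice", "brief_service", "representation"},
}

def find_unmapped(rows: list[dict]) -> list[dict]:
    """Flag values outside the expected vocabulary, with their location."""
    errors = []
    for i, row in enumerate(rows):
        for field, allowed in ALLOWED.items():
            value = row.get(field)
            if value is not None and value not in allowed:
                errors.append({"row": i, "field": field, "value": value})
    return errors
```

A report built from this output is what a dashboard like the one demoed could surface back to a grantee, pointing to the specific rows and codes that need a mapping fix.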
MLSC emphasized risk‑mitigating steps: a dual reporting year (collecting both old aggregate reports and the new row‑level files while the system stabilizes), multiple rounds of technology grants to support grantees’ system updates, and a program of one‑on‑one support and procedure guides. Staff said they also sought and obtained board approval and one‑time funding to cover the project’s upfront costs.
Presenters cautioned that row‑level data also reveal inconsistencies and divergent interpretations across grantees, requiring expanded glossary definitions and ongoing data quality work. They recommended early and frequent communication, clear feasibility thresholds, and starting pilots among grantees that use similar case management systems to reduce variability.
Next steps include continuing to ingest and validate incoming data, refining maps and dashboards, and returning analysis to grantees. Presenters said they hope disaggregated data will eventually support intersectional analyses (for example, outcomes by race, gender and geography) and inform funder decisions about resource allocation once the data quality is assured.