State Bar of California builds grantee-centered AI program and explores shared cybersecurity offering

Conference session: Strategy to action — building AI capacity in legal aid · February 3, 2026

Summary

The State Bar of California described the Legal Aid Justice Technology Collaborative, emphasizing baseline technology, a cybersecurity program modeled on Legal Services Corporation resources, an AI champion network, and tools — including a risk framework and maturity model — tailored to legal-aid values.

MJ Joyce Smith, representing the State Bar of California, described a new state-level effort to help legal-aid grantees adopt artificial intelligence safely and equitably. The program — the Legal Aid Justice Technology Collaborative (LAJTC) — will prioritize baseline technology, community leadership and practical tools for responsible AI adoption.

"The goal here is really for us to be grantee- and client-centered," Smith said, framing the initiative as bottom-up, not top-down. She told attendees the State Bar administers IOLTA and state funding and that the Legal Services Trust Fund Commission oversees distribution of those funds.

Smith singled out baseline technology and cybersecurity as prerequisites for safe AI use. The State Bar is "currently exploring a statewide cybersecurity program that would be modeled off of the Legal Services Corporation" offering, she said, and is identifying free and low-cost resources for grantees. The State Bar is also considering changes to state funding rules that would require grantees to take additional cybersecurity steps.

To build leadership and peer learning, LAJTC will create an advisory panel primarily composed of grantees and launch an AI champion network for peer-to-peer mentoring. Smith said the bar will also run monthly tech-and-AI workshops to highlight successful projects across California.

On tools, Smith described two products under development: a risk framework tailored to legal-aid values, and a maturity model that lets programs assess where they fall on an AI-adoption spectrum and identify appropriate next steps. She emphasized that both tools are designed for legal-aid programs rather than private-sector contexts.

Next steps include formalizing the advisory panel, piloting the cybersecurity program, and making the risk framework and maturity model available to grantees. Smith encouraged grantees to engage with the monthly workshops and to consult partner organizations for technical assistance.