Lawmakers hear experts: AI data centers create fast, uncertain load that requires new planning tools

California State Assembly (joint hearing: Utilities & Energy; Privacy & Consumer Protection) · January 28, 2026

Summary

Experts at a California State Assembly joint hearing warned that the rapid growth of AI data centers creates planning uncertainty. They urged stochastic, integrated approaches to transmission, generation, and storage planning, along with interim flexible-connection strategies, to avoid stranded assets and reliability risks.

Assembly leaders convened a joint informational hearing to examine the energy implications of AI data center build‑out and to press state agencies for planning and operational solutions.

Dr. Nate Gleason of Lawrence Livermore National Laboratory opened the session with a technical framing: data centers can be built in months but require transmission that may take a decade, producing wide uncertainty for planners. "Data centers represented 4.4% of total U.S. electricity consumption in 2024," he said, adding that estimates vary and that stochastic optimization and co‑optimization of generation, storage and transmission can produce plans robust across many futures.
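The intuition behind stochastic planning can be sketched in a few lines: instead of sizing infrastructure to one "expected" load forecast, sample many plausible futures and pick the plan with the lowest expected cost across all of them. The sketch below is purely illustrative; the load distribution, cost figures, and penalty for unserved load are hypothetical assumptions, not numbers from the hearing.

```python
import random

random.seed(0)

def plan_cost(capacity_mw, load_mw,
              build_cost_per_mw=1.0, shortfall_penalty_per_mw=5.0):
    """Cost of one realized future: pay for everything built, plus a
    steep (hypothetical) penalty for any unserved load."""
    shortfall = max(0.0, load_mw - capacity_mw)
    return capacity_mw * build_cost_per_mw + shortfall * shortfall_penalty_per_mw

# Sample plausible future data-center loads (MW). The wide triangular
# spread stands in for the forecast uncertainty described in testimony.
futures = [random.triangular(2000, 12000, 6000) for _ in range(5000)]

# A deterministic plan sized to the single average future...
deterministic_mw = sum(futures) / len(futures)

# ...versus a stochastic plan: the capacity with the lowest
# *expected* cost across every sampled future.
candidates = range(2000, 12001, 250)
stochastic_mw = min(
    candidates,
    key=lambda c: sum(plan_cost(c, f) for f in futures) / len(futures),
)

print(f"average-future plan: {deterministic_mw:.0f} MW")
print(f"stochastic plan:     {stochastic_mw} MW")
```

Because the assumed shortfall penalty exceeds the build cost, the stochastic plan builds above the average-future plan: it hedges against the expensive futures rather than splitting the difference.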

Why it matters: California’s planning systems rely on forecasts to schedule expensive transmission and generation projects. If forecasts overestimate future load, ratepayers can be left paying for stranded infrastructure; if forecasts underestimate growth, the state risks price spikes and reliability shortfalls.

Alicia Gutierrez, director of the California Energy Commission’s Energy Assessments Division, described the CEC’s bottom‑up, project‑based forecasting process. She said the CEC translates utility energization requests into expected maximum demand using utilization factors, project confidence levels and hourly load profiles drawn from interval meter data. Gutierrez noted that data centers currently account for about 1,000 megawatts statewide—roughly 2% of CAISO peak demand—and under the planning forecast could rise to about 9% by 2040; a local reliability scenario raises that share further.
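The project-based translation Gutierrez described — energization request to expected peak demand — amounts to weighting each request by how much of it will actually be used and how likely it is to materialize. A minimal sketch, with entirely hypothetical projects and factors:

```python
# Each pending energization request contributes to expected peak as:
#   requested MW x utilization factor x confidence the project materializes.
# All names and numbers below are hypothetical illustrations.
requests = [
    # (name, requested_mw, utilization_factor, confidence)
    ("campus_a", 300, 0.75, 0.9),   # under construction: high confidence
    ("campus_b", 500, 0.70, 0.5),   # permitted: medium confidence
    ("campus_c", 800, 0.65, 0.2),   # early-stage inquiry: low confidence
]

expected_peak_mw = sum(mw * util * conf for _, mw, util, conf in requests)
print(f"expected incremental peak: {expected_peak_mw:.1f} MW")  # 481.5 MW
```

Note how the largest raw request (800 MW) contributes the least expected demand once its low confidence is applied — which is why simply summing energization requests overstates future load.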

CAISO and utilities described steps they are taking. Neil Miller of the California Independent System Operator said transmission projects already approved and under construction reflect recognition of the new loads, noting four projects representing about $3.1 billion in capital. PG&E and Silicon Valley Power described cluster study approaches—studying many interconnection requests together to reduce redundant upgrades and speed energization. Silicon Valley Power said it manages unusually high energy density in its small service territory (about 37 megawatts per square mile) and uses substation agreements that ramp service to avoid sudden load shocks.

Panelists emphasized two classes of near‑term tools: (1) flexible service connections and interim energization mechanisms that allow a customer to take reduced service while priority transmission work is completed, and (2) operational flexibility programs—market or contractual incentives for data centers to curtail or shift compute loads during high‑stress periods. Agencies warned those two concepts are distinct but complementary: one is about the physical wires and near‑term energization, the other is about day‑to‑day system balancing and market participation.
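The distinction between the two tools above can be made concrete: a flexible connection is a hard physical cap while transmission is built, while operational flexibility is a conditional response to system stress. A minimal sketch, assuming a hypothetical price-triggered curtailment contract (the cap, trigger price, and curtailment fraction are invented for illustration):

```python
def allowed_load_mw(hour_price, requested_mw,
                    interim_cap_mw=150.0,
                    stress_price=200.0, curtail_fraction=0.5):
    """Combine the two distinct limits from the hearing, with
    hypothetical parameters:
      1) a flexible-connection cap while priority transmission work
         is completed, and
      2) a contractual curtailment signal during high-stress hours."""
    # (1) Physical interim limit: never exceed the connection cap.
    load = min(requested_mw, interim_cap_mw)
    # (2) Market/contract signal: shed a share of load when prices spike.
    if hour_price >= stress_price:
        load *= 1.0 - curtail_fraction
    return load

normal_hour = allowed_load_mw(hour_price=50, requested_mw=200)   # 150.0 MW
stress_hour = allowed_load_mw(hour_price=600, requested_mw=200)  # 75.0 MW
print(normal_hour, stress_hour)
```

The cap applies every hour; the curtailment only binds during stress hours — mirroring the panelists' point that one tool is about the wires and the other about day-to-day balancing.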

What’s next: The CPUC indicated it has issued a proposed decision standardizing flexible service connections and expects a vote next month; CAISO plans a California‑centric stakeholder issue paper; the CEC continues to refine demand forecasts through its integrated energy policy report process. Lawmakers pressed agencies for better out‑year visibility because transmission takes many years to build and mis‑timing can be costly.