A single AI training campus can cause a city's power grid to swing by hundreds of megawatts in just a few seconds.
Massive AI training jobs break the fundamental load-diversity assumption that power grids have relied on for a century. When thousands of GPUs start or stop at the same time, they create huge, synchronized power swings that can destabilize the local grid. At this scale, compute is no longer just an IT problem but a physical threat to the stability of the electrical grid, and preventing city-wide blackouts will require co-designing data centers and power grids. This shift means that the future of AI will be limited by the physics of power lines rather than the speed of chips.
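The statistical intuition behind load diversity can be illustrated with a toy simulation (all numbers here are illustrative, not from the paper): the fluctuations of many independent consumers partially cancel, so the aggregate swing shrinks relative to the mean, whereas a single synchronized job swings the entire fleet at once.

```python
import random
import statistics

random.seed(0)
N = 2_000   # number of consumers (illustrative)
T = 200     # time steps

# Diverse load: each consumer independently draws 0-2 kW per step,
# so per-consumer fluctuations largely cancel in the aggregate.
diverse = [sum(random.uniform(0, 2) for _ in range(N)) for _ in range(T)]

# Synchronized load: one "training job" switches the whole fleet
# between idle and full power together.
sync = [N * random.choice([0.0, 2.0]) for _ in range(T)]

def rel_swing(series):
    """Peak-to-trough swing as a fraction of the mean demand."""
    return (max(series) - min(series)) / statistics.mean(series)

print(f"uncorrelated aggregate swing: {rel_swing(diverse):.1%}")
print(f"synchronized aggregate swing: {rel_swing(sync):.1%}")
```

With independent consumers the aggregate's standard deviation grows only as the square root of N, so the relative swing stays small; the synchronized fleet swings by roughly twice its mean demand, which is the failure mode the paper is concerned with.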
From Barrier to Bridge: The Case for AI Data Center/Power Grid Co-Design
arXiv · 2605.03090
For over a century, the electric grid has relied on a single statistical assumption: load diversity, the principle that the uncorrelated demands of millions of small consumers produce a smooth, predictable aggregate. AI training data centers break that assumption. A single hyperscale training campus can draw power comparable to a mid-sized city, driven by one tightly synchronized job whose demand swings by hundreds of megawatts in seconds. This paper argues that the resulting entanglement