Dominion Energy in Virginia paused new data center interconnection approvals earlier this month, telling customers that the queue now exceeds what the grid can physically deliver over the next five years. ERCOT in Texas is signaling similar constraints in the Dallas-Fort Worth corridor. Arizona Public Service has pushed out timelines on major projects in the West Valley. Georgia Power is warning that its load forecast has nearly doubled in the last 24 months. Four different utilities in four different states are telling the same story about the same customer class.
That customer class is AI training and inference facilities. A single hyperscale data center built for frontier model training can pull between 250 and 500 megawatts of continuous load. For reference, a medium-sized city of 500,000 residents typically draws around 800 megawatts at peak. We are now building facilities that represent half a city's worth of electricity demand on a single campus, and the hyperscalers want to build dozens more of them per year.
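The comparison is easy to check with the article's own figures. A quick back-of-envelope sketch (using the 250-500 MW campus range and the 800 MW city peak cited above):

```python
# Back-of-envelope: hyperscale AI campus load vs. city load.
# Figures from the article: a frontier-training campus draws
# 250-500 MW continuously; a city of ~500,000 residents peaks near 800 MW.

campus_mw_low, campus_mw_high = 250, 500
city_peak_mw = 800

for campus_mw in (campus_mw_low, campus_mw_high):
    share = campus_mw / city_peak_mw
    print(f"{campus_mw} MW campus = {share:.0%} of the city's peak demand")
```

At the top of that range, a single campus sits at roughly five-eighths of the city's entire peak, and unlike a city, it holds that draw around the clock.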
The numbers being published by grid operators are striking. PJM, which covers 65 million people across the mid-Atlantic, raised its 15-year load forecast by 40 percent last year. ERCOT revised its five-year peak demand projection up by 62 gigawatts, roughly the equivalent of adding the entire state of Florida to the Texas grid. MISO, serving the Midwest, moved its 2030 demand forecast up by 24 percent. These are not marginal adjustments. They are the kind of revisions that force utilities to rebuild their capital plans from scratch.
The physics problem is straightforward. You cannot conjure electricity. You have to generate it, transmit it, and distribute it, and each of those layers takes years to build. A new natural gas combined-cycle plant takes four to six years from planning to commercial operation. A new nuclear plant takes 10 to 15. High-voltage transmission lines across state boundaries take even longer because they require regulatory approval in every jurisdiction they cross. Data centers, on the other hand, can be built in 18 to 24 months. The supply side cannot keep up with the demand side.
The workaround most utilities have used over the last decade has been renewables plus battery storage. That is still happening at scale, but it is not enough to meet continuous AI load on its own. Training workloads run at close to full utilization 24 hours a day. They do not match the intermittent output profile of solar and wind. Battery storage is getting cheaper, but four-hour batteries cannot firm overnight demand from a facility that never turns off. You need firm power. That means gas, nuclear, geothermal, or grid-scale storage that does not exist at the required duration yet.
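The duration mismatch is easy to quantify. A rough sketch, using illustrative numbers that are assumptions rather than figures from any specific project: a campus drawing a constant 300 MW through a 12-hour solar-free window, firmed by a four-hour battery rated to carry the full load.

```python
# Illustrative firming math. All inputs are assumed for the sketch,
# not taken from a real project.

load_mw = 300            # constant facility draw (assumed)
dark_hours = 12          # overnight window with no solar output (assumed)
battery_power_mw = 300   # battery rated to carry the full load
battery_duration_h = 4   # a "four-hour battery"

energy_needed_mwh = load_mw * dark_hours                   # 3,600 MWh
energy_stored_mwh = battery_power_mw * battery_duration_h  # 1,200 MWh

coverage = energy_stored_mwh / energy_needed_mwh
print(f"Battery covers {coverage:.0%} of the overnight energy need")
```

A battery big enough to carry the whole load still empties a third of the way through the night. The remaining hours have to come from firm generation, which is the gap the rest of this piece is about.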
This is why you are hearing so much about nuclear right now. Amazon signed a power purchase agreement with a nuclear plant in Pennsylvania last year. Microsoft is backing the restart of Three Mile Island Unit 1 for data center load. Meta has a deal with Entergy for up to 4 gigawatts. Google has a portfolio of small modular reactor contracts that will start coming online in 2029 or 2030. Hyperscalers have essentially decided that nuclear is the only way to guarantee firm clean power at the scale they need, and they are willing to pay premiums that utilities could not get from any other customer class.
The tension is that small modular reactors are not operating commercially yet in the United States. The first utility-scale SMR project in Idaho was canceled in 2023 due to cost overruns. The designs coming from NuScale, X-energy, and TerraPower are making progress, but most are still in licensing. Commercial operations at scale are at least five years out. That leaves a gap between now and then where data centers need power that does not exist yet.
The short-term answer for a lot of projects is behind-the-meter natural gas. Data center developers are building their own gas turbines on site rather than waiting for the utility to deliver power through the grid. That works from a physics standpoint, but it complicates emissions commitments that most of these companies made over the last decade. It also raises questions about who pays for the stranded gas infrastructure when nuclear eventually comes online.
State policy is catching up. Virginia created a special commission in February to recommend reforms to data center siting rules. Texas is considering legislation that would require large loads to bring their own generation capacity before getting interconnection approval. Arizona is debating a utility rate case that would pass data center infrastructure costs directly to those customers rather than spreading them across residential ratepayers.
That last point is worth watching. The political pressure is building in communities where residents see electric bills rising while hyperscalers build facilities that use more power than the surrounding city. Some of the most politically fraught decisions local leaders will make in the next two years will involve balancing the economic development dollars from data center projects against the grid reliability and affordability concerns of ratepayers who did not ask for any of this.
The AI buildout is real. The power problem underneath it is also real, and it is not going to be solved by a press release. The companies that figure out how to secure firm, affordable, clean power over the next five years will have a durable advantage. The ones that do not will find their timelines slipping whether they want them to or not.