The Big Why of Evaluation

The Spectre of Data Centers

Written by Jake Millette | January 9, 2025 at 1:56 PM

 

It seems every day I see news about the exploding growth of data centers, driven by artificial intelligence and cryptocurrency mining. As AI gets integrated into everything (whether we want it or not), the energy required to do the same tasks multiplies. (A ChatGPT query uses about ten times more electricity than a traditional Google search – 2.9 Wh compared to 0.3 Wh, according to EPRI.) The data center boom could transform the electric grid, but there are major questions about the extent to which it will do so and how. Read on to learn more!

 The Numbers Are Scary

The estimated growth in electric demand from data centers likely gives grid planners nightmares. For example, Georgia Power recently announced that its economic development pipeline reaches 36.5 GW by the mid-2030s, 31 GW of which will be data centers. For context, Georgia Power currently has about 14 GW of total generation (72% of which is fossil-fueled). This means it will need to more than double its generation over the next decade to meet the new load from data centers!

Virginia has just under 6 GW of data centers online today, expected to reach 25 GW by 2030. According to Dominion Energy, 94 data centers (>4 GW of capacity) were connected from 2019 to 2024, with an additional 15 expected in 2024.

According to recent EPRI research, new data center interconnection requests exceed current data center connections for all 22 surveyed utilities, and they exceed 50% of system peak demand for 10 of them. Among the respondents, all current data centers are <500 MW and most are <100 MW, but the new requests are often larger: almost half of the utilities have requests for 1,000 MW or more.

Taken together, utility data center load forecasts may exceed 90 GW, or about 10% of the total 2029 forecast load of 942 GW (Grid Strategies). Given the huge infrastructure investments that will be needed to support this growth, do utilities have to provide power to these new data centers, or can they say “no thanks?” Well, because they are monopolies, electric utilities have a legal “duty to serve” which requires them to provide service to all customers in their territory.[1] Additionally, the service must be “non-discriminatory,” although utilities can use different rate structures because different customers impose different costs on the system due to their locations and demand profiles.

 But Are They Real?

It is hard to look at this massive expected load growth and not wonder if the planned data centers will actually get built. The interconnection process is long and unpredictable (even in Texas), so it would not be surprising if some companies are hedging their bets by making interconnection requests in multiple states and only building the data center in the location that gets approved first.

It appears that there are questions even among industry insiders about whether the anticipated growth will actually materialize.[2] According to a Grid Strategies presentation, industry estimates of five-year demand growth from data centers range from 10 GW to 65 GW, but utility estimates are much higher. The presentation notes that Wood Mackenzie sampled several utilities’ announced data center demand, estimating 93 GW total through 2030. By comparison, their bottom-up analysis of tech industry capacity forecast a much lower national five-year AI buildout of just 23 GW.

EPRI research found no consensus among responding utilities on how data center service requests are included in their load forecasts. Some include the full requested capacity but ramp it in over time. Some use a derated capacity value based on weighting criteria. Some do not presently include data center requests in their load forecasts at all. Utilities often consider factors such as whether the project was announced publicly, whether land was acquired, and whether the developer is a mature company.
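To make the derating approach concrete, here is a minimal sketch in Python. The criteria are the ones named in the EPRI survey, but the equal 25% weights are invented for illustration; actual utility weighting schemes vary and are generally not public.

    def derated_capacity_mw(requested_mw: float, announced_publicly: bool,
                            land_acquired: bool, mature_company: bool) -> float:
        """Weight an interconnection request by simple maturity criteria
        before adding it to the load forecast (hypothetical weights)."""
        weight = 0.25  # baseline probability assigned to any request
        weight += 0.25 if announced_publicly else 0.0
        weight += 0.25 if land_acquired else 0.0
        weight += 0.25 if mature_company else 0.0
        return requested_mw * weight

    # A 1,000 MW request from a mature company that has announced the project
    # publicly but not yet acquired land would enter the forecast at 750 MW.
    print(derated_capacity_mw(1000, True, False, True))  # 750.0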

Utilities do have an incentive to overstate their expected load growth, but they also face tremendous risk if they overbuild infrastructure for data centers that never come. Interestingly, in Ohio, AEP is asking new data centers larger than 25 MW to pay for at least 85% of their expected load to cover the cost of the infrastructure needed to bring electricity to their facilities, so other customers aren’t stuck covering the bill if the data center doesn’t materialize.

 Other Issues and Opportunities

 Thermal Energy Networks

Data center efficiency, measured as power usage effectiveness (PUE), or the total data center energy consumption divided by the computing energy consumption, has been fairly stable for the past decade. This is not to say there hasn’t been improvement in the efficiency of servers or data center cooling, but rather that AI is extremely energy intensive and any efficiency improvements are immediately offset by cramming more computing power into the planned footprint (i.e., Jevons paradox).
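If you haven’t run the math, here is a minimal sketch of the PUE calculation in Python. The 150/100 GWh figures are hypothetical, chosen only to show the arithmetic:

    def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
        """Power usage effectiveness: total facility energy / computing energy.
        A PUE of 1.0 would mean every watt goes to computing; anything above
        1.0 is overhead (cooling, power conversion, lighting)."""
        return total_facility_kwh / it_equipment_kwh

    # Hypothetical facility: 150 GWh/year total draw, 100 GWh/year for servers.
    print(pue(150_000_000, 100_000_000))  # 1.5 -> 50% overhead on top of computing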

If efficiency isn’t the silver bullet, then are there other ways to mitigate the huge impacts of the growing energy consumption of data centers? One interesting solution is connecting data centers to thermal energy networks. These networks use a networked loop of pipes to transfer heat between buildings to provide heating, cooling, and hot water. Sources of heat can include water (lakes and rivers), the ground, or energy-intensive buildings like data centers. A pilot project in Massachusetts recently came online (although without a data center), and Washington state recently passed a bill that allows electric and gas utilities to establish thermal energy networks to sell thermal energy.

Due to the high volume of waste heat from data centers (a consequence of their massive cooling loads), data centers are an obvious choice to include in thermal energy networks or district heating. However, few US thermal energy networks are currently connected to data centers, both because natural gas is cheap and because there are few thermal energy networks to connect to in the first place.

 This is too bad, because according to this article, “information technology equipment consumes about 52 % of the total electricity used in [data centers] and converts 97 % of this energy into waste heat. This high volume of waste heat is dissipated by cooling facilities, which consume about 30 %–40 % of the total electricity used in [data centers]. Typical temperatures of [data centers’] waste heat are 25 °C–35 °C for air-cooled systems, 50 °C–60 °C for water-cooled systems, and up to 90 °C for two-phase refrigerant-cooled systems.” This is a relatively low temperature level of waste heat due to the operating temperature limitation of IT equipment, so the heat quality will likely need to be boosted, ideally by heat pumps.[3]
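Putting those percentages together gives a rough sense of the recovery opportunity. A quick back-of-the-envelope sketch, where the 100 MW facility size is a hypothetical:

    def recoverable_waste_heat_mw(facility_load_mw: float,
                                  it_share: float = 0.52,
                                  heat_fraction: float = 0.97) -> float:
        """Estimate the IT waste heat available for recovery, using the article's
        figures: IT equipment draws ~52% of facility electricity and converts
        ~97% of that into heat."""
        return facility_load_mw * it_share * heat_fraction

    # Hypothetical 100 MW data center: roughly 50 MW of low-grade heat that a
    # heat pump could boost to district heating temperatures.
    print(recoverable_waste_heat_mw(100))  # ~50.4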

 Program Attribution

Energy efficiency programs often target new data centers because of their massive energy use and, consequently, immense savings opportunities. However, because data centers use so much energy, efficiency is often already top of mind for facility owners, who do everything possible to limit energy costs. As a result, these programs can suffer from very low net-to-gross ratios due to limited program attribution. We have found that the timeline for data center projects is about three to four times longer than that of a typical large custom energy efficiency project, which results in issues with changing technology baselines and participants’ poor recollection of the program’s influence. If energy efficiency programs are going to continue to target data center projects, evaluators will need to address these issues.
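To illustrate why attribution matters so much, here is one common net-to-gross formulation in Python. Exact definitions vary by jurisdiction, and the project numbers below are hypothetical:

    def net_savings_kwh(gross_kwh: float, free_ridership: float,
                        spillover: float = 0.0) -> float:
        """One common net-to-gross formulation (definitions vary by
        jurisdiction): NTG = 1 - free ridership + spillover."""
        ntg = 1.0 - free_ridership + spillover
        return gross_kwh * ntg

    # Hypothetical project: 10 GWh of gross savings, but evaluators find 70%
    # free ridership (the owner would have made the upgrades anyway).
    print(net_savings_kwh(10_000_000, free_ridership=0.70))  # 3,000,000.0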

 Climate Implications

If all of the expected new data centers actually come online, then utilities will need to build large amounts of new generation. Ideally, this would come from clean energy sources, but solar and wind (even with batteries) may not provide data centers with enough reliable power for their needs, and nuclear plants cannot be built fast enough to meet demand. Unfortunately, this means that utilities are either building new fossil fuel plants or postponing their retirements, which has large carbon impacts.[4]

 

 

[1] Most utilities’ rates (and earnings) are tied to spending on infrastructure, so it is questionable whether a utility would turn this load down even if it could.

[2] Check out the big swings in the price of Bitcoin over the past few years and consider what might happen to the planned crypto-mining facilities if that market drops steeply.

[3] If you’d like to learn more about the different configurations of waste heat recovery in data centers, this article has you covered!

[4] Some suggest carbon capture and storage (CCS) as a solution, but this technology is nowhere near operating at the promised level. The first commercial-scale power plant with CCS (in Saskatchewan) has a capture rate of about two-thirds of its promised 90%, and another in Australia only removed about 30% of its carbon dioxide. Even if these CCS projects removed the anticipated 90% of carbon dioxide, they would still result in very large amounts of GHG emissions.