Data centers consume massive amounts of energy
The digital world that is so integral to our daily lives depends on data centers: vast warehouse-like structures packed with racks of servers, routers, storage devices, and other equipment. Nearly everything we do with technology, from internet searches and streaming to online shopping and gaming, runs through these buildings.
Even before the rise of AI, these data centers required enormous amounts of energy. The International Energy Agency (IEA) estimates that data centers account for 1%-1.5% of total global electricity use, averaging 120-125 terawatt hours (TWh) each year. To put that in perspective, that is about two to three times more energy than the six states of New England consume combined. For reference, a terawatt hour equals 1 trillion watt hours (Wh).
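A quick back-of-the-envelope check makes these numbers concrete. The Python sketch below uses only the figures quoted above to back out what they imply about New England's annual electricity use:

```python
# Back-of-the-envelope check using only the figures quoted above.
global_dc_twh = (120, 125)   # global data center demand, TWh per year
ne_multiple = (2, 3)         # data centers use 2-3x New England's total

# Dividing the data center range by the stated multiple implies New England
# consumes roughly 40-62 TWh of electricity per year.
ne_low = global_dc_twh[0] / ne_multiple[1]    # 120 / 3 = 40 TWh
ne_high = global_dc_twh[1] / ne_multiple[0]   # 125 / 2 = 62.5 TWh
print(f"Implied New England usage: {ne_low:.0f}-{ne_high:.1f} TWh/yr")
```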
AI will cause data centers to consume even more energy
AI is going to increase both the amount of energy data centers consume and the rate at which they consume it. AI models draw power at a far higher rate than traditional data center workloads; a ChatGPT query, for example, uses 10 times more energy than a normal Google search. The IEA has already forecast that global data center electricity demand will more than double from 2022 to 2026.
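To see why that 10x multiple matters at scale, here is a rough, purely illustrative sketch. Note that the per-query wattage and daily search volume below are outside ballpark estimates we have assumed for illustration, not figures from this article:

```python
# Illustrative sketch of what a 10x per-query energy gap means at scale.
# Assumed figures (not from this article): ~0.3 Wh per Google search,
# ~9 billion searches per day.
search_wh = 0.3
chatgpt_wh = search_wh * 10            # the 10x multiple quoted above
daily_queries = 9e9                    # assumed daily search volume

extra_wh_per_day = daily_queries * (chatgpt_wh - search_wh)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12
print(f"If every search cost 10x: ~{extra_twh_per_year:.0f} extra TWh/yr")
```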
Two AI-related activities put enormous energy demands on data centers: AI training and systems cooling. First, let’s address AI training. The amount of energy this process requires depends on the AI model being developed. A 2022 study revealed that training energy requirements varied significantly, ranging from 2-3 kilowatt hours (kWh) for smaller natural language processing and computer vision models to a staggering 103,500 kWh for a 6 billion-parameter transformer. To put that in perspective, GPT-3, the 175 billion-parameter predecessor of ChatGPT released in 2020, consumed almost 1,300 MWh (one MWh equals 1,000 kWh) of electricity during training. As Joseph Polidoro notes, that's "roughly equal to the annual power consumption of 130 homes in the U.S."
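These figures are easy to sanity-check. The sketch below works through the arithmetic, using only the numbers above plus the commonly reported average annual electricity consumption of a U.S. household (roughly 10.5 MWh, an outside figure we have assumed):

```python
# Sanity check of the training-energy figures above.
gpt3_training_mwh = 1_300     # GPT-3 training energy, MWh (from the text)
homes = 130                   # U.S. homes cited by Polidoro

# 1,300 MWh / 130 homes = 10 MWh per home per year, close to the
# commonly reported ~10.5 MWh average annual U.S. household consumption.
print(gpt3_training_mwh / homes, "MWh per home per year")

# Comparing the two models in the text: the 6B-parameter transformer used
# 103,500 kWh, while 175B-parameter GPT-3 used 1,300 MWh = 1,300,000 kWh.
small_kwh, gpt3_kwh = 103_500, gpt3_training_mwh * 1_000
print(f"~{gpt3_kwh / small_kwh:.1f}x the energy for ~{175 / 6:.0f}x the parameters")
```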
Now let’s address AI cooling. AI computing generally relies on hotter-running chips that draw more power than standard processors. In traditional data centers, HVAC (heating, ventilation, and air conditioning) systems consume about 25%-40% of total energy, with about 50% consumed by the servers and other equipment. Cooling also consumes water: according to researchers at U.C. Riverside, the rise in AI use will cause data centers to consume more than 1 trillion gallons of fresh water by 2027.
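To make that volume easier to picture, here is a quick conversion. The pool capacity is an approximation we have introduced, not a figure from the article:

```python
# Converting the U.C. Riverside water projection into more familiar units.
gallons = 1e12                 # >1 trillion gallons by 2027 (from the text)
LITERS_PER_GALLON = 3.785      # 1 U.S. gallon = 3.785 liters

cubic_meters = gallons * LITERS_PER_GALLON / 1_000
olympic_pool_m3 = 2_500        # an Olympic pool holds roughly 2,500 m^3
print(f"~{cubic_meters / olympic_pool_m3:,.0f} Olympic swimming pools")
```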
The number of AI data centers is growing worldwide
Currently, there are 9,000 to 11,000 data centers worldwide across every continent except Antarctica, and their numbers are growing. CBRE, the global commercial real estate services and investment firm, reports that all major global regions saw annual gains in data center inventory in Q1 2024. Year-over-year, inventory grew by 24.4% in North America, 22% in Asia, 20% in Europe, and 15% in Latin America.
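As a rough illustration, if those year-over-year rates were to hold steady, the standard doubling-time formula shows how quickly each region's inventory would double:

```python
import math

# If the Q1 2024 year-over-year growth rates held steady, regional data
# center inventory would double on roughly these timescales.
growth = {"North America": 0.244, "Asia": 0.22,
          "Europe": 0.20, "Latin America": 0.15}
for region, rate in growth.items():
    years_to_double = math.log(2) / math.log(1 + rate)
    print(f"{region}: ~{years_to_double:.1f} years to double")
```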
In the U.S., top data center hubs include Northern Virginia, Silicon Valley, Dallas/Fort Worth, and Chicago.
Worldwide, here are the top 5 data center markets, ranked in order of megawatt usage:
- Northern Virginia
- London
- Tokyo
- Frankfurt
- Sydney
Source: CBRE Research, Q1 2024
Requirements for optimal AI performance
Data centers require stable power grids, dense fiber networks, and proximity to internet “backbone points.” Beyond these technical needs, location also hinges on government and public support. Countries like the United States, with efficient permitting processes and data-friendly regulations, are prime candidates for AI facilities. Even typically cautious countries like France are actively pursuing AI data centers.
However, the London School of Economics notes the rise of "data center activism," with global protests aiming to halt new construction. Concerns driving this activism include the environmental impact on climate change, land allocation, the allegedly minimal community benefits of data centers, and the high water usage needed for cooling.
Hyperscale data centers are the future
Now convinced that AI is the future of computing, tech firms are racing to construct hyperscale data centers. All the major players, including Google, Meta, Microsoft, Amazon, and Oracle, are spending massive amounts on hyperscaler construction.
Hyperscalers are much bigger than standard data centers, occupying 1 million to 2 million square feet compared with roughly 100,000 square feet for the average cloud data center. In addition to housing more equipment, hyperscalers are more energy-efficient by design. About 40% of all hyperscalers in the pipeline will be built in the United States alone. Tech industry market research firm 650 Group "calculated that the top 5 US hyperscaler developers spent $105 billion in 2023 and forecast the figure to rise to $187 billion in 2028," according to Bloomberg.
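Those two spending figures imply a steady growth rate, which a quick compound-annual-growth calculation makes explicit:

```python
# Implied growth rate behind the 650 Group figures quoted above.
spend_2023, spend_2028 = 105e9, 187e9   # top-5 U.S. hyperscaler capex, USD
years = 2028 - 2023

cagr = (spend_2028 / spend_2023) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")   # ~12.2% per year
```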
In their most recent earnings reports, Microsoft, Meta, and Amazon indicated that they’re increasing spending on data centers to support AI development. Amazon said it spent $30.5 billion in the first half of 2024 and "pledged to exceed that figure over the next six months," with almost all of that spending going to data centers.
Interestingly, Microsoft is considering using nuclear power to run its AI data centers, possibly using an array of small modular reactors. The company has also agreed to purchase power from Helion, which is trying to build a fusion power plant.