Can energy-hungry AI help cut our energy use?

Credit: Brett Sayles from Pexels

It takes about 10 times more electricity for ChatGPT to respond to a prompt than for Google to carry out a standard search. Yet researchers are still struggling to get a grasp on the energy implications of generative artificial intelligence, both now and going forward.

Few people realize that the carbon footprint of digital technology is on par with that of the aerospace industry, accounting for between 2% and 4% of global carbon emissions. And this digital carbon footprint is expanding at a rapid pace. When it comes to power use, the approximately 11,000 data centers in operation today consume just as much energy as the entire country of France did in 2022, or around 460 TWh. Will the widespread adoption of generative AI send those figures soaring?

The new technology will clearly affect the amount of energy that’s consumed worldwide, but exactly how is hard to quantify. “We need to know the total cost of generative AI systems to be able to use them as efficiently as possible,” says Manuel Cubero-Castan, the project manager on Sustainable IT at EPFL.

He believes we should consider the entire life cycle of generative AI technology, from the extraction of minerals and the assembly of components—activities whose impact concerns not only energy—to the disposal of the tons of electronic waste that are generated, which often gets dumped illegally. From this perspective, the environmental ramifications of generative AI go well beyond the power and water consumption of data centers alone.

The cost of training

For now, most of the data available on digital technology power use relates only to data centers. According to the International Energy Agency (IEA), these centers (excluding data networks and cryptocurrency mining) consumed between 240 TWh and 340 TWh of power in 2022, or 1% to 1.3% of the global total. Yet even though the number of centers is growing by 4% per year, their overall power use didn’t change much between 2010 and 2020, thanks to energy-efficiency improvements.

With generative AI set to be adopted on a massive scale, that will certainly change. Generative AI technology is based on large language models (LLMs) that use power in two ways. The first is training, a step that involves running terabytes of data through algorithms so that they learn to predict words and sentences in a given context. Until recently, this was the most energy-intensive step.

The second is inference, when the models process data in response to a prompt. Now that LLMs are being deployed on a large scale, this is the step requiring the most energy. Recent data from Meta and Google suggest that inference now accounts for 60% to 70% of the power used by generative AI systems, against 30% to 40% for training.

ChatGPT query vs. conventional Google search

A ChatGPT query consumes around 3 Wh of power, while a conventional Google search uses 0.3 Wh, according to the IEA. If all of the approximately 9 billion Google searches performed daily were switched to ChatGPT, that would increase the total power requirement by 10 TWh per year.
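A quick back-of-envelope calculation shows how the figures above fit together. All of the inputs are the article's estimates, not measurements:

```python
# Back-of-envelope check of the IEA figures quoted above.
CHATGPT_WH_PER_QUERY = 3.0   # Wh per ChatGPT prompt (IEA estimate)
SEARCH_WH_PER_QUERY = 0.3    # Wh per conventional Google search
SEARCHES_PER_DAY = 9e9       # approximate daily Google searches

# Extra energy if every search became a ChatGPT query.
extra_wh_per_day = SEARCHES_PER_DAY * (CHATGPT_WH_PER_QUERY - SEARCH_WH_PER_QUERY)
extra_twh_per_year = extra_wh_per_day * 365 / 1e12  # Wh -> TWh

print(f"{extra_twh_per_year:.1f} TWh/year")  # ~8.9 TWh, roughly the 10 TWh cited
```

The result, just under 9 TWh per year, is consistent with the rounded 10 TWh figure in the text.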

Goldman Sachs Research (GSR) estimates that the amount of electricity used by data centers will swell by 160% over the next five years, and that they will account for 3% to 4% of global electricity use. In addition, their carbon emissions will likely double between 2022 and 2030.

According to IEA figures, total power demand in Europe decreased for three years in a row but picked up in 2024 and should return to 2021 levels—some 2,560 TWh per year—by 2026. Nearly a third of this increase will be due to data centers. GSR estimates that the AI-related power demand at data centers will grow by approximately 200 TWh per year between 2023 and 2030. By 2028, AI should account for nearly 19% of data centers’ energy consumption.

However, the rapid expansion of generative AI could wrong-foot these forecasts. Chinese company DeepSeek is already shaking things up—it introduced a generative AI program in late January that uses less energy than its US counterparts for both training algorithms and responding to prompts.

Another factor that could stem the growth in AI power demand is the limited amount of mining resources available for producing chips. Nvidia currently dominates the market for AI chips, with a 95% market share. The three million Nvidia H100 chips installed around the world used 13.8 TWh of power in 2024—the same amount as Guatemala. By 2027, Nvidia chips could burn through 85 to 134 TWh of power. But will the company be able to produce them at that scale?
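The 13.8 TWh figure for the H100 fleet can be sanity-checked the same way. This sketch assumes the roughly 700 W rated power of the H100 SXM variant, which is not stated in the article; actual draw varies with workload and cooling overhead is excluded:

```python
# Rough consistency check on the 13.8 TWh figure for ~3 million H100 chips.
# The 700 W per-chip figure is an assumption (H100 SXM rated power).
N_CHIPS = 3e6
CHIP_KW = 0.7               # assumed per-chip power at full load
HOURS_PER_YEAR = 8760

max_twh = N_CHIPS * CHIP_KW * HOURS_PER_YEAR / 1e9   # kWh -> TWh
implied_load = 13.8 / max_twh

print(f"max {max_twh:.1f} TWh, implied average load {implied_load:.0%}")
```

At full load the fleet would draw about 18.4 TWh per year, so the 13.8 TWh estimate implies the chips ran at roughly three-quarters of their rated power on average, which is a plausible utilization level.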

Not always a sustainable choice

Another factor to consider is whether our aging power grids will be able to support the additional load. Many of them, both nationally and locally, are already being pushed to the limit to meet current demand. And the fact that data centers are often concentrated geographically complicates things further. For example, data centers make up 20% of the power consumption in Ireland and over 25% in the U.S. state of Virginia. “Building data centers in regions where water and power supplies are already strained may not be the most sustainable choice,” says Cubero-Castan.

There’s also the cost issue. If Google wanted to process all of its search queries with generative AI, it would need to set up 400,000 additional servers, at a price tag of some 100 billion dollars, which would shrink its operating margin to zero. An unlikely scenario.

Untapped benefits

Some of the increase in power consumption caused by generative AI could be offset by the benefits of AI in general. Although training algorithms requires an investment, it could pay off in terms of energy savings or climate benefits.

For instance, AI could speed the pace of innovation in the energy sector. That could help users to better predict and reduce their power use; enable utilities to manage their power grids more effectively; improve resource management; and allow engineers to run simulations and drive advances at the leading edge of modeling, climate economics, education and basic research.

Whether we’re able to leverage the benefits of this kind of innovation will depend on its impacts, how extensively the new technology is adopted by consumers, and how well policymakers understand it and draft laws to govern it.

The next-generation data centers being built today are more energy efficient and allow for greater flexibility in how their capacity is used. By the same token, Nvidia is working to improve the performance of its chips while lowering their power requirement.

And we shouldn’t forget the potential of quantum computing. As for the data centers themselves, the IEA calculates that 40% of the electricity they use goes to cooling, 40% to running servers and 20% to other system components, including data storage and communication.

At EPFL, Prof. Mario Paolone is heading up the Heating Bits initiative to build a demonstrator for testing new cooling methods. Five research groups and the EcoCloud Center have teamed up for the initiative, with the goal of developing new processes for heat recovery, cogeneration, incorporating renewable energy and optimizing server use.

Keeping the bigger picture in mind

Another (painless and free) way to cut data centers’ power use is to clear out the clutter. Every day, companies worldwide generate 1.3 trillion gigabytes of data, most of which ends up as “dark data”: data that are collected and stored but never used. Researchers at Loughborough Business School estimate that 60% of the data kept today are dark data, and storing them emits just as much carbon as three million London–New York flights. This year’s Digital Cleanup Day was held on 15 March, but you don’t have to wait until spring to do your cleaning!

Cubero-Castan warns us, however, to keep the bigger picture in mind: “If we begin using generative AI technology on a massive scale, with ever-bigger LLMs, the resulting energy gains will be far from enough to achieve a reduction in overall carbon emissions. Lowering our usage and increasing the lifespan and efficiency of our infrastructure remain essential.”

The energy impact of generative AI mustn’t be overlooked, but for now it’s only marginal at the global level—it’s simply adding to the already hefty power consumption of digital technology in general. Videos currently account for 70% to 80% of data traffic around the world, while other major contributors are multiplayer online games and cryptocurrency. The main drivers of power demand today are economic growth, electric vehicles, air-conditioning and manufacturing. And most of that power still comes from fossil fuels.

Provided by
Ecole Polytechnique Federale de Lausanne


Citation:
Can energy-hungry AI help cut our energy use? (2025, March 24)
retrieved 24 March 2025
from https://techxplore.com/news/2025-03-energy-hungry-ai.html
