
Artificial intelligence (AI) will most likely account for nearly half of data center power consumption by the end of this year, an expert has estimated.
In a paper published in the sustainable energy journal Joule, Alex de Vries-Gao, founder of the tech sustainability website Digiconomist, said his calculations were based on the power consumed by chips made by Nvidia and Advanced Micro Devices (AMD) that are used to train and operate AI models.
He also took into account the energy consumption of chips made by other companies, such as Broadcom.
By the end of 2025, de Vries-Gao said, AI systems could account for as much as 49% of total data center power consumption. AI consumption alone could reach 23 gigawatts – twice the total energy consumption of the Netherlands.
A number of variables fed into the calculations. Key among them, for de Vries-Gao, were the energy efficiency of a data center and the electricity consumed by the cooling systems for servers handling an AI system's heavy workloads.
Data centers are, of course, the central nervous system of AI technology, and their high energy demands make sustainability a key concern in the development and use of AI systems.
These estimates come as the International Energy Agency recently said that AI would require almost as much energy by the end of this decade as Japan uses today, and that, worryingly, only about half of the demand is likely to be met from renewable sources.
There is, though, a possibility of a slowdown in hardware demand, de Vries-Gao said. Perhaps the demand for applications such as ChatGPT will subside, or export controls will constrain the production of AI hardware.
China’s access to premium chips is limited, for instance, but in January, Chinese startup DeepSeek released its R1 AI model that uses lower-cost chips and less data. In other words, innovation – voluntary or forced – might reduce the energy costs of AI.
However, according to de Vries-Gao, any efficiency gains could encourage even more AI use. In addition, individual countries attempting to build their own AI systems – a trend known as "sovereign AI" – could further increase hardware demand.
What's more, the Financial Times reported last year that US tech companies were already consuming record quantities of water in their data centers.
In the US, data centers consumed more than 75 billion gallons of water in 2023 – that’s as much as London consumes in four months. Environmental activists warn that this is unsustainable.