
AI is doing the environment no favors when it comes to electricity use. Its growing appetite is now driving historic spikes in power demand.
AI is no rubber factory or coal mine – industries that people would usually associate with environmental destruction and pollution. Although the only clouds above AI companies are digital, they’re still a drain on the planet’s resources, and companies are looking for solutions.
Here is one: US energy company Talen Energy has struck an impressive deal with Amazon to supply up to 1,920 megawatts of electricity from its Susquehanna nuclear plant to Amazon Web Services data centers through 2042.
It’s a win-win deal: it secures steady revenue for Talen Energy and helps Amazon meet its carbon-free energy goals. Both companies also plan to explore building Small Modular Reactors and expanding the plant's output.
This matters because US electricity demand is rising for the first time in roughly 20 years, driven by the growth of data centers and AI. Big tech companies want energy that is both reliable and clean.
Take a look at Constellation Energy. Earlier this month, it agreed to keep a nuclear reactor in Illinois running for 20 more years to support Meta.
The same goes for Europe. The EU is preparing a package of measures to improve the energy efficiency of data centers. Data centers account for 3% of EU electricity demand, and their consumption is expected to rise rapidly this decade as artificial intelligence expands.
Can AI companies be carbon-neutral? And what about water?
Put the mug down; you’ll need something much smaller to measure the water needed to fuel one average ChatGPT query.
OpenAI CEO Sam Altman did the math. In a blog post, he says an average ChatGPT query uses about 0.000085 gallons of water, or “roughly one-fifteenth of a teaspoon.”

We tried doing the math, too. While OpenAI hasn’t revealed an exact number of daily queries ChatGPT gets, Sam Altman himself has said that “users send over 1 billion messages per day to ChatGPT.” One billion × 0.000085 equals 85,000 gallons per day. That’s roughly 1,063 bathtubs, assuming a standard 80-gallon tub.
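If you want to check the arithmetic yourself, here is a minimal back-of-envelope sketch in Python using the figures above; the 80-gallon bathtub size is our assumption, not OpenAI's.

```python
# Back-of-envelope water estimate based on the figures quoted above.
GALLONS_PER_QUERY = 0.000085      # Altman's per-query water estimate
QUERIES_PER_DAY = 1_000_000_000   # "over 1 billion messages per day"
GALLONS_PER_BATHTUB = 80          # assumed capacity of a standard bathtub

daily_gallons = GALLONS_PER_QUERY * QUERIES_PER_DAY
bathtubs = daily_gallons / GALLONS_PER_BATHTUB

print(f"{daily_gallons:,.0f} gallons per day")      # 85,000 gallons per day
print(f"about {bathtubs:,.1f} bathtubs per day")    # about 1,062.5 bathtubs per day
```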
“People are often curious about how much energy a ChatGPT query uses. The average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes,” Altman wrote.
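Those comparisons roughly check out, given some assumptions about appliance wattage. A quick sketch, assuming an oven drawing about 1,000 W and a 10 W LED bulb (both assumptions are ours, not Altman's):

```python
# Sanity-check Altman's comparison using assumed appliance wattages.
QUERY_WH = 0.34        # watt-hours per average ChatGPT query (Altman's figure)
OVEN_WATTS = 1_000     # assumed average oven draw
BULB_WATTS = 10        # assumed high-efficiency (LED) bulb

oven_seconds = QUERY_WH / OVEN_WATTS * 3600   # ~1.2 seconds of oven use
bulb_minutes = QUERY_WH / BULB_WATTS * 60     # ~2.0 minutes of bulb use

print(f"oven: {oven_seconds:.1f} s, bulb: {bulb_minutes:.1f} min")
```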
He also argued that “the cost of intelligence should eventually converge to near the cost of electricity.”
As previously reported, AI could account for nearly half of data center power consumption by the end of this year.