
AI is devouring energy at a staggering rate – raising urgent questions about its sustainability, real-world value, and human cost.
Speaking at the Advanced Micro Devices (AMD) Advancing AI 2025 Summit, Altman was asked by AMD CEO Lisa Su about the recent ChatGPT outages.
“A significant fraction of the power on Earth should be spent running AI,” said Altman.
No, that's not a sci-fi villain – it’s the CEO of OpenAI.
With AI already prone to outages and hallucinations, is more power really the solution?
Powering hallucinations, not help
Training GPT-4 reportedly used enough electricity to power 160 US homes for a year, and AI systems are not only digital – they rely on energy-intensive data centers and water for cooling.
Microsoft and Google are already being criticized for their AI-driven surges in emissions and water use.
And even as Advanced Micro Devices, Nvidia, and others push for more efficient microchips, AI's energy waste could come to rival crypto's.
We’re already pouring massive resources into systems designed to replace human labor – electricity, water, and the physical hardware that powers AI infrastructure, all of which are finite.
Automation eats its tail
As AI automates more jobs, displaced workers may struggle to afford rising utility costs.
The same systems that promise efficiency could worsen inequality and access to basic services.
There’s an economic paradox: AI creates value, but may eliminate the consumers needed to sustain markets.
AI isn’t buying groceries, paying rent, or stimulating local economies. Powering AI at scale could come at the cost of human resilience – both ecological and social.
In trying to optimize everything, we risk eroding the very systems that sustain us.