Sam Altman's Energy Math Doesn't Add Up (And That's a Problem)

I’ve been watching OpenAI’s Sam Altman navigate the environmental impact conversation for a while now, and his recent comments at The Indian Express event are a masterclass in misdirection. Not because he’s lying, but because he’s framing the debate in ways that conveniently sidestep the actual concerns.

Let’s start with the water thing. Altman calls concerns about AI water usage “totally fake” and dismisses the viral claims about ChatGPT using 17 gallons per query. He’s probably right that the specific numbers floating around online are exaggerated or outdated; evaporative cooling is where the big per-query water figures come from, and many data centers have moved away from it.

But here’s what bugs me. When you’re the CEO of one of the most valuable artificial intelligence companies in the world, saying something is “completely untrue, totally insane, no connection to reality” without publishing your own numbers feels defensive rather than clarifying. If the figures are wrong, show us the real ones. OpenAI has the data. They just don’t share it.

The Human Evolution Comparison Is Nonsense

The part where Altman really loses me is this comparison between training AI models and training humans. He argues that it’s unfair to count training costs for AI without counting the 20 years of food and energy it takes to educate a human, plus the entire evolutionary history of humanity.

This is intellectually dishonest in ways that frustrate me as someone who actually thinks about systems.

First, humans are already here. We’re eating food regardless of whether we become knowledge workers or not. The marginal cost of training a person to answer questions isn’t 20 years of energy consumption; it’s the delta between what they’d consume anyway and the additional resources that go into their education, most of which flow through infrastructure we already built.
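To make the accounting distinction concrete, here’s a toy sketch in Python. Every number in it is an invented placeholder; the point is the total-versus-marginal logic, not the magnitudes.

```python
# Toy illustration of total vs. marginal energy accounting.
# All figures are invented placeholders -- only the accounting logic matters.

BASELINE_KWH_PER_YEAR = 9_000   # hypothetical: what a person consumes regardless
EDUCATION_EXTRA_KWH = 1_500     # hypothetical: additional energy spent on schooling
YEARS = 20

total_attribution = (BASELINE_KWH_PER_YEAR + EDUCATION_EXTRA_KWH) * YEARS
marginal_attribution = EDUCATION_EXTRA_KWH * YEARS

print(f"Altman-style total attribution: {total_attribution:,} kWh")
print(f"Marginal attribution:           {marginal_attribution:,} kWh")
# The baseline happens whether or not the person becomes a knowledge worker,
# so only the delta is fairly charged to their "training."
```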

Second, and more importantly, we’re not replacing one human with one AI model. We’re talking about massive compute clusters running 24/7 to serve millions or billions of queries. The scale is completely different. When you train GPT-5 or whatever comes next, you’re consuming enough energy to power a small city for months (outside estimates put GPT-4’s training run in the tens of gigawatt-hours). That’s not comparable to sending a kid to school.

The evolution argument is even wilder. Yes, it took billions of years and countless human lives to develop our collective knowledge. But that’s a sunk cost of existence itself. We can’t choose not to have evolution. We can choose whether to build another massive AI training cluster.

The Real Question Nobody Wants to Answer

What Altman does acknowledge is that total energy consumption for AI is legitimately concerning. He’s right that we need more nuclear, wind, and solar. Great. But that’s a deflection from the immediate question: is this particular use of energy worth it right now, given our current grid constraints and climate targets?

I’m not anti-AI. I build with these tools every day. But the “per query” efficiency argument only works if you ignore scale. Sure, maybe one ChatGPT query is more efficient than having a human research and write an answer. But humans don’t process billions of queries a day from a standing start.
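A back-of-envelope calculation shows why. The inputs below are publicly floated figures, not audited ones: Altman himself has cited roughly 0.34 watt-hours per query, and OpenAI has said ChatGPT handles on the order of 2.5 billion prompts a day. Treat both as assumptions.

```python
# Back-of-envelope: a tiny per-query cost times a huge volume.
# Inputs are publicly floated figures, used here as rough assumptions.

WH_PER_QUERY = 0.34       # Altman's own per-query estimate (watt-hours)
QUERIES_PER_DAY = 2.5e9   # reported order of magnitude for ChatGPT prompts

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e6  # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1e3                # MWh -> GWh

print(f"~{daily_mwh:,.0f} MWh/day, ~{yearly_gwh:,.0f} GWh/year")
# ~850 MWh/day, ~310 GWh/year -- for inference alone, before any training runs,
# and before the usage growth the industry is betting on.
```

Even granting the most favorable per-query number, the aggregate is a real load on real grids.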

The issue isn’t whether AI is theoretically more efficient than humans at certain tasks. It’s whether we’re adding massive new energy demands to already strained grids, in regions where that energy still comes from fossil fuels, faster than we can build clean alternatives.

And without transparency on actual consumption numbers, we can’t even have an informed debate about whether the tradeoff is worth it. Scientists are trying to study this independently precisely because companies like OpenAI won’t publish their own figures.

Energy Efficiency vs Energy Scale

There’s a concept in environmental economics called the Jevons paradox: make something more efficient, and you often end up using more of it in total, because it becomes cheaper and more accessible. Jevons observed it with coal in 1865, when more efficient steam engines drove coal consumption up, not down. Even if per-query efficiency improves, the explosion in AI usage means total consumption could still skyrocket.
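A minimal sketch of that dynamic, with growth rates I’ve invented purely for illustration: per-query energy falls hard every year, and total consumption climbs anyway because usage grows faster.

```python
# Jevons paradox in miniature: efficiency improves, totals still rise.
# Both growth rates are hypothetical, chosen only to show the dynamic.

energy_per_query = 1.0  # arbitrary units
queries = 1.0           # arbitrary units

for year in range(1, 6):
    energy_per_query *= 0.70  # assume a 30% efficiency gain per year
    queries *= 2.0            # assume query volume doubles per year
    total = energy_per_query * queries
    print(f"year {year}: per-query x{energy_per_query:.2f}, "
          f"volume x{queries:.0f}, total x{total:.2f}")

# Per-query cost drops 30% a year, yet total energy still compounds at 40% a year:
# 1.40, 1.96, 2.74, 3.84, 5.38 ...
```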

Altman’s framing tries to make this about efficiency per query, but that’s only half the equation. The other half is how many queries we’re running, how many models we’re training, and how fast all of it is growing. When data center buildouts are being linked to rising electricity prices, and scientists are scrambling to measure the impact independently because companies won’t share data, maybe the dismissive tone isn’t the right approach.

I want AI to succeed. I want these tools to be sustainable. But we’re not going to get there by comparing model training to human evolution or calling legitimate environmental concerns “totally insane” without backing it up with actual transparency about resource usage.
