The Uncomfortable Math Behind AI's Energy Appetite

If you've been following the AI industry lately, you've probably seen the alarming headlines: data centers consuming as much electricity as small countries, power grids straining under unprecedented demand, and tech companies quietly restarting mothballed power plants. The energy cost of AI is real, and it's growing fast. But some in the industry have started pushing back on the narrative — and their arguments, while self-serving, aren't entirely wrong.

The Deflection Playbook

A recurring talking point from AI leaders goes something like this: "Sure, AI uses a lot of energy, but so does everything else humans do." The argument typically points to the massive energy costs of existing industries — transportation, agriculture, heating and cooling buildings, even the metabolic energy required to keep 8 billion humans alive and working.

The implication is clear: if we're going to worry about AI's energy footprint, we should put it in the context of everything else society already consumes. A single Google search uses roughly 0.3 watt-hours of electricity. A ChatGPT query might use ten times that. But a round-trip flight from New York to London burns through enough fuel to power thousands of AI queries. So what's the big deal?
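Per-query figures like these only become meaningful at scale, which takes a line or two of arithmetic. In the sketch below, the 0.3 Wh search estimate and the rough 10x multiplier come from the comparison above; the query volume is purely an illustrative assumption, not a measured figure.

```python
# Back-of-envelope: scaling per-query energy to fleet level.
# ASSUMPTIONS: 0.3 Wh/search and the ~10x multiplier are the rough
# figures cited in the text; 1 billion queries/day is an illustrative
# volume chosen for round numbers, not a reported statistic.
SEARCH_WH = 0.3                    # Wh per conventional search
CHAT_WH = 10 * SEARCH_WH           # Wh per AI chat query (rough)
QUERIES_PER_DAY = 1_000_000_000    # illustrative volume

daily_gwh = QUERIES_PER_DAY * CHAT_WH / 1e9    # Wh -> GWh per day
annual_twh = daily_gwh * 365 / 1000            # GWh/day -> TWh per year

print(f"{daily_gwh:.1f} GWh/day, ~{annual_twh:.1f} TWh/yr")
```

The takeaway is that volume dominates: a few watt-hours per query looks trivial until it is multiplied by billions of daily requests.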

Why the Comparison Doesn't Quite Work

On the surface, this framing sounds reasonable. But it has a fundamental flaw: it compares a mature, essential system to one that's growing exponentially from a standing start.

Humanity's existing energy consumption — for food, shelter, transportation — has been built up over centuries, and most of it serves basic needs. AI energy consumption, by contrast, has gone from negligible to significant in just a few years, and a meaningful chunk of it powers features that are, at best, conveniences. When a technology goes from consuming almost nothing to requiring dedicated nuclear power plants in under a decade, it's worth asking hard questions regardless of what other things also use energy.

The comparison also glosses over the marginal impact problem. The energy grid is a shared resource with finite capacity. Every megawatt that goes to a new data center is a megawatt that isn't available for something else — or that requires new generation capacity to be built. In parts of the United States, data center demand is already delaying the retirement of coal plants and straining local grids. That's not a hypothetical concern; it's happening right now.

The Numbers Are Staggering — And Growing

The International Energy Agency estimates that global data center electricity consumption could double by 2028, driven largely by AI workloads. Some estimates put AI-related energy demand even higher. Training a single large language model can consume as much electricity as dozens of American households use in a year. And that's just training — inference (actually running the model to answer queries) is where the real ongoing costs pile up as adoption scales.
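The "dozens of households" comparison is easy to sanity-check. Both inputs below are assumptions for illustration: the training-run energy is a hypothetical round number, and the household figure is the commonly cited U.S. average of roughly 10,500 kWh per year; neither comes from the article itself.

```python
# Sanity check: training energy expressed in household-years.
# ASSUMPTIONS: 500 MWh is a hypothetical training-run energy chosen
# for illustration; ~10,500 kWh/yr is a commonly cited average for
# U.S. residential electricity use. Neither figure is from the text.
TRAINING_MWH = 500                 # hypothetical training run, MWh
HOUSEHOLD_KWH_PER_YEAR = 10_500    # rough U.S. average

household_years = TRAINING_MWH * 1000 / HOUSEHOLD_KWH_PER_YEAR
print(f"~{household_years:.0f} household-years of electricity")
```

Under these assumptions a single run lands in the high dozens of household-years, consistent with the article's framing; published estimates for real training runs vary widely.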

Major tech companies have been on a building spree. Microsoft, Google, Amazon, and Meta have collectively committed hundreds of billions of dollars to data center infrastructure. Several have signed deals to build or restart nuclear power facilities specifically to feed their AI ambitions. When the industry's own solution to its energy problem is "build more nuclear reactors," you know the scale of demand is extraordinary.

The Productivity Argument

The more sophisticated version of the industry's defense doesn't just compare AI to existing energy use — it argues that AI will replace much of it. If an AI agent can do in minutes what takes a human worker hours, the total energy equation might actually favor AI, even accounting for the electricity it consumes. A human worker commutes to an office (energy), sits in a climate-controlled building (energy), eats lunch that was farmed, processed, and transported (energy), and produces output at human speed. An AI does it faster, without the commute.

This argument has some theoretical merit, but it assumes AI is directly substituting for human activity rather than supplementing it — and so far, the evidence points more toward the latter. Most companies aren't firing workers and replacing them with AI; they're using AI to do more with the same workforce. That means AI energy consumption is largely additive, not substitutive. We're using more total energy, not less.
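The additive-versus-substitutive distinction is just bookkeeping, but it is worth making explicit which terms enter the total. The values below are symbolic placeholders in arbitrary units, not measurements; only the structure of the sums matters.

```python
# Additive vs. substitutive adoption, as simple energy bookkeeping.
# All values are symbolic placeholders (arbitrary units) chosen only
# to show which terms enter the total; none are measurements.
human_task_energy = 1.0   # energy embodied in a human doing the task
ai_task_energy = 0.2      # energy for an AI to do the same task

substitutive = ai_task_energy                  # AI replaces the human work
additive = human_task_energy + ai_task_energy  # AI supplements it

# Substitution saves energy only if the AI is cheaper per task AND the
# human activity actually goes away; otherwise the AI term is pure
# addition on top of the existing baseline.
print(substitutive < human_task_energy)  # True under these placeholders
print(additive > human_task_energy)      # True whenever AI energy > 0
```

The industry's productivity argument assumes the first sum; the adoption pattern so far looks more like the second.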

What an Honest Conversation Looks Like

None of this means AI isn't worth the energy it consumes. Many technologies that transformed society — electrification, the internet, air conditioning — came with significant energy costs that we collectively decided were worth paying. AI may well belong in that category.

But an honest conversation about AI's energy footprint requires a few things the industry hasn't been great at providing:

Transparency. Most AI companies are vague about the actual energy consumption of their models and services. Publishing detailed energy and carbon metrics should be standard practice, not a competitive secret.

Proportionality. Comparing AI's energy use to human metabolism or intercontinental flights is a rhetorical trick, not analysis. The relevant questions are: what is the marginal energy cost of AI, what value does it create, and are there ways to deliver that value more efficiently?

Accountability. If AI companies are going to consume an outsized share of new energy capacity, they should bear responsibility for ensuring that capacity is clean. Signing power purchase agreements for renewable energy is a start, but it's not enough if the net effect is keeping fossil fuel plants running longer.

The Bottom Line

The energy cost of AI is neither the catastrophe that critics suggest nor the rounding error that boosters claim. It's a serious and rapidly growing demand on shared infrastructure that deserves serious scrutiny — not deflection through misleading comparisons.

The tech industry has a long history of externalizing costs while privatizing benefits. The energy debate is shaping up to be the next chapter in that story. Whether it ends differently depends on whether the public and policymakers demand better answers than "but humans use energy too."