
Tech: kWh Is Not the Question We Should Be Asking


I’ve been avoiding writing about AI energy use. Honestly, it’s because I don’t think “How much energy does AI use?” is even the right question. Having thought about this early on [1], I realized it isn’t just about the compute, the water, or the energy used to train and run AI models. It’s about the total energy and environmental cost of computing – think network towers, file storage, the massive cloud infrastructure run by hyperscalers, and our entire, ever-expanding digital footprint.

Then, last week, I saw a LinkedIn post by my colleague, Ilmari [2]. He performed a neat calculation of his personal AI energy use, which I instantly wanted to share with my family and friends who have been talking about the subject. It’s what finally prompted me to comment on this topic. I appreciate how carefully he accounted for his AI energy use. Still, it reinforces my point: everyone is laser-focused on AI’s energy, and we’re missing the forest for the trees. We’re forgetting where the real energy and environmental costs of our digital lives are piling up.

The AI Energy Hype Is Just Clickbait

Seeing newspapers and blogs scream about AI energy consumption gives me a sense of déjà vu. The same happens with most new, complex technologies. The truth is that the energy demands of digital devices vary widely. It’s not just about training an AI model or formatting an image on the server side; it’s about how we use these things. Streaming a Netflix movie on the subway, constantly switching between network towers? That requires a substantial amount of energy, regardless of the video resolution or packaging algorithms.

Yes, if I’m performing heavy-duty AI computations or constantly updating large AI databases, the energy cost will be high. But for an individual – even accounting for a share of training costs (which are large, but so what?) – the energy spent browsing Instagram, YouTube, or TikTok likely far outweighs what a normal person’s AI use adds up to.
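To make that comparison concrete, here is a minimal back-of-envelope sketch. The per-query and per-streaming-hour figures are illustrative assumptions of my own (roughly in the range of commonly cited estimates), not numbers from Ilmari’s post or the reports cited below – swap in values you trust and see which way the comparison tips.

```python
# Back-of-envelope comparison: a year of casual chatbot use vs. a daily hour
# of video streaming on mobile. Every constant below is an assumption made
# for illustration only -- replace it with figures you trust.

WH_PER_AI_QUERY = 0.3          # assumed Wh per chat query (published estimates vary widely)
QUERIES_PER_DAY = 30           # assumed personal usage
WH_PER_STREAMING_HOUR = 77.0   # assumed Wh per hour: device + network + data center
STREAMING_HOURS_PER_DAY = 1.0  # assumed personal usage

ai_kwh_per_year = WH_PER_AI_QUERY * QUERIES_PER_DAY * 365 / 1000
stream_kwh_per_year = WH_PER_STREAMING_HOUR * STREAMING_HOURS_PER_DAY * 365 / 1000

print(f"AI queries: ~{ai_kwh_per_year:.1f} kWh/year")
print(f"Streaming:  ~{stream_kwh_per_year:.1f} kWh/year")
print(f"Streaming/AI ratio: ~{stream_kwh_per_year / ai_kwh_per_year:.0f}x")
```

With these particular assumptions, the daily hour of streaming comes out roughly an order of magnitude heavier than the chatbot habit – but the honest takeaway is that the answer depends entirely on the numbers you plug in.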

While the aggregate energy use of AI is a growing concern, and rightly so, an individual’s AI queries are rarely the primary environmental problem.

Our Entire Digital Footprint Is Thirsty

So, if AI isn’t the boss monster, where should we look? The real issue is the environmental price tag of all computing: the energy guzzled by network towers, the constant power draw required for file storage, the sprawling data centers, and the often opaque operational choices made by the hyperscale companies that run them.

Our digital footprint expands relentlessly, consuming more resources every year, yet these true costs are frequently obscured from view.

At the heart of this are the data centers – the unseen giants forming the backbone of our online world. Their electricity consumption is already staggering, accounting for approximately 1.5% of global electricity in 2024 [3]. And the hunger doesn’t stop at electricity. There is a significant “embodied carbon” footprint associated with manufacturing the hardware – the chips and servers crafted from mined materials – and an additional environmental burden from dismantling the outdated equipment. And that’s before we even get to water: cooling these digital behemoths is an incredibly water-intensive operation; training GPT-3 alone is estimated to have required 700,000 liters of water [4].
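For a sense of scale, here is a rough conversion of those headline figures into per-person terms. The 1.5% share and the 700,000-liter estimate come from the sources in footnotes 3 and 4; the global electricity total, the internet-user count, and the Olympic-pool volume are ballpark assumptions of my own.

```python
# Rough scale conversion for the headline figures above. The data-center share
# and the GPT-3 water estimate come from the cited sources; the other constants
# are ballpark assumptions -- adjust them as you see fit.

GLOBAL_ELECTRICITY_TWH = 28_000   # assumed global electricity consumption, 2024
DATA_CENTER_SHARE = 0.015         # ~1.5%, per the IEA report (footnote 3)
INTERNET_USERS = 5.5e9            # assumed number of internet users worldwide

data_center_twh = GLOBAL_ELECTRICITY_TWH * DATA_CENTER_SHARE
kwh_per_user = data_center_twh * 1e9 / INTERNET_USERS   # 1 TWh = 1e9 kWh

GPT3_WATER_LITERS = 700_000       # footnote 4
OLYMPIC_POOL_LITERS = 2_500_000   # ~2,500 m^3, a common reference volume

print(f"Data centers:         ~{data_center_twh:.0f} TWh/year")
print(f"Per internet user:    ~{kwh_per_user:.0f} kWh/year")
print(f"GPT-3 training water: ~{GPT3_WATER_LITERS / OLYMPIC_POOL_LITERS:.0%} of an Olympic pool")
```

Tens of kilowatt-hours per user per year is not nothing, and it says nothing yet about embodied carbon, water, or end-of-life hardware – which is exactly why the choices of the companies running this infrastructure matter so much.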

The accountability for this vast infrastructure largely falls on a handful of cloud giants – AWS, Azure, GCP – who hold immense power over the digital landscape. Yet their commitment to true sustainability is a very mixed bag. Transparency, a key principle, is often lacking. Some, like Google, offer relatively open metrics and detail their progress: in their 2024 Environmental Report, they stated that ten of their grid regions achieved at least 90% carbon-free energy (CFE) in 2023, and that they maintained a global average of 64% CFE across their data centers and offices [5]. Others, notably AWS, have faced considerable criticism for a lack of transparency regarding AWS-specific environmental impacts and full Scope 3 emissions. From my perspective, companies like OpenAI and Netflix don’t seem to prioritize this crucial transparency nearly enough [6][7].

The Path Forward?

My point isn’t to stop progress or to suggest, “Don’t Look Up.” It’s to shift the conversation from fear and hype to informed, constructive action. The real challenge, and the opportunity, is to build a digital world where environmental responsibility is a fundamental, non-negotiable design principle for everyone – from hyperscalers to app developers.

As consumers and professionals, we need clear and understandable information to make genuinely greener choices. It’s time for the industry to step up with transparent, verifiable data and to design for sustainability by default, not as an afterthought. Let’s stop generating fear out of things that are complex or hard to measure and start demanding the tools and information to make a real difference.

Footnotes

  1. Tech: The high cost of computation

  2. Ilmari’s post on LinkedIn

  3. IEA (2025), Energy and AI, IEA, Paris. More information is available at: https://www.iea.org/reports/energy-and-ai/executive-summary.

  4. Deepgram (2025), How AI Consumes Water: The unspoken environmental footprint. More information is available at: https://deepgram.com/learn/how-ai-consumes-water (based on data from sources like Li, Pengfei, et al., “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models”).

  5. Google (2024), 2024 Environmental Report. More information is available at: https://sustainability.google/reports/google-2024-environmental-report/

  6. Renewable Energy Certificates (RECs) can feel like a shell game. Touting 100% renewable energy based on RECs doesn’t always mean the power used at a specific facility at a particular moment is green. Unless they are tied to genuine, additional renewable capacity, they risk being little more than greenwashing.

  7. Looming over all of this is the Jevons Paradox. This economic theory suggests that as a technology becomes more efficient, its use often increases, sometimes dramatically, offsetting the expected savings. Efficiency, in isolation, doesn’t automatically guarantee a smaller footprint.