Three big things we still don’t know about AI’s energy burden

Unveiling the Energy Consumption of AI: A Quest for Clarity

Earlier this year, my colleague Casey Crownhart and I embarked on a six-month investigation into the environmental and energy implications of artificial intelligence. Our primary focus was to determine a critical yet elusive metric: the amount of energy consumed by leading AI models, such as ChatGPT and Gemini, when generating a single response.

This figure remained frustratingly out of reach despite growing attention from influential institutions like the White House and the Pentagon. Projections indicate that by 2028, AI could consume electricity equivalent to 22% of all U.S. households' usage, underscoring the urgency of understanding its energy footprint.

Challenges in Measuring AI’s Energy Use

The core difficulty in pinpointing this data lies in the fact that only AI companies possess the detailed energy consumption figures. Our attempts to obtain this information from major players such as Google, OpenAI, and Microsoft were unsuccessful, as none disclosed their internal metrics. Experts likened this to estimating a car's fuel efficiency without ever driving it, relying solely on hearsay about engine size and sound.

Recent Developments in AI Energy Transparency

Following our initial report, a surprising shift occurred over the summer. In June, OpenAI's CEO Sam Altman revealed that an average ChatGPT query consumes approximately 0.34 watt-hours of energy. Shortly after, the French AI startup Mistral published an indirect estimate of the emissions associated with its models. By August, Google disclosed that the median Gemini text prompt requires about 0.24 watt-hours. These figures aligned closely with our earlier estimates for mid-sized AI models.

Despite this progress, the question remains: have we finally demystified AI's energy consumption, and what implications does this have for ongoing climate impact research? To explore this, I returned to experts we had consulted earlier and sought out new voices in the field.

Limitations of Current Energy Metrics

Experts caution that the recently published numbers are incomplete and narrowly focused. For instance, OpenAI’s figure was shared in a blog post lacking technical detail, leaving uncertainties about which specific model was measured, the methodology used, and variability across different queries. Google’s reported median energy use per query does not account for more complex or lengthy responses that demand greater computational power.

Moreover, these statistics pertain solely to chatbot interactions, ignoring other growing AI applications such as image and video generation. Sasha Luccioni, AI and climate lead at Hugging Face, emphasizes the need for energy data across diverse AI modalities as their usage expands.

It's important to note that the energy consumed per individual query is small, comparable to running a microwave for a few seconds, meaning that individual AI use is unlikely to pose a significant climate threat on its own.
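As a rough sanity check on that comparison, here is a back-of-envelope sketch using OpenAI's reported 0.34 watt-hours per query and an assumed 1,000-watt microwave (the wattage is an illustrative assumption, not a figure from the reporting):

```python
# Back-of-envelope: how long a microwave could run on one chatbot query's energy.
QUERY_WH = 0.34          # OpenAI's reported average energy per ChatGPT query
MICROWAVE_WATTS = 1000   # assumed power draw of a typical household microwave

# watt-hours / watts = hours; multiply by 3600 to convert to seconds
seconds = QUERY_WH / MICROWAVE_WATTS * 3600
print(f"One query is roughly {seconds:.1f} seconds of microwave use")
```

At the company-reported figures this works out to a second or two per query, small enough that any single interaction barely registers against ordinary household usage.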

Comprehensive Energy Accounting: A Complex Necessity

To fully grasp AI’s environmental impact, it’s essential to move beyond per-query energy estimates and consider the broader context of AI deployment. This includes understanding how AI is integrated into various industries and applications. Ketan Joshi, a climate and energy analyst, points out that while such detailed data is rarely available in other sectors, the unprecedented growth rate of data centers justifies heightened scrutiny of AI’s energy consumption.

Reconciling AI’s Energy Growth with Sustainability Goals

Major technology companies investing billions in AI face the challenge of balancing soaring energy demands with their commitments to sustainability. Microsoft, for example, reported a 23% increase in emissions since 2020, largely attributed to AI operations, despite pledging to achieve carbon negativity by 2030. The company acknowledged that this goal represents a long-term endeavor rather than a quick fix.

Proponents argue that AI could eventually drive environmental benefits by optimizing energy systems or accelerating the discovery of sustainable materials. However, tangible evidence of AI delivering such efficiencies remains limited. While some initiatives have used AI to detect methane leaks, transparency is lacking regarding whether these gains offset the rising emissions from expanding AI infrastructure. Meanwhile, plans for additional data centers continue, signaling ongoing growth in AI’s energy footprint.

Will AI’s Energy Demand Sustain or Plateau?

A critical uncertainty is whether AI adoption will reach the levels anticipated by industry forecasts. OpenAI reports handling 2.5 billion prompts daily, and if this trend continues, AI’s electricity consumption could match over one-fifth of U.S. household usage by 2028, according to Lawrence Berkeley National Laboratory projections.
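To see how the per-query and aggregate numbers relate, a back-of-envelope calculation can scale OpenAI's 2.5 billion daily prompts by its own 0.34 watt-hour figure. This covers chat queries only, omitting training, image and video generation, and other workloads, so it is a lower bound on a sliver of the total:

```python
# Scale OpenAI's per-query figure up to its reported daily prompt volume.
PROMPTS_PER_DAY = 2.5e9  # OpenAI's reported daily ChatGPT prompts
WH_PER_PROMPT = 0.34     # OpenAI's reported average energy per query

daily_mwh = PROMPTS_PER_DAY * WH_PER_PROMPT / 1e6  # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000               # MWh/day -> GWh/year
print(f"~{daily_mwh:.0f} MWh per day, ~{annual_gwh:.0f} GWh per year")
```

The result is a large absolute number, yet only a small fraction of the data-center demand in the Lawrence Berkeley projections, which underscores the experts' point that chatbot-query figures alone cannot settle the aggregate question.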

However, recent developments suggest a potential slowdown. The lukewarm reception of OpenAI's GPT-5 launch and findings that 95% of corporate AI pilots deliver no measurable return have dampened enthusiasm. This raises concerns that the rapid buildout of AI-specific data centers may prove to be overinvestment, especially as AI companies struggle to generate consistent revenue.

Ultimately, the most significant unknown is not the energy cost of individual AI queries but whether demand will sustain the current growth trajectory or falter under inflated expectations. This outcome will determine if today’s AI infrastructure marks a permanent shift in energy consumption or a temporary spike.

