Understanding the Energy Footprint of AI: A Closer Look at Gemini’s Consumption
Google recently revealed that an average query made through its Gemini AI app consumes roughly 0.24 watt-hours of electricity, comparable to the energy used by running a microwave for about one second. At first glance, this figure appears minimal, especially considering how often we run household appliances like microwaves for far longer every day.
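The microwave comparison is easy to verify with a quick calculation. This sketch assumes a ~900 W microwave (typical household units draw roughly 700 to 1200 W; the wattage here is an assumption, not a figure from Google):

```python
# Sanity check of the microwave comparison.
# Assumption: a ~900 W microwave (typical units range ~700-1200 W).
MICROWAVE_WATTS = 900  # assumed power draw
SECONDS_RUN = 1

joules = MICROWAVE_WATTS * SECONDS_RUN  # energy in joules (W * s)
watt_hours = joules / 3600              # 1 Wh = 3600 J

print(f"{watt_hours:.2f} Wh")  # ~0.25 Wh, close to the 0.24 Wh Gemini figure
```

At 900 W, one second of microwave operation is about 0.25 Wh, so the comparison holds for a typical unit.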
Why This Single Metric Doesn’t Tell the Whole Story
While Google’s disclosure is a welcome step toward transparency in AI energy usage, it’s important not to jump to the conclusion that AI’s environmental impact is negligible. Several factors complicate this seemingly small number.
1. Limited Scope: Text Queries Only
The reported energy consumption focuses exclusively on text-based queries. However, AI tasks involving image and video generation typically demand significantly more power. For instance, creating a single AI-generated video can consume several times the energy of a text query due to the complexity of processing and rendering visual data.
Google’s chief scientist, Jeff Dean, said that the company currently prioritizes text prompts, given their widespread daily use, and has not yet extended this analysis to multimedia queries. Yet, with the surge of AI-generated images and videos populating social media platforms, these forms of AI interaction represent a growing and energy-intensive segment that remains unaccounted for in the current figures.
Moreover, the 0.24 watt-hour figure is a median value, meaning many queries, especially longer and more complex prompts or those invoking advanced reasoning, likely consume more energy. The distribution and upper bounds of energy use per query remain unclear.
2. The Unknown Scale: Total Query Volume Remains a Mystery
Another critical missing piece is the total number of Gemini queries processed daily. Without this data, it’s impossible to estimate the overall energy footprint of the service. Despite persistent inquiries, Google has declined to disclose this information, citing concerns over revealing fluctuating and abstract metrics.
For context, OpenAI publicly shares that ChatGPT handles approximately 2.5 billion queries each day, with an average energy consumption of 0.34 watt-hours per query. This translates to annual energy use exceeding 300 gigawatt-hours, enough to power nearly 30,000 average American households. When viewed through this lens, the cumulative impact of AI queries is far from trivial.
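The annual figure follows directly from the per-query numbers. A minimal sketch of that arithmetic, assuming an average US household uses about 10,500 kWh per year (a commonly cited EIA ballpark; the household figure is an assumption, not from the article's sources):

```python
# Back-of-envelope check of the ChatGPT energy figures cited above.
QUERIES_PER_DAY = 2.5e9          # from OpenAI's public figure
WH_PER_QUERY = 0.34              # average Wh per query
HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed average US household usage

annual_wh = QUERIES_PER_DAY * WH_PER_QUERY * 365
annual_gwh = annual_wh / 1e9                       # 1 GWh = 1e9 Wh
households = annual_wh / (HOUSEHOLD_KWH_PER_YEAR * 1e3)  # kWh -> Wh

print(f"{annual_gwh:.0f} GWh/year, ~{households:,.0f} households")
# ~310 GWh/year, roughly 30,000 households
```

The result, about 310 GWh per year, is consistent with the "exceeding 300 gigawatt-hours" claim.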
3. AI’s Ubiquity Beyond Chatbots
AI technologies are increasingly embedded in everyday tools, often operating behind the scenes. From AI-generated summaries in search engines to automated drafting features in email and messaging apps, many users interact with AI without explicit awareness. Google’s Gemini energy estimate does not encompass these widespread applications, making it challenging for individuals to gauge their personal AI-related energy consumption accurately.
It’s important to emphasize that using AI tools for convenience or productivity should not be stigmatized. The broader conversation should focus on systemic energy demands rather than individual usage guilt.
The Bigger Picture: AI’s Growing Demand on Energy Infrastructure
While individual query energy costs may seem small, the aggregate effect is substantial. For example, Meta is projected to require over two gigawatts of power for a single data center in Louisiana within this decade. Google Cloud alone is investing upwards of $25 billion in AI infrastructure across the US East Coast’s PJM grid.
Projections indicate that by 2028, AI could be responsible for consuming 326 terawatt-hours of electricity in the United States, generating over 100 million metric tonnes of CO₂ emissions. These figures underscore the urgent need for comprehensive transparency and sustainable practices within the AI industry.
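The relationship between the two projected figures can be checked with a rough grid-intensity estimate. This sketch assumes a US grid average of about 0.35 kg CO₂ per kWh, a commonly cited ballpark; actual intensity varies significantly by region, year, and energy mix, and is an assumption here rather than a figure from the projection itself:

```python
# Rough consistency check on the 2028 projection cited above.
AI_TWH_2028 = 326          # projected US AI electricity use, TWh
KG_CO2_PER_KWH = 0.35      # assumed US grid average carbon intensity

kwh = AI_TWH_2028 * 1e9                      # 1 TWh = 1e9 kWh
tonnes_co2 = kwh * KG_CO2_PER_KWH / 1000     # kg -> metric tonnes
million_tonnes = tonnes_co2 / 1e6

print(f"~{million_tonnes:.0f} million metric tonnes CO2")
# ~114 million metric tonnes at the assumed intensity
```

At that assumed intensity, 326 TWh corresponds to roughly 114 million metric tonnes of CO₂, in line with the "over 100 million" figure.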
Moving Forward: The Need for Greater Transparency and Accountability
Google’s recent disclosure marks one of the most detailed insights into AI energy consumption to date, yet it highlights how much remains unknown. As AI continues to integrate deeper into our digital lives, stakeholders must demand more extensive reporting from all major AI developers to fully understand and mitigate the environmental impact.
Only by considering the full scope of AI’s energy use, from diverse query types to total usage volumes, can we develop effective strategies to balance technological advancement with ecological responsibility.

