(Image credit: Shutterstock/Sashkin)
Headlines often highlight AI’s massive energy consumption, likening it to that of a small country, raising concerns about strain on power grids and excessive water usage for cooling.
But what is the actual environmental cost when AI performs a simple task, such as writing out instructions for preparing instant noodles, instead of you simply reading the packaging?
Insights from industry leaders, such as Sam Altman’s blog posts, offer intriguing but sometimes unverified perspectives on this issue.
Recently, Google broke new ground by releasing data on the average energy consumption of a Gemini AI text prompt. While this data focuses solely on text generation and excludes video or image outputs, it provides a valuable benchmark for understanding AI’s power demands.
The key figure to note is 0.24 watt-hours (Wh) per prompt, which helps contextualize AI’s energy use against everyday electricity consumption.
So, is AI’s electricity appetite truly devastating the environment, or do other common activities, like your nightly Netflix session, pose a bigger threat?
Understanding the Energy Footprint of a Single AI Prompt
To grasp the scale, consider how much power a single AI prompt consumes compared to other household devices. The 0.24 Wh figure represents only the energy used within data centers, excluding the electricity consumed by your personal device.
One AI prompt uses roughly 1.5% of the energy required to fully charge an iPhone 17, or less than the energy consumed by 10 seconds of streaming on a 55-inch television.
When streaming video at home, the vast majority of the electricity (about 99.97%) is consumed by the TV itself, with data centers accounting for a mere 0.03%. For laptop users, the device accounts for 99.6% of the power used, and for smartphone users, 98.4%.
This means that while data centers do consume energy, the bulk of electricity usage happens on the end-user’s device during activities like video streaming.
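The comparisons above can be sanity-checked with some back-of-the-envelope arithmetic. In this minimal Python sketch, the 16 Wh battery capacity and 86 W TV draw are assumed values inferred from the article's own ratios, not official device specifications:

```python
# Back-of-the-envelope check of the per-prompt comparisons above.
PROMPT_WH = 0.24          # Google's reported energy per Gemini text prompt (Wh)
IPHONE_BATTERY_WH = 16.0  # assumed usable iPhone 17 battery capacity (Wh)
TV_WATTS = 86.0           # assumed draw of a 55-inch TV while streaming (W)

# Fraction of a full phone charge that one prompt represents
phone_charge_fraction = PROMPT_WH / IPHONE_BATTERY_WH  # ~0.015, i.e. ~1.5%

# Seconds of TV streaming that use the same energy as one prompt
tv_seconds = PROMPT_WH / TV_WATTS * 3600               # ~10 seconds

print(f"One prompt = {phone_charge_fraction:.1%} of a full iPhone charge")
print(f"One prompt = {tv_seconds:.1f} s of TV streaming")
```

Run either assumed figure backwards from the 0.24 Wh benchmark and you land on the same comparisons the article quotes.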
Comparing AI Prompts to Other Data Center Activities
Within data centers, AI prompts demand more power than typical streaming operations. However, unless the data center is engaged in intensive computing tasks such as cloud gaming, its per-user power consumption remains relatively low.
For example, one AI prompt’s energy use equals about 3.3 seconds of cloud gaming.
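Working that comparison backwards gives the implied per-user power draw of a cloud-gaming session, a figure the article does not state directly; this sketch simply rearranges the two numbers it does give:

```python
# If one prompt (0.24 Wh) equals ~3.3 s of cloud gaming, the implied
# per-user data-center draw during cloud gaming is:
PROMPT_WH = 0.24
CLOUD_GAMING_SECONDS = 3.3

implied_watts = PROMPT_WH / (CLOUD_GAMING_SECONDS / 3600)  # ~262 W

print(f"Implied cloud-gaming draw: {implied_watts:.0f} W per user")
```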
Daily AI Usage and Its Environmental Impact
On average, ChatGPT users send between 10 and 20 prompts daily, consuming approximately 3.6 Wh of electricity, less than the energy wasted by the standby LEDs on household devices.
Heavy users, who might send 50 prompts per day, use about 0.15% of their total daily electricity, comparable to the energy a TV consumes while in standby mode.
While a single prompt’s energy use is minimal, the cumulative effect is significant. OpenAI processes over 2.5 billion prompts daily, with hundreds of millions of active users worldwide, turning small increments into substantial totals.
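The per-user and fleet-wide totals above follow from straightforward multiplication. Note one assumption baked into this sketch: it applies Google's Gemini per-prompt figure to OpenAI's prompt volume, since OpenAI has not published a comparable per-prompt number of its own.

```python
PROMPT_WH = 0.24  # Google's per-prompt figure, used here as a stand-in

# Per-user daily totals
typical_daily_wh = 15 * PROMPT_WH  # midpoint of 10-20 prompts: ~3.6 Wh
heavy_daily_wh = 50 * PROMPT_WH    # heavy user: 12 Wh

# Fleet-wide total, using OpenAI's 2.5 billion prompts/day
fleet_daily_mwh = 2.5e9 * PROMPT_WH / 1e6  # Wh -> MWh, ~600 MWh/day

print(f"Typical user: {typical_daily_wh:.1f} Wh/day")
print(f"Heavy user: {heavy_daily_wh:.1f} Wh/day")
print(f"Fleet-wide: {fleet_daily_mwh:.0f} MWh/day")
```

That fleet-wide figure of roughly 600 MWh per day is how "minimal per prompt" becomes "substantial in aggregate."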
Beyond Electricity: Water Use and Carbon Emissions
The environmental footprint of AI extends beyond electricity. Google estimates that each AI prompt requires about 0.26 milliliters of water for cooling and generates roughly 0.03 grams of CO2 equivalent emissions.
To put this in perspective, the water used per prompt is akin to five drops of water, and the carbon emissions are comparable to the fizz released from a single soda bubble.
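Scaling the per-prompt water and carbon figures to a heavy user's day is equally simple. The 50-prompts-per-day figure is carried over from the usage section above; the tablespoon comparison is an illustrative approximation, not a figure from Google:

```python
WATER_ML_PER_PROMPT = 0.26  # Google's estimate (mL per prompt)
CO2_G_PER_PROMPT = 0.03     # Google's estimate (g CO2e per prompt)
DAILY_PROMPTS = 50          # heavy-user figure from earlier in the article

daily_water_ml = DAILY_PROMPTS * WATER_ML_PER_PROMPT  # ~13 mL, just under a tablespoon
daily_co2_g = DAILY_PROMPTS * CO2_G_PER_PROMPT        # ~1.5 g CO2e

print(f"Heavy user: {daily_water_ml:.1f} mL water, {daily_co2_g:.2f} g CO2e per day")
```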
Understanding these broader impacts is crucial as AI adoption continues to grow globally.
Stay tuned for the next installment in our series, where we will explore the global environmental implications of AI prompt usage in greater detail.
If you have questions or alternative calculations, feel free to share your thoughts in the comments!
For the Skeptics
- Google’s transparency on Gemini’s energy use: How accurate are the numbers?
- Is your ChatGPT activity driving up your electricity bill?
- Rising AI demand could push US electricity prices up by 18% in the near future
For the Enthusiasts
- New research shows data centers use less water than typical leisure facilities
- AI’s energy consumption is significant, but so is its potential to combat climate change
- Innovative fanless cooling tech cuts AI workload energy use by 90%
Our Approach to AI at TechRadar
At TechRadar, AI assists us with research, fact-checking, and language editing, but every article undergoes thorough human review before publication. Occasionally, we use AI for creative tasks, such as adding dinosaurs to colleagues’ photos. For more details, visit our Future and AI page.
