This report tried to estimate AI energy usage, which is a fiendishly difficult task.

One person with a serious AI habit may consume enough electricity every day to run a microwave for more than three full hours. The actual cost may be even higher, since so many companies keep the details of their AI models secret. That didn’t deter MIT Technology Review, which spoke to two dozen researchers who track AI energy consumption and ran experiments of its own.

Working with researchers from Hugging Face, the authors of the report determined that generating a single response with the open-source Llama 3.1 8B model consumed around 57 joules. (The 8B indicates that the model has 8 billion parameters.) The report doubled this figure to account for cooling and other overheads, bringing a single query for that model to 114 joules – equivalent to running a microwave for about a tenth of a second. A larger model like Llama 3.1 405B requires around 6,706 joules per response after the same doubling – equivalent to roughly eight seconds of microwave use.
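To sanity-check those microwave comparisons: the conversion is just energy divided by power (joules / watts = seconds). A minimal sketch, assuming a typical ~800 W household microwave – the report doesn’t specify a wattage:

```python
# Energy (joules) divided by power (watts) gives runtime in seconds.
# The 800 W rating is an assumption for a typical household microwave;
# the report doesn't specify one.
MICROWAVE_WATTS = 800

def microwave_seconds(joules: float) -> float:
    return joules / MICROWAVE_WATTS

print(microwave_seconds(114))    # ≈ 0.14 s – a tenth of a second or so
print(microwave_seconds(6_706))  # ≈ 8.4 s – about eight seconds
```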

Model size is a major factor in how much energy a model uses. OpenAI’s GPT-4, whose true size is unknown, is estimated to have over a trillion parameters, which means its energy footprint per query is likely much higher than that of the Llama models tested.

It is also worth noting that these figures are for text-based responses. MIT TR reported that AI-generated images tend to use less energy than text responses, because image models are smaller and diffusion-based generation can be cheaper than token-by-token text inference.

AI video generation is a huge energy drain.

To generate a five-second video at 16 frames per second, the CogVideoX AI model consumes a whopping amount of energy – equivalent to running a microwave for about an hour, or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review. The report noted:

“It’s fair to say that the leading AI video generators, creating dazzling and hyperrealistic videos up to 30 seconds long, will use significantly more energy.”

The authors used this data to estimate the daily AI energy usage of someone who leans heavily on generative models. Fifteen questions, ten attempts to generate an image, and three attempts at an Instagram-ready five-second video would consume an estimated 2.9 kWh – roughly three and a quarter hours of microwave usage. And OpenAI estimates that hundreds of millions of people use ChatGPT every week.
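As a rough reconstruction of that scenario’s arithmetic: the text figure below is the doubled Llama 3.1 405B estimate from above, while the per-image and per-video figures are illustrative assumptions, not numbers quoted in this article.

```python
# Rough reconstruction of the heavy-user scenario's arithmetic.
# The text figure is the doubled Llama 3.1 405B estimate above; the
# image and video figures are illustrative assumptions.
JOULES_PER_TEXT_QUERY = 6_706      # Llama 3.1 405B, cooling included
JOULES_PER_IMAGE = 2_300           # assumed per image attempt
JOULES_PER_5S_VIDEO = 3_400_000    # assumed; roughly an hour of microwave time

daily_joules = (15 * JOULES_PER_TEXT_QUERY   # 15 questions
                + 10 * JOULES_PER_IMAGE      # 10 image attempts
                + 3 * JOULES_PER_5S_VIDEO)   # 3 five-second videos

daily_kwh = daily_joules / 3.6e6   # 1 kWh = 3.6 million joules
print(f"≈ {daily_kwh:.1f} kWh per day")  # ≈ 2.9 kWh, dominated by video
```

Under these assumptions, video generation accounts for nearly all of the daily total – the 15 text queries and 10 images barely register.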

The researchers focused on open-source LLMs, about which a great deal is known. OpenAI and Google, by contrast, keep the size and scope of their models from the public, which makes it difficult to estimate their energy usage accurately.

The Tech Review article points out that the picture becomes even more complex when it comes to measuring AI’s CO2 emissions. The mix of renewable and non-renewable energy varies greatly by location and time of day (solar power, for instance, is unavailable at night).
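The underlying math is simple: emissions are energy consumed multiplied by the grid’s carbon intensity at that moment. A minimal sketch, using made-up illustrative intensities rather than measured ones:

```python
# Emissions = energy consumed × grid carbon intensity at that moment.
# The intensities below are illustrative assumptions, not measurements;
# real values vary by country, region, and hour.
GRID_G_CO2_PER_KWH = {
    "coal-heavy grid, night": 900,  # assumed
    "mixed grid, average":    400,  # assumed
    "sunny grid, midday":     100,  # assumed: solar displacing fossil fuel
}

DAILY_KWH = 2.9  # the report's heavy-user scenario

for grid, intensity in GRID_G_CO2_PER_KWH.items():
    kg_co2 = DAILY_KWH * intensity / 1000
    print(f"{grid}: {kg_co2:.2f} kg CO2 per day")
```

The same 2.9 kWh can thus carry nearly an order of magnitude more or less carbon depending on where and when it is consumed.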

This report didn’t mention prompt caching, a technique used by generative model providers that stores responses and serves them back to users who ask the same or a similar question. This can reduce energy consumption.
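For the curious, here is a minimal sketch of the idea, assuming a simple exact-match cache keyed on a hash of the prompt; real systems use more sophisticated semantic or prefix matching, and `run_model` is a hypothetical stand-in for actual inference.

```python
import hashlib

# Minimal sketch of prompt caching: key responses on a hash of the
# normalized prompt and replay them on repeat questions.
_cache: dict[str, str] = {}

def run_model(prompt: str) -> str:
    # Hypothetical stand-in for an expensive inference call.
    return f"(model output for: {prompt})"

def answer(prompt: str) -> str:
    key = hashlib.sha256(prompt.strip().lower().encode()).hexdigest()
    if key in _cache:
        return _cache[key]        # cache hit: near-zero marginal energy
    response = run_model(prompt)  # cache miss: pay the full inference cost
    _cache[key] = response
    return response
```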

Dirty acts, not dirt cheap.

Set these caveats aside and one thing is certain: a lot of energy is consumed to power our growing AI habit. Not only that, a large portion of it pumps carbon into the air for what could be argued to be questionable utility.

  • Datacenter emissions are a dirty little secret
  • Energy companies have been told to charge up for the AI datacenter surge
  • Is a dot-com-era crash on the cards for AI spending? It’s a risk
  • OK, great. The UK is building a lot of AI datacenters. How will we power them?

The Tech Review report noted that the current spike in datacenter energy consumption follows years of relatively flat usage, the result of steady workloads and ever-increasing efficiency. By 2023, datacenters consumed more than a quarter of Ireland’s electricity. Datacenters are predicted to double their energy consumption from current levels by 2030, surpassing the consumption of the entire nation of Japan by the beginning of the next decade. AI is, of course, the biggest driver of this increase.

Over the years, tech companies have made a lot of noise about going green, assuring the public that their bit barns pose no environmental threat. Now that AI is in the picture, the net-zero goals of technology giants like Microsoft and Google are receding.

The Register has covered this topic extensively in recent months, and our reporting is largely in line with the MIT Tech Review’s conclusion: the energy behind AI is much dirtier than tech firms would have you believe.

Datacenters are on track to emit 2.5 billion tons of greenhouse gases by the end of the decade – three times more than they would have if generative AI had not become the latest craze.

And to add insult to injury, these numbers are based on shaky foundations, as the Tech Review report noted:

“This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI’s future impact on energy grids and emissions.” ®
