AI may soon surpass Bitcoin mining in energy consumption, according to a new analysis. The study concludes that artificial intelligence could use nearly half of the electricity consumed by data centers worldwide by the end of 2025. The estimates come from Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam's Institute for Environmental Studies, who has tracked cryptocurrencies' electricity consumption and environmental impacts in previous research as well as on his website Digiconomist. His latest commentary on AI's growing electricity demand was published last week in the journal Joule.
AI already accounts for up to a fifth of the electricity used by data centers, according to de Vries-Gao. It's difficult to pin down AI's energy consumption precisely because big tech companies don't release the underlying data, so de Vries-Gao had to base his estimates on the supply chain for the specialized computer chips used for AI. He and other researchers trying to understand AI's energy consumption have found that it is growing despite efficiency gains, and at a pace rapid enough to warrant closer scrutiny.
De Vries-Gao says he thought he might be able to retire this line of work when he saw alternative cryptocurrencies, such as Ethereum, moving to less energy-intensive technologies. Then ChatGPT happened, he tells The Verge. “I was like, oh boy, here we go.” Like crypto, AI is a fundamentally energy-intensive technology, especially in highly competitive markets.
He sees some key parallels between the two technologies. The first, he says, is a “bigger is better” mindset. “We see big tech [companies] constantly increasing the size of their models, trying to create the best model available, but at the same time, they are also increasing the resource demands for those models,” he explains. That race has led to an explosion of new data centers, particularly in the US, which has more of them than any other country. Energy companies are planning new nuclear reactors and gas-fired power plants to meet the growing electricity demand from AI. Sudden spikes in demand can strain power grids and derail efforts to switch away from dirty sources of energy, a problem also posed by new crypto mines, which are essentially data centers used to verify blockchain transactions.
De Vries-Gao’s previous work on crypto mining shows how difficult it can be to determine how much energy these technologies actually use and what their environmental impact is. Many of the major tech companies developing AI tools have set climate goals and disclose their greenhouse gas emissions in annual sustainability reports. Those reports are how we know that Google's and Microsoft's carbon emissions have risen in recent years as they've focused on AI. But companies don't usually break the data down enough to show how much is attributable to AI.
To figure this out, de Vries-Gao used a technique he calls “triangulation.” He drew on publicly available device details, analyst estimates, and companies' earnings calls to estimate how much AI hardware is being produced and how much electricity it will likely consume. Taiwan Semiconductor Manufacturing Company, which manufactures AI chips for Nvidia, AMD, and other semiconductor companies, saw its production of packaged chips for AI more than double between 2023 and 2024.
After calculating how much specialized AI hardware could be produced, de Vries-Gao compared that to information about how much electricity these devices consume. He found that they could have consumed as much electricity last year as his native Netherlands. He projects that by the end of 2025, AI's power demand could reach 23GW, close to the total power demand of the UK.
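The arithmetic behind this kind of estimate can be sketched in a few lines. The device counts, wattages, and utilization below are purely illustrative assumptions, not de Vries-Gao's actual inputs:

```python
# Illustrative "triangulation"-style arithmetic (hypothetical inputs,
# not de Vries-Gao's actual figures): multiply an assumed number of AI
# accelerators by an assumed per-device power draw and utilization to
# get average demand, then convert to annual energy.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_twh(device_count: int, watts_per_device: float, utilization: float) -> float:
    """Annual energy in terawatt-hours for a fleet of devices."""
    average_watts = device_count * watts_per_device * utilization
    return average_watts * HOURS_PER_YEAR / 1e12  # watt-hours -> TWh

# Made-up fleet for illustration: 2 million accelerators drawing 1,200 W
# each, running at 65 percent average utilization.
print(f"Fleet estimate: {annual_twh(2_000_000, 1_200, 0.65):.0f} TWh/year")

# The article's 23 GW projection for the end of 2025, if sustained for
# a full year:
print(f"23 GW sustained: {23e9 * HOURS_PER_YEAR / 1e12:.0f} TWh/year")
```

Framing the result in terawatt-hours per year is what lets an aggregate hardware estimate be compared against the annual electricity consumption of a whole country, as the commentary does.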
A separate report last week from consulting firm ICF predicted a 25 percent increase in US electricity demand by the end of the decade, thanks in part to AI and traditional data centers.
It's still difficult to make blanket predictions about AI's energy use and environmental impact, a point made clear in a well-reported article published last week in MIT Technology Review with support from the Tarbell Center for AI Journalism. To take one example, a person using AI to promote a fundraising event could generate twice as much carbon pollution if their queries are answered by data centers in West Virginia rather than California. Energy intensity and emissions depend on a variety of factors, including the types of queries made, the size of the models answering those queries, and the share of fossil fuels and renewables on the local power grid.
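The geography effect comes down to a simple relationship: emissions equal the energy a query consumes times the local grid's carbon intensity. A rough sketch, using made-up per-query energy and grid-intensity figures rather than anything from the MIT Technology Review analysis:

```python
# Illustrative only: the same AI query pollutes differently depending
# on where it runs. Emissions = energy consumed x the local grid's
# carbon intensity. All numbers below are assumed for the sketch, not
# official grid figures.

WH_PER_QUERY = 3.0  # assumed energy per query, in watt-hours

# Assumed carbon intensity of each grid, in grams of CO2 per kWh:
GRID_INTENSITY = {
    "West Virginia (coal-heavy)": 900,
    "California (more renewables)": 250,
}

for region, grams_per_kwh in GRID_INTENSITY.items():
    grams = WH_PER_QUERY / 1000 * grams_per_kwh  # Wh -> kWh, then g CO2
    print(f"{region}: {grams:.2f} g CO2 per query")
```

With these assumed intensities the same query emits several times more carbon in the coal-heavy region, which is the mechanism behind the fundraising-event example above.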
It's a mystery that could be solved by tech companies being more transparent about AI in their sustainability reporting. “The insane amount of steps you have to take to be able to put a number on this, this is really absurd,” de Vries-Gao says. “It shouldn’t be so ridiculously difficult. But unfortunately, it is.”
Looking ahead, there's even more uncertainty about whether energy efficiency gains will ultimately flatten out electricity consumption. DeepSeek made waves earlier this year by claiming that its AI model could use a fraction of the electricity of Meta's Llama 3.1 model, raising questions about whether tech firms really need to be such energy hogs to make advances in AI. The question now is whether they'll prioritize building more efficient models, abandoning the “bigger is better” approach of simply throwing more data and computing power at their AI ambitions.
When Ethereum switched to a much more energy-efficient strategy for validating transactions than Bitcoin mining, its electricity consumption dropped by 99.988 percent. Environmentalists have urged other blockchain networks to adopt the same strategy, but some, namely Bitcoin miners, have been reluctant to abandon the investments they've already made in their existing hardware, or have other ideological reasons for sticking with old practices.
Another risk is the Jevons paradox, in which efficiency gains drive more use of a technology, pushing total electricity consumption up rather than down. It'll be difficult to manage the problem without first measuring it.