With the seemingly limitless demand for artificial intelligence, everyone in the energy, AI, and climate fields is rightfully concerned: Will there be enough electricity to power AI, and enough water to cool the data centers that support this technology? These are serious questions with implications for communities, economies, and the environment.
But the question of AI’s energy consumption points to a bigger issue: what we will need to do to address climate change over the next few decades. If we can’t work out how to handle this, we won’t be able to manage the broader electrification of the economy, and climate risks will grow.
IT innovation got us here. The graphics processing units (GPUs) that power AI computing have improved, and costs have dropped by 99% over the last six years. In the early 2010s there was similar concern about the energy consumption of data centers, with wild predictions of surging electricity demand. But gains in computing and energy efficiency proved those projections wrong and enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal growth in energy use.
By the late 2010s, however, the trends that had saved us began to break down. As AI models improved dramatically, the electricity required by data centers started to rise faster. Data centers now account for 4.4% of total US electricity demand, up from 1.9% in 2018. In six US states, data centers account for more than 10% of electricity consumption; in Virginia, which has become a data center hub, the figure exceeds 25%.
Projections of future energy demand are uncertain and vary widely, but a study by Lawrence Berkeley National Laboratory estimated that data centers could account for 6% to 14% of total US electricity consumption by 2028. This rapid rise in electricity demand will be felt by communities and companies, putting pressure on both energy prices and ecosystems. The projections have prompted calls to build new fossil-fuel plants or bring older ones back into service, and a surge of natural gas-fired plants is planned in many parts of the US.
This sounds daunting. When we zoom out, however, the projected electricity consumption from AI is still comparatively small. The US generated approximately 4,300 billion kilowatt-hours last year. We will likely need an additional 1,000 billion to 1,200 billion kilowatt-hours or more in the coming decade, roughly a 24% to 29% increase. Nearly half of that additional demand is expected to come from electrified vehicles, and another 30% from electrified technologies in buildings and industry. This technology shift will benefit the climate, communities, and energy costs.
AI and data centers are estimated to account for the remaining 22% of the additional electricity demand. It’s the smallest piece of the pie. But with their rapid growth and geographic concentration, data centers are the electrification challenge we face today. They’re the small stuff we need to figure out before we can take on the big stuff, such as vehicles and buildings.
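To make that breakdown concrete, here is a minimal back-of-the-envelope sketch in Python, assuming the “nearly half” share for vehicles is the 48% left over after the 30% and 22% shares cited above; the exact totals depend on which demand projection is used.

```python
# Back-of-the-envelope split of the projected additional US electricity demand,
# using only the figures cited above. Shares are assumptions inferred from the
# text: vehicles ~48% (the remainder), buildings and industry 30%, AI and data
# centers 22%.

added_low_twh, added_high_twh = 1000, 1200  # additional annual demand, billion kWh

shares = {
    "electrified vehicles": 0.48,
    "buildings and industry": 0.30,
    "AI and data centers": 0.22,
}

for sector, share in shares.items():
    low, high = share * added_low_twh, share * added_high_twh
    print(f"{sector}: roughly {low:.0f}-{high:.0f} billion kWh of new annual demand")
```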
It’s also important to understand AI’s potential benefits for energy consumption and carbon emissions. The impacts of producing semiconductors and powering AI data centers matter, but they are likely small compared with the positive or negative effects AI could have on applications such as the electricity grid, the transportation system, factories and buildings, and consumer behavior. Companies could use AI to develop new materials or batteries that better integrate renewable energy into the grid; AI could also be used to find more fossil fuels. Claims of potential climate benefits are exciting, but they will need to be continually verified and supported to be realized.
This is not the first time we’ve had to manage growth in electricity demand. In the 1960s, US electricity demand grew at more than 7% per year. In the 1970s, the growth rate was almost 5%, and in the 1980s and ’90s it was over 2%. After 2005, electricity demand was basically flat for a decade and a half. Most projections put expected growth in electricity demand over the next decade at around 2% per year, but we’ll need to do things differently.
To manage these new energy needs, we need a “Grid New Deal” that leverages public and private capital to rebuild the electric system for AI and provide it with enough intelligence and capacity for decarbonization. New clean energy sources, investment in transmission and distribution, and virtual demand-management strategies can reduce emissions, lower prices, and increase resilience. Data centers that provide clean electricity or upgrade distribution systems could be given priority to connect to the grid. Infrastructure banks could fund new transmission lines or upgrades to existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce training in the clean energy industry, and open data transparency from data center operators about their energy use, so that communities can understand and measure the impacts.
In 2022, the White House released a Blueprint for an AI Bill of Rights that laid out principles to protect the public’s rights, opportunities, and access to critical resources. We humbly propose a climate amendment to the AI Bill of Rights, because ethical AI must also be climate-safe AI. Such an amendment would be an important first step toward ensuring that AI growth benefits everyone: that it doesn’t raise consumers’ energy bills, that it adds more power to the grid than it consumes, that it increases investment in power-system infrastructure, and that it benefits communities while driving innovation.
By placing the conversation about AI in the context of what is needed to address climate change, we can achieve better outcomes for communities and ecosystems. The rise in electricity demand from AI and data centers will be a test of how society responds. If we get it wrong, our chances of meeting climate targets will be extremely low. When we say that data centers are small but significant for energy and climate, we mean it.

Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University. Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute at Carnegie Mellon University’s School of Computer Science.
Ramayya Krishnan is the dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.