The tricky task of calculating AI’s energy use
Making models less thirsty may not lessen their environmental impact
April 9th 2025
A fifth of all electricity used in Ireland is spent powering the country’s data centres, more than is used by its urban homes. With one data centre for every 42,000-odd people, Ireland has one of the highest per-person concentrations of computing power in the world. Loudoun County, just outside Washington, DC, beats it: its 443,000 residents rub shoulders with scores of data centres—more than the next six biggest clusters in America combined. In 2022 their peak power demand was almost 3 gigawatts (GW), a draw that, if maintained year round, would approach Ireland’s total annual electricity consumption.
Around 1.5% of global electricity is spent on powering data centres. Most of that is for storing and processing data for everything from streaming video to financial transactions. But artificial intelligence (AI) will make up much of future data-centre demand. By 2038 Dominion, a power company, expects the data centres in Loudoun County alone to need more than 13GW. The International Energy Agency, a forecaster, estimates that global data-centre power demand could increase by between 128% and 203% by 2030, mostly because of AI-related energy consumption.
Big tech is confident that the environmental benefits justify the costs. “AI is going to be one of the main drivers of solutions to the climate situation,” says Demis Hassabis, the boss of Google DeepMind. Others disagree. This week’s special section explores the arguments in detail. It examines the ways in which AI can help clean up some of the most polluting industries, including energy production and heavy industry, and discusses the possibility of moving data centres off Earth altogether. It will also examine why AI’s energy footprint is so hard to quantify, and what its true environmental impact might be.
Tech firms are generally unwilling to share information about their AI models. One indirect way to estimate the environmental impact of building and deploying AI models, therefore, is to look at the firms’ self-reported carbon emissions. Google’s greenhouse-gas emissions rose by almost half between 2019 and 2023, according to the search giant, primarily because of increases in the energy consumption of data centres and supply-chain emissions. Microsoft’s emissions jumped by roughly a third in 2023, compared with three years earlier, partly due to its own focus on AI.
Another approach to estimating AI’s environmental footprint is to add up the energy use of the infrastructure used to build the models themselves. Meta’s Llama 3.1, a large language model (LLM), for example, was trained using chips from Nvidia, each of which can draw 700 watts of power, around half that of a fancy kettle, and those chips ran for a cumulative 39.3m chip-hours. The total energy used, 27.5 gigawatt-hours (GWh), is enough to supply 7,500 homes with a year’s worth of power.
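The arithmetic behind that estimate can be checked directly. A back-of-envelope sketch: the 700-watt and 39.3m-hour figures are from the text; the per-home figure is simply what the "7,500 homes" comparison implies, not an independent statistic.

```python
# Back-of-envelope check of the Llama 3.1 training-energy figures in the text.
POWER_PER_CHIP_W = 700      # peak draw of one Nvidia chip (from the text)
CHIP_HOURS = 39.3e6         # cumulative chip-hours of training (from the text)

energy_wh = POWER_PER_CHIP_W * CHIP_HOURS   # watt-hours
energy_gwh = energy_wh / 1e9
print(f"{energy_gwh:.1f} GWh")              # ≈ 27.5 GWh, matching the text

# Implied annual consumption per home, given the "7,500 homes" comparison
HOMES = 7_500
per_home_mwh = energy_wh / 1e6 / HOMES
print(f"{per_home_mwh:.1f} MWh per home per year")  # ≈ 3.7 MWh
```

The implied 3.7MWh per household per year is in the right ballpark for a European home, which suggests the comparison is internally consistent.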
Tech companies, perhaps unsurprisingly, are keen to argue that this energy bill is not nearly as outlandish as it might appear. The immediate climate impact of the final Llama 3.1 training run, Meta estimates, is emissions worth 11,390 tonnes of CO2—about the same as 60 fully loaded return flights between London and New York. Those are the emissions, at least, of the power grid that supplied the company’s data centre. But Meta argues that, since electrons are fungible, if enough renewable energy is bought on the opposite side of the country—or even at another time altogether—the true emissions fall to zero.
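The flight comparison can likewise be backed out from the article's own numbers. A sketch: only the 11,390-tonne and 60-flight figures come from the text; the per-flight result is merely what those two numbers imply.

```python
# Back out the implied emissions per return flight from the text's comparison.
TRAINING_EMISSIONS_T = 11_390  # tonnes of CO2, final training run (from the text)
FLIGHTS = 60                   # fully loaded London-New York returns (from the text)

per_flight_t = TRAINING_EMISSIONS_T / FLIGHTS
print(f"{per_flight_t:.0f} tonnes of CO2 per return flight")  # ≈ 190 tonnes
```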