
Generative AI: energy consumption soars

Anne-Laure Ligozat
Professor of Computer Science at ENSIIE and LISN
Alex De Vries
PhD Student at the School of Business and Economics at VU Amsterdam
Key takeaways
  • The energy consumption of artificial intelligence is skyrocketing with the craze for generative AI, although there is a lack of data provided by companies.
  • Interactions with AIs like ChatGPT could consume 10 times more electricity than a standard Google search, according to the International Energy Agency (IEA).
  • The increase in electricity consumption by data centres, cryptocurrencies and AI between 2022 and 2026 could be equivalent to the electricity consumption of Sweden or Germany.
  • AI’s carbon footprint is far from negligible, with scientists estimating that training the BLOOM AI model emits 10 times more greenhouse gases than a French person in a year.
  • It seems complex to reduce the energy consumption of AI, making it essential to promote moderation in the future.

Artificial intelligence (AI) has found its way into a wide range of sectors: medicine, digital services, buildings, mobility, etc. Defined as "a computer's ability to automate a task that would normally require human judgement1", artificial intelligence has a cost: its large-scale deployment is generating growing energy requirements. The IT tasks needed to implement AI require user terminals (computers, telephones, etc.) and, above all, data centres. There are currently more than 8,000 of these around the world, 33% of which are in the United States, 16% in Europe and almost 10% in China, according to the International Energy Agency2 (IEA). Data centres, cryptocurrencies and artificial intelligence together accounted for almost 2% of global electricity consumption in 2022, or 460 TWh. By comparison, French electricity consumption stood at 445 TWh in 20233.

AI electricity consumption: a lack of data?

How much of this electricity consumption is actually dedicated to AI? "We don't know exactly," replies Alex de Vries. "In an ideal case, we would use the data provided by the companies that use AI, in particular the GAFAMs, which are responsible for a large proportion of the demand." In 2022, Google provided information on the subject for the first time4: "The percentage [of energy used] for machine learning has held steady over the past three years, representing less than 15% of Google's total energy consumption." However, in its latest environmental report5, the company provides no precise data on artificial intelligence. Only the total electricity consumption of its data centres is given: 24 TWh in 2023 (compared with 18.3 TWh in 2021).

In the absence of data provided by companies, the scientific community has been trying to estimate the electricity consumption of AI for several years. In 2019, an initial article6 threw a spanner in the works: "The development and training of new AI models are costly, both financially […] and environmentally, due to the carbon footprint associated with powering the equipment." The team estimated that the carbon footprint of the total training of BERT, a language model developed by Google, for a given task is roughly equivalent to that of a transatlantic flight. A few years later, Google scientists argued that these estimates overstate the real carbon footprint by a factor of 100 to 1,000. For his part, Alex de Vries has chosen to rely on sales of AI hardware7. NVIDIA dominates the AI server market, accounting for 95% of sales. Based on server sales and consumption figures, Alex de Vries projected electricity consumption of 5.7 to 8.9 TWh in 2023, a low figure compared with global data centre consumption (460 TWh).
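De Vries's hardware-based approach boils down to simple arithmetic: number of AI servers sold, times rated power, times hours of operation. The server count and utilisation below are illustrative assumptions chosen to land near the range he published, not figures taken from his paper:

```python
# Rough sketch of a hardware-based estimate in the style of de Vries (2023).
# The server count and utilisation are illustrative assumptions.
SERVER_POWER_KW = 6.5      # rated power of one NVIDIA DGX A100-class server
HOURS_PER_YEAR = 24 * 365  # assume the fleet runs continuously

def annual_twh(n_servers: int, utilisation: float = 1.0) -> float:
    """Annual electricity use in TWh for a fleet of AI servers."""
    kwh = n_servers * SERVER_POWER_KW * HOURS_PER_YEAR * utilisation
    return kwh / 1e9  # kWh -> TWh

# A fleet of ~100,000 such servers at full load lands near the low end
# of the 5.7-8.9 TWh range quoted in the article.
print(round(annual_twh(100_000), 1))  # -> 5.7
```

The interesting design choice is that this method needs no disclosure from AI companies at all: market-wide shipment data from a dominant supplier stands in for the missing consumption figures.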

The tasks examined in the study and the average amount of carbon emissions they produce (in g of CO2eq) per 1,000 queries. N.B. The y-axis is in logarithmic scale8.

The generative AI revolution

But these figures could skyrocket. Alex de Vries estimates that by 2027, if production capacity matches the companies' promises, NVIDIA servers dedicated to AI could consume 85 to 134 TWh of electricity every year. The cause: the surge in the use of generative AI. ChatGPT, Bing Chat, Dall-E, etc. These types of artificial intelligence, which generate text, images or even conversations, have spread across the sector at record speed. However, this type of AI requires a lot of computing resources and therefore consumes a lot of electricity. According to the IEA, interactions with AIs such as ChatGPT could consume 10 times more electricity than a standard Google search. If all Google searches – 9 billion every day – were based on ChatGPT, an additional 10 TWh of electricity would be consumed every year. Alex de Vries estimates the increase at 29.3 TWh per year, as much as Ireland's electricity consumption. "The steady rise in energy consumption, and therefore in the carbon footprint of artificial intelligence, is a well-known phenomenon," comments Anne-Laure Ligozat. "AI models are becoming increasingly complex: the more parameters they include, the longer the equipment runs. And as machines become more and more powerful, this leads to increasingly complex models…" For its part, the International Energy Agency estimates that in 2026, the increase in electricity consumption by data centres, cryptocurrencies and AI could amount to between 160 and 590 TWh compared with 2022. This is equivalent to the electricity consumption of Sweden (low estimate) or Germany (high estimate).
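The "additional 10 TWh" claim can be sanity-checked with back-of-the-envelope arithmetic. The per-query energies below are assumptions (roughly 0.3 Wh for a conventional search, ten times that for a ChatGPT-style interaction, per the IEA's "10 times more" ratio), not published measurements:

```python
# Back-of-the-envelope check of the IEA's "~10 TWh extra per year" figure.
# Per-query energies are assumptions: ~0.3 Wh for a standard search,
# ten times that for a ChatGPT-style interaction.
SEARCH_WH = 0.3
CHAT_WH = 10 * SEARCH_WH
QUERIES_PER_DAY = 9e9  # daily Google searches cited in the article

extra_wh_per_year = (CHAT_WH - SEARCH_WH) * QUERIES_PER_DAY * 365
extra_twh = extra_wh_per_year / 1e12  # Wh -> TWh
print(round(extra_twh, 1))  # -> 8.9, on the order of the quoted 10 TWh
```

The result is sensitive to the assumed per-query figures, which is precisely why published estimates for this scenario vary by a factor of two or three.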

Estimated electricity demand for traditional data centres, AI-dedicated data centres and cryptocurrencies, 2022 and 2026 (reference scenario)9. Note: electricity demand for data centres excludes consumption by data transmission networks.

The processing needs of AI can be explained by different phases. AI development involves an initial learning phase based on databases, known as the training phase. Once the model is ready, it can be used on new data: this is the inference phase10. The training phase has long been the focus of scientific attention, as it is the most energy-intensive. But new AI models have changed all that, as Alex de Vries explains: "With the massive adoption of AI models like ChatGPT, everything has been reversed and the inference phase has become predominant." Recent data provided by Meta and Google indicate that it accounts for 60–70% of energy consumption, compared with 20–40% for training11.
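The reversal described above follows from simple amortisation: training is a one-off cost, while inference cost grows with every query served. The training cost and per-query energy below are illustrative assumptions, chosen only to make the mechanism visible:

```python
# Why inference now dominates: a fixed one-off training cost is amortised
# across an ever-growing number of queries. All numbers are illustrative.
TRAINING_KWH = 1e6   # assumed one-off training electricity (1 GWh)
WH_PER_QUERY = 3.0   # assumed energy per inference query

def inference_share(total_queries: float) -> float:
    """Fraction of lifetime electricity spent on inference so far."""
    inference_kwh = total_queries * WH_PER_QUERY / 1000  # Wh -> kWh
    return inference_kwh / (inference_kwh + TRAINING_KWH)

# Early on, training dominates; after a billion queries, inference
# already accounts for ~75% of lifetime consumption under these numbers.
print(round(inference_share(1e7), 2))  # -> 0.03
print(round(inference_share(1e9), 2))  # -> 0.75
```

Under these assumptions a heavily used model crosses the 60–70% inference share reported by Meta and Google within its first billion queries, which a service at ChatGPT's scale reaches in a matter of months.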

Carbon neutrality: mission impossible for AI?

While AI's energy consumption is fraught with uncertainty, estimating its carbon footprint is a challenge for the scientific community. "We are able to assess the footprint linked to the dynamic consumption of training, and that linked to the manufacture of computer equipment, but it remains complicated to assess the total footprint linked to use. We don't know the precise number of uses, or the proportion of use dedicated to AI on the terminals used by users," stresses Anne-Laure Ligozat. "However, colleagues have just shown that the carbon footprint of user terminals is not negligible: it accounts for between 25% and 45% of the total carbon footprint of certain AI models." Anne-Laure Ligozat and her team estimate that training the BLOOM AI model – an open-access model – emits around 50 tonnes of greenhouse gases, or 10 times more than the annual emissions of a French person. This makes it difficult for the tech giants to achieve their carbon neutrality targets, despite the many offsetting measures they have taken. Google admits in its latest environmental report: "Our [2023] emissions […] have increased by 37% compared to 2022, despite considerable efforts and progress in renewable energy. This is due to the electricity consumption of our data centres, which exceeds our capacity to develop renewable energy projects."
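A footprint estimate of this kind multiplies the electricity consumed by the carbon intensity of the grid that supplied it, then adds the embodied emissions of the hardware. The energy figure, grid intensity and embodied share below are illustrative assumptions, chosen so the total lands near the ~50 tonnes quoted for BLOOM:

```python
# Sketch of an LCA-style footprint estimate for training a large model.
# Energy, grid intensity and the embodied-emissions share are illustrative
# assumptions, tuned to land near the ~50 tCO2eq quoted for BLOOM.
TRAINING_MWH = 433.0   # assumed dynamic electricity consumed by training
GRID_G_PER_KWH = 57.0  # assumed intensity of a low-carbon (French) grid

# Dynamic emissions: energy (kWh) x intensity (gCO2eq/kWh) -> tonnes.
dynamic_t = TRAINING_MWH * 1000 * GRID_G_PER_KWH / 1e6
# Assume hardware manufacture and idle consumption roughly double the total.
total_t = 2 * dynamic_t
print(round(dynamic_t, 1), round(total_t, 1))  # -> 24.7 49.4
```

Note how much the grid matters: the same training run on a grid at ~500 gCO2eq/kWh would emit roughly nine times more, which is why data-centre siting features so prominently in the mitigation arguments discussed below.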

Limiting global warming means drastically reducing global greenhouse gas emissions. Is AI at an impasse? "None of the arguments put forward by Google to reduce AI emissions hold water," deplores Anne-Laure Ligozat. "Improving equipment requires new equipment to be manufactured, which in turn emits greenhouse gases. Optimising infrastructures – such as water cooling for data centres – shifts the problem to water resources. And the relocation of data centres to countries with a low-carbon electricity mix means that we need to be able to manage the additional electricity demand…" As for the optimisation of models, while it does reduce their consumption, it also leads to increased use – the famous 'rebound' effect. "This tends to cancel out any potential energy savings," concludes Alex de Vries. "My main argument is that AI should be used sparingly."

Anaïs Marechal
1. Stuart J. Russell, Peter Norvig, and Ernest Davis. Artificial Intelligence: A Modern Approach. Prentice Hall Series in Artificial Intelligence. Prentice Hall, Upper Saddle River, New Jersey, third edition, 2010.
2. IEA (2024), Electricity 2024, IEA, Paris, https://www.iea.org/reports/electricity-2024, Licence: CC BY 4.0.
3. Website consulted on 26 September 2024: https://analysesetdonnees.rte-france.com/bilan-electrique-2023/consommation#Consommationcorrigee
4. D. Patterson et al., "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink," in Computer, vol. 55, no. 7, pp. 18–28, July 2022, doi: 10.1109/MC.2022.3148714.
5. Google, 2024, Environmental Report.
6. Strubell et al. (2019), Energy and Policy Considerations for Deep Learning in NLP, arXiv.
7. De Vries, The Growing Energy Footprint of Artificial Intelligence, Joule (2023), https://doi.org/10.1016/j.joule.2023.09.004
8. Source of first graph: ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT '24), June 3–6, 2024, Rio de Janeiro, Brazil.
9. Source of second graph: IEA forecasts based on data and projections from Data Centres and Data Transmission Networks; Joule (2023) – Alex de Vries, The Growing Energy Footprint of Artificial Intelligence; Crypto Carbon Ratings Institute, Indices; Ireland – Central Statistics Office, Data Centres Metered Electricity Consumption 2022; and Danish Energy Agency, Denmark's Energy and Climate Outlook 2018.
10. Adrien Berthelot, Eddy Caron, Mathilde Jay, Laurent Lefèvre, Estimating the Environmental Impact of Generative-AI Services Using an LCA-Based Methodology, Procedia CIRP, Volume 122, 2024, Pages 707–712, ISSN 2212-8271.
11. Website consulted on 25/09/2024: https://www.iea.org/energy-system/buildings/data-centres-and-data-transmission-networks
