Catégorie : IA

What we can say for sure about AI’s ever-increasing impacts

Follow our professional training on frugal AI to build more sustainable and environmentally friendly AI.

87%. That’s how much the emissions of NVIDIA (the world’s biggest manufacturer of GPUs for AI) increased in 2024. A fact they’re hiding well: this article by TruthDig is one of the only sources in the world to point it out.

This means NVIDIA became the world’s most valuable company (worth 5 trillion dollars) by answering soaring demand for AI… whilst nearly doubling its carbon footprint.

Not to mention water: Samsung’s next ‘mega-cluster’ of GPU fabs will consume half of Seoul’s water, says the TruthDig article.

Energy, and related emissions, are often cited as the key impacts of technology, as they are the easiest to understand.

However, they only represent 30% of total impacts, as GreenIT.fr’s latest AI report confirms. 

If you take AI’s environmental impacts as a whole – the 16 criteria known as ‘PEF’ (Product Environmental Footprint), which include not only GHGs but also water usage, air and soil pollution and more – these are set to grow sevenfold by the end of the decade.

This is precisely why I conceived the “Frugal AI” training day for GreenIT.fr: because the true impacts of generative AI (ChatGPT & co) are either massively unknown, or seriously misunderstood.

Largely because the companies behind the AI wave do all they can to downplay these impacts – for example, by saying that a single prompt uses tiny amounts of energy and water. Except ChatGPT receives 2.5bn prompts per day.
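To see why per-prompt figures mislead, just multiply them by volume. A minimal back-of-envelope sketch, assuming OpenAI’s own claimed figure of roughly 0.34 Wh per prompt (both inputs are round, contested numbers, not measurements):

```python
# Back-of-envelope: a "tiny" per-prompt energy figure times a huge
# prompt volume still yields a very large total.
# Assumptions: 0.34 Wh/prompt is OpenAI's own claimed estimate;
# 2.5 billion prompts/day is the figure cited above.
WH_PER_PROMPT = 0.34      # watt-hours per ChatGPT prompt (claimed)
PROMPTS_PER_DAY = 2.5e9   # prompts per day

daily_kwh = WH_PER_PROMPT * PROMPTS_PER_DAY / 1000   # Wh -> kWh
yearly_gwh = daily_kwh * 365 / 1e6                   # kWh -> GWh

print(f"{daily_kwh:,.0f} kWh/day")    # ~850,000 kWh/day
print(f"{yearly_gwh:,.0f} GWh/year")  # ~310 GWh/year
```

That is, even the most flattering per-prompt number compounds into hundreds of gigawatt-hours a year – which is exactly why intensity claims on their own tell you so little.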

So any “per user” or “intensity” claims are to be taken with as much circumspection as Ryanair’s spurious assertion a few years back that it was the world’s greenest airline because it had the lowest emissions per passenger (yes, they got fined for that; OpenAI or Google? Not yet…)

So, what do we know for sure?

Easiest answer: not much. Especially when Google pretends to be super transparent with an apparently very scientific white paper that essentially says no more than OpenAI CEO Sam Altman did: “it’s tiny! Move on…”

My tip is always to avoid getting bogged down in intensity stats like “one ChatGPT prompt = ten Google Searches”. Not only is that figure made up; it also depends on too many variables – e.g. where the compute was carried out – to be accurate.

Stick to the undeniable facts. Which are:

  • GPUs consume at least 4x more electricity than comparable CPUs, and potentially up to 6x more (source). And because the received wisdom says you can’t do generative AI without them (not true; CPUs can handle inference), and that the bigger the model, the more powerful it is (not true either), this factor is only going to get worse
  • Electricity consumed by data centres is going to – triple in the US by 2028; triple in Europe by 2030; quadruple in the UK by the same year; and nearly quadruple in France by 2035 – because of AI (source, incl. of above graph; source, and source)
  • Water consumed by data centres in the US could up to quadruple by 2028 (source) – and that doesn’t include the water needed to make the GPUs in the first place (cf. above)
  • E-waste – we’re looking at an extra 5 million tonnes because of AI, on top of the global 60 million. That’s the world’s fastest-growing type of waste; it’s only going to get worse as AI hardware ‘has’ to be changed every two years; and only 22% of e-waste is currently recycled (source and source)
  • Local impacts: Elon Musk was in such a rush to build the world’s biggest AI supercomputer (xAI’s Colossus, 100k GPUs) that, rather than wait the year or two it takes to get hooked up to the local grid, he shipped 33 methane generators (yes, methane) into his Memphis data centre. Result: a DC built in just months, as he proudly bragged. But, shucks, it also caused a 79% uptick in nitrogen oxide (NOx) pollution in its first year – in an underprivileged neighbourhood that already has cancer rates 4x higher than the rest of the US (source)
  • Fossil fuels: thought they were a thing of the past? Think again. See that massive uptick in the electricity graph above? What energy source could possibly fill that demand quickly enough? Not nuclear, as it takes at least ten years to build a reactor. Ergo: Meta’s next DC, its biggest ever (the size of Manhattan), will be powered by gas. And one third of the US’s coal power stations will stay open, ‘thanks’ to Trump (source). TL;DR: AI is slowing down the energy transition.
  • GAFAM, going the wrong way – since they started investing hundreds of billions in AI, Microsoft’s emissions have gone up 29%; Google’s by 48%; and Amazon’s by 6% last year (though its own staff claim the figure is closer to 35% over the past few years). This despite all of their pledges to be net zero, or carbon negative, by 2030 or 2040. Big tech has recently started walking back such promises – cf. Microsoft and Google – (not very) surprisingly.
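The e-waste figures above compound quickly. A rough sketch using only the numbers cited in the bullet (5 million tonnes extra from AI on top of the global 60 million, with a 22% recycling rate):

```python
# Rough e-waste arithmetic using only the figures cited above.
GLOBAL_EWASTE_MT = 60.0  # current global e-waste, million tonnes/year
AI_EXTRA_MT = 5.0        # additional e-waste attributed to AI
RECYCLED_SHARE = 0.22    # share of e-waste currently recycled

total_mt = GLOBAL_EWASTE_MT + AI_EXTRA_MT
unrecycled_mt = total_mt * (1 - RECYCLED_SHARE)

print(f"Total: {total_mt:.0f} Mt/year")          # 65 Mt/year
print(f"Unrecycled: {unrecycled_mt:.1f} Mt/year")  # ~50.7 Mt/year
```

In other words, at today’s recycling rate, over 50 million tonnes a year would go straight to landfill or informal processing.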

Why does it matter?

Of course, the fact that ChatGPT consumes such massive resources is not its users’ fault. My objective is not to guilt-trip users for using it, even if far more frugal AI alternatives exist (more on those soon). But it’s important to understand where all of the above could be taking us.

As The Shift Project rightly points out, with 20% of the industrial sector’s energy consumption potentially soon devoted to data centres because of AI, difficult choices will have to be made by both the private and public sectors. Wouldn’t that energy be put to better use powering electric cars? This was pretty much the conclusion reached by authorities in Ireland, where DC buildouts got so out of control that they were consuming 20% of the Dublin region’s electricity. Their response? No new DC buildouts from 2022 to 2028.

Similar decisions will have to be made with water; a considerably more limited resource than electricity (i.e. it’s much harder to generate more of it). Think of that couple in the US whose taps dried up when Meta moved its DC in next door; or of those farmers in Taiwan, told not to water their crops three years running so that chip giant TSMC could continue supplying NVIDIA and co with GPUs. Or when Google had to backtrack on building a water-cooled DC in Uruguay when the country was so hit by drought, its people were drinking saltwater…

Other impacts on local people include electricity prices: in the US state of Maine, for example, they rose 36% in May 2025. Why? Because the ageing grid had to be upgraded to meet AI’s considerable thirst for energy.

In short, expect more and more of the above as the AI data centre buildout boom continues (100 more are expected in the UK alone by 2030, for example – are there enough resources to support them? Who knows…)

It’s already starting to emerge that GPUs are sitting idle because there’s not enough power to run them. Microsoft boss Satya Nadella practically admitted this the other day. And Bloomberg recently reported (via Tom’s Hardware) that data centre company Digital Realty currently has two DCs in California, representing a combined 100MW in capacity, sitting idle because they’re waiting for an electricity supply. The locality simply can’t provide the massive amounts of energy they need. This situation could persist for years, per the report…

So is the energy starting to run out already? Have big tech already overprovisioned that much?

I could go on. But instead I’ll encourage you to:

  • Hold on for the next blogpost in this series
  • Sign up for a future edition of my “Frugal AI” training, here; it’s in French for now, but English versions are also possible!

James