Why artificial intelligence is an environmental disaster waiting to happen
With the rapid progression of artificial intelligence, its environmental impact is beginning to raise concerns among climate experts.
AI is sometimes imagined as an additional tool for mitigating human impacts on the planet: fighting pollution, increasing the energy efficiency of cities, preserving the biosphere, or promoting more responsible agriculture. Yet despite these potential benefits, there is no hiding from the fact that advanced language models like ChatGPT or Google's Bard require vast amounts of energy, water, and rare materials.
AI depends, among other things, on enormous data centers: expensive, highly energy-consuming infrastructure. These facilities require colossal amounts of electricity (often from non-renewable sources) as well as large volumes of water for cooling.
In 2019, a team of researchers from the University of Massachusetts published a striking figure: simply training an AI model emits as much CO2 as 205 round trips from Paris to New York by plane. In 2023, the University of Colorado investigated the case of ChatGPT and found that answering 25 requests consumes the equivalent of 1.5 liters of freshwater. Given that ChatGPT has around 200 million regular users, the scale of the problem becomes evident.
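To get a feel for that scale, here is a minimal back-of-envelope sketch using only the figures quoted above (1.5 liters per 25 requests, roughly 200 million regular users). The number of requests each user makes per day is not given in the studies, so the value below is purely a hypothetical assumption for illustration.

```python
# Back-of-envelope estimate of ChatGPT's daily freshwater footprint.
# Figures from the article: 1.5 L of water per 25 requests, ~200M users.

LITERS_PER_REQUEST = 1.5 / 25         # ~0.06 L per request (from the study)
REGULAR_USERS = 200_000_000           # approximate user base cited above
REQUESTS_PER_USER_PER_DAY = 5         # assumption, for illustration only

daily_liters = LITERS_PER_REQUEST * REGULAR_USERS * REQUESTS_PER_USER_PER_DAY
print(f"Estimated daily water use: {daily_liters / 1e6:.0f} million liters")
```

Even with a modest assumed usage rate, the total quickly runs into tens of millions of liters per day, which is why the per-request figure matters so much at this user base.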
Yet the most serious problem does not lie in these staggering numbers themselves. It is that these figures are only rough estimates: precisely assessing the ecological impact of AI is very complex. Stanford University, in June 2023, pinpointed the central problem: calculating the environmental footprint of AI is impossible due to the lack of data.
Nonetheless, a recent paper published in the journal Joule shows that, hypothetically, if every Google search used an AI similar to ChatGPT, each search would consume over ten times as much energy, and Google alone would use more electricity annually than the entire country of Ireland. The paper illustrates the possible impact AI could have on the environment as it becomes more commonplace.
The paper also examines a real-world example: Nvidia and the A100 HGX servers the company is currently working on. It estimates that 100,000 Nvidia AI servers would consume around 8 TWh (terawatt-hours) of electricity per year, compared with a typical annual data-center consumption of 205 TWh. So while the levels are high, these Nvidia AI servers still draw less than current data centers. But the AI industry is still in its infancy, and now seems an appropriate time to act before AI emissions grow exponentially.
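Putting the paper's two figures side by side makes the comparison concrete. This is a minimal sketch using only the numbers quoted above (8 TWh for 100,000 AI servers versus 205 TWh for typical annual data-center consumption); it computes what share of that data-center total the AI servers would represent.

```python
# Comparing the Joule paper's figures as quoted in the article:
# 100,000 Nvidia AI servers (~8 TWh/year) vs. typical annual
# data-center electricity consumption (~205 TWh/year).

NVIDIA_AI_SERVERS_TWH = 8      # estimated consumption of 100,000 AI servers
DATA_CENTERS_TWH = 205         # typical annual data-center consumption

share = NVIDIA_AI_SERVERS_TWH / DATA_CENTERS_TWH
print(f"AI servers would account for about {share:.0%} of that total")
```

A single-digit percentage today, in other words, but one tied to an industry whose hardware deployments are growing far faster than data centers as a whole.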