Artificial intelligence: an environmental disaster

Can we continue to increase the performance of artificial intelligence without threatening our planet? In recent weeks, warning signs have multiplied. OpenAI, the creator of ChatGPT, recently warned the White House that each of its future data centers would likely require “more energy than is used to power an entire city.”

The same month, energy company Constellation Energy and Microsoft signed a 20-year agreement for the reopening of the Three Mile Island nuclear power plant in Pennsylvania, site of the most serious radiological accident the United States has ever experienced. For its part, Google admitted that it was struggling to achieve its carbon neutrality goals.


The trend is becoming widespread in the United States, where most of the major generative AI models are developed. So much so that a BloombergNEF report warns that the country's progress on decarbonization is slowing: emissions are projected to fall by only 34% by 2030 relative to 2005 levels, far short of the 50 to 52% target. The culprit, according to the report: AI's voracious appetite for energy.

Democratization at a forced march

“All the web giants are investing in infrastructure, prioritizing model performance and neglecting energy costs,” summarizes Tristan Nitot, associate director at OCTO Technology and a specialist in sustainable digital technology. A few weeks ago, Eric Schmidt, former Google CEO and a major investor in AI, brushed the subject aside at a conference: “We will not achieve the climate goals anyway, because we are not organized to do so. AI's energy requirements will be problematic, but I'd rather bet on AI solving the environmental crisis than constrain it and have the problem anyway.”

But is this techno-solutionist bet tenable given the rapid democratization of artificial intelligence, and therefore its growing energy cost? Generative AI is now being built into very mainstream products. Think of the new version of iOS, equipped with “Apple Intelligence,” which summarizes your notifications and messages in a sentence, or of Google's AI Overviews, which offers Internet users in around a hundred countries an AI-generated summary of their search results. The everyday use of ChatGPT and its peers now raises questions: according to the International Energy Agency (IEA), a query to OpenAI's chatbot consumes almost ten times as much electricity as a simple Google search.

Added to this is the better-known impact of training, the stage where AI models “read” a vast amount of data and then practice answering queries. This work requires days, or even weeks, of computation and a colossal mass of data. To make matters worse, the necessary infrastructure is mostly located in countries where electricity remains carbon-intensive, such as the United States, Ireland and Germany.


But as Tristan Nitot points out, “it’s not just a question of electricity.” Water consumption is also a central issue. To cool their overheating data centers, the tech giants rely on cooling towers that guzzle water: Microsoft's consumption has risen by a third since 2021, and Google's by 21%. “We also need to think about the entire life cycle of AI,” he adds. “It starts with the extraction of metals, then their transport to the factories, in Taiwan in particular, that manufacture the processors AI development requires.” And demand for those processors is exploding, driven by the replacement of computer hardware to make it compatible with AI features.

Faced with this damning assessment, the tech industry is not standing idle. Nvidia prides itself on constantly improving the energy efficiency of its GPUs, the chips essential for training the best AI models. Data centers are adopting new “direct cooling” technologies, which circulate a room-temperature liquid directly through the servers, a technique that consumes less water and electricity.

The fear of the rebound effect

“We are seeing impressive improvements, but we have never consumed so much coal in human history. I don't see why AI would escape the Jevons paradox,” counters Gaël Varoquaux, a researcher at Inria. This paradox, also called the “rebound effect,” holds that as technological improvements make the use of a resource more efficient, total consumption of that resource tends to rise rather than fall: more efficient steam engines in the nineteenth century made coal cheaper to use, and coal consumption soared.

On the software side, progress is also visible. The trend is toward small models that are just as effective for certain use cases but require up to ten times less computing power. These small AIs have the added advantage of running on ordinary consumer hardware, such as a smartphone or a laptop.

Beyond these technical advances, it will be necessary to “discipline ourselves,” in the words of Gilles Babinet, president of the National Digital Council. “We must stop constantly updating models, because the energy cost of training remains the major problem, favor more specialized AI, and weed out certain pointless or unhelpful uses,” he says.


But while the digital giants want to sell us generative AI everywhere, these less useful uses are already part of some people's daily lives. Pierre-Yves Oudeyer, a research director at Inria who studies how young people use generative AI, observes that the technology sometimes serves them as a “universal interface”: they ask it for information, such as the result of a basketball game or the capital of a country, that a simple Google search would easily have provided. “It’s like using a bulldozer to squash a fly,” he summarizes. For him, educating users about the environmental stakes is essential to rationalizing usage. After flight shame, will we feel guilty about every ChatGPT request?
