All around us, digital technology is spreading into every other sector. Artificial intelligence (AI) is the latest link in this technological upheaval: it now underpins the automatic processing that exploits the deluge of digital data. But given the ecological challenges we face today, is it possible to design an AI that respects environmental constraints?
Before turning to frugal AI, it is important to set the scene. The unprecedented climate crisis we face began with the industrial revolution of the mid-19th century, which planted the seeds of our current consumer society. Climate change is not the only environmental threat: water stress, resource depletion, loss of biodiversity… But it is undoubtedly the most visible and best documented, and therefore the one that can help us understand the others.
A sector that is growing ever faster
The digital sector is not easy to grasp, because it is diffused throughout every other sector. According to ADEME, it accounted for 2.5% of France’s carbon emissions in 2022. In recent years, the field has experienced strong growth, and prospective studies mainly consider scenarios of continued growth, at least in the medium term. A small calculation based on public data for the IPCC SSP1-1.9 scenario, one of the most optimistic, highlights how aberrant this growth is. If the sector grows according to the lowest growth forecast, digital technology would emit 6 times more than that scenario’s target for declining global CO2 emissions by 2050! Even if the sector’s growth stagnated at today’s level, it would represent three quarters of total emissions… In such a world, what would be left for everything else?
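To make the shape of such a back-of-envelope calculation concrete, here is a minimal sketch in Python. All numerical values are illustrative placeholders, not the figures used in the calculation mentioned above; the point is only to show how compound sector growth can be set against a shrinking global emissions target.

```python
# Minimal sketch of the back-of-envelope reasoning described above.
# All values are illustrative placeholders, not the article's source data.

digital_share_today = 0.03   # assumed share of global CO2 emissions from digital today
annual_growth = 0.06         # assumed "lowest" yearly growth rate of the sector
years = 2050 - 2023          # horizon of the scenario
target_2050 = 0.10           # assumed global emissions in 2050, as a fraction of today's total

# Digital emissions in 2050, expressed as a fraction of today's total emissions
digital_2050 = digital_share_today * (1 + annual_growth) ** years

print(f"Digital sector in 2050: {digital_2050:.2%} of today's total emissions")
print(f"Ratio to the assumed 2050 target: {digital_2050 / target_2050:.1f}x")
```

Whatever placeholder values one plugs in, the structure of the comparison is the same: emissions that compound upward are measured against a budget that must fall sharply by 2050.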
If we focus on AI, we observe a clear break from 2012 onward. Growth in the sector then accelerated, with computing power needs doubling every 5-6 months instead of every 24 months, the figure that had until then held steady under Moore’s classic empirical law. This date corresponds to the rise of AI models based on deep learning, made possible by the use of graphics processing units (GPUs) to carry out the underlying computations and by the growth of open data on the Internet. Remember that AI is not reducible to learning with deep neural networks, but these are undoubtedly the most demanding approaches. A new threshold was crossed in 2023 with the explosion of generative models such as the ChatGPT conversational agent. Even if precise figures are hard to come by, given that the “tech giants” such as OpenAI, Meta or Microsoft behind the biggest models no longer communicate this data, deployment on such a scale is very worrying.
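As a rough illustration of what this change in doubling time implies, the short sketch below compares the two regimes over a five-year span (the 6-month and 24-month doubling times come from the paragraph above; the five-year horizon is chosen purely as an example):

```python
# Compare compute growth under a 24-month doubling time (classic Moore's law)
# and a ~6-month doubling time (the post-2012 regime described above).
months = 5 * 12  # five-year horizon, chosen purely for illustration

moore_factor = 2 ** (months / 24)   # doubling every 24 months
ai_factor = 2 ** (months / 6)       # doubling every ~6 months

print(f"24-month doubling over 5 years: ~{moore_factor:.1f}x more compute")
print(f"6-month doubling over 5 years:  ~{ai_factor:.0f}x more compute")
```

Over five years, the classic regime multiplies compute needs by a factor of about 6, while the deep-learning regime multiplies them roughly a thousandfold.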
The weight of generative AI on the climate
ChatGPT is based on the GPT-3 model, since replaced by an improved version, GPT-4. It is not the only model of its kind, but it is the most popular and one for which data are available. The model has 175 billion parameters, and its training in California emitted the equivalent of 552 tonnes of CO2. In terms of electricity consumption (a more objective indicator, in that it does not depend on the energy mix), the model ran for days on nearly 4,000 large Nvidia GPUs, whose consumption was estimated at 1,283 MWh (megawatt-hours, where 1 MWh = 1,000 kWh).
The usage phase consumes even more! Every day, its roughly ten million users consume 564 MWh of electricity. The recent announcements by the heads of OpenAI and Microsoft of orders for hundreds of thousands of GPUs to power future versions are dizzying in terms of consumption and environmental impact. With its current production capacity, the manufacturer Nvidia is far from being able to supply that many. And ChatGPT is only the visible part of this galaxy. Today, AI is driving the exponential growth of the digital sector, with an explosion in the number of applications and services that use generative AI. Developing AI at this pace is of course not sustainable as it stands.
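Putting the two estimates quoted above side by side gives a sense of the balance between training and use (a rough comparison only; both underlying estimates carry large uncertainties):

```python
# Compare the training and usage estimates quoted above for ChatGPT / GPT-3.
training_mwh = 1_283      # estimated electricity used for training (MWh)
daily_usage_mwh = 564     # estimated electricity used per day of operation (MWh)

days_to_match_training = training_mwh / daily_usage_mwh
yearly_usage_mwh = daily_usage_mwh * 365

print(f"Daily use exceeds the full training cost after ~{days_to_match_training:.1f} days")
print(f"One year of use: ~{yearly_usage_mwh:,} MWh, "
      f"i.e. about {yearly_usage_mwh / training_mwh:.0f}x the training energy")
```

On these figures, a little over two days of ordinary use already consumes as much electricity as the entire training run, and a year of use consumes on the order of 160 times more.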
How to think about a more frugal AI?
Such growth could only be sustained if AI enabled considerable emissions savings in all other sectors. This is the message carried by the majority voice: an AI that will help us out of the crisis. Despite too many useless or questionable applications, there are contributions that benefit society, particularly for simulating and analyzing complex physical phenomena, such as studying scenarios to counter the climate crisis. But the cure must not ultimately prove worse than the disease! For example, AI also allows fossil fuel companies to optimize their operations and therefore emit even more CO2.

Everywhere we hear talk of frugal AI, without the term ever being clearly defined. In everyday language, sobriety is often understood as the appropriate response to excessive alcohol consumption. In the context of AI, it refers rather to simplicity (which is clearly insufficient here), moderation, or even abstinence. Frugality and sobriety are often treated as synonyms; one can also consider that frugality concerns the functioning of technical systems, while sobriety refers to their use within social practices.
The two dimensions complement each other, in the sense that any technical system is aimed at uses that it thereby facilitates and encourages. The more the system lends itself to use, the more its impact grows: this is what we call the rebound effect. The most telling definition, however, is the one given in the negative: the opposite of frugality is what Le Robert calls gluttony. Frugality-sobriety can thus be seen as a virtue appraised negatively, by the quantity of resources that one does not consume.

Characterizing frugal AI nevertheless proves difficult, for several reasons. On the one hand, existing analyses often target model training and/or the usage phase, but ignore the complete life cycle of the service or product. This includes the production, use and storage of data, and the hardware infrastructure involved, from the manufacturing to the end of life of all the equipment concerned. On the other hand, for a service recognized as useful to society, one would need to estimate the volumes of data involved in the process and the indirect effects induced by its deployment. For example, an energy-optimization system for an apartment can lead to increased comfort or the deployment of new services thanks to the savings achieved.
AI on a diet, an insufficient approach
Today, the terms frugality and sobriety are often used as synonyms for energy efficiency: a solution is imagined and developed without taking its environmental cost into account, then improved on that front in a second step. We should instead question the effects upstream, before deploying the service, even if that means giving it up. Frugal AI is thus marked by an intrinsic contradiction, given the glut of energy and data currently required to train large models and to use them, regardless of the considerable risks for the environment. When it comes to AI, frugality must go much further than mere efficiency: it must first be compatible with planetary limits. It must also question uses upstream, to the point of abandoning certain services and practices, on the basis of complete and rigorous life cycle analyses.
At the very least, the purposes served by these technological developments should be collectively debated. Behind the argument of increased efficiency lies competition between national sovereignties and between firms chasing colossal profits. None of these goals should escape scrutiny in the light of an ethical approach. Evaluating algorithmic systems through contemporary environmental ethics even makes it possible to ground the notion of sobriety on a different footing. Indeed, and despite their variety, these ethics do not treat Nature (water, air, materials and living things) as a set of resources available solely to the human species, engaged as it is in technological competition and industrial hedonism. In conclusion, we could say that a prospect now opens up for responsible AI research, as formidable as it is difficult to achieve: proposing models and systems that are as compatible as possible with such a “strong” definition of sobriety.
The original version of this article was published on The Conversation