Artificial Intelligence and energy consumption

Maria Demertzis*

 

On 8 December, the EU finally concluded talks on the Artificial Intelligence Act, which aims to govern the big artificial intelligence (AI) models that may pose systemic risks to our societies and to protect fundamental rights from their impact. For all its good intentions, the Act has given little attention to one of AI’s unintended consequences: the amount of energy that AI models consume in order to learn.

One estimate, published recently in an academic paper, is that by 2027 AI servers could consume as much electricity in a year as Argentina does. This is equivalent to about 0.5% of the world’s electricity use. If Google switched its entire search business to AI, the paper argues, its energy consumption would almost double, to a level equivalent to what Ireland consumes in a year.
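As a rough sanity check of those orders of magnitude, the minimal back-of-envelope sketch below runs the arithmetic in Python; the consumption figures are approximate public values assumed for illustration, not numbers taken from the cited paper.

```python
# Back-of-envelope arithmetic behind the comparison above.
# All figures are rough, illustrative assumptions (TWh per year),
# not data from the cited paper.

argentina_twh_per_year = 125.0   # approx. annual electricity consumption of Argentina
ireland_twh_per_year = 30.0      # approx. annual electricity consumption of Ireland
world_twh_per_year = 27_000.0    # approx. annual global electricity consumption

# If projected AI server demand were comparable to Argentina's consumption...
ai_servers_twh_per_year = argentina_twh_per_year

share_of_world = ai_servers_twh_per_year / world_twh_per_year
print(f"AI servers as a share of world electricity use: {share_of_world:.1%}")
# Prints roughly 0.5%, consistent with the estimate quoted in the text.
```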

Energy consumption is therefore a strange omission from the AI Act, given how centre-stage this discussion has been for another digital innovation: decentralised-ledger cryptocurrencies such as Bitcoin. Critics are keen to point out how energy-inefficient that technology can be: the electricity that Bitcoin mining uses globally in any given year exceeds Argentina’s annual consumption.

AI’s energy consumption is bound to grow. When OpenAI first introduced ChatGPT, it reached 100 million users in just two months. By December 2023, the platform had more than 180 million users. And this is just one application. Considering the myriad applications beyond large language models for which AI can and will be used, the potential is endless. But the algorithms need to be trained on millions of chips, all of which consume energy.

Two things are certain to happen. The use of AI technology will continue to increase. But so will technological innovation that delivers energy efficiency. The difficulty lies in evaluating how quickly the latter will compensate for the former.

In the meantime, however, electricity consumption at this scale still relies heavily on fossil fuels globally, and until recently it was paid for at exceptionally high prices. This raises additional ethical concerns, beyond the ones we usually associate with the use of AI.

For example, ChatGPT’s surge in traffic came at the end of 2022, in the middle of a global cost-of-living crisis that was a direct result of soaring energy prices. Some countries could not afford energy at the prices charged, leading to outages. Meanwhile, in the most privileged parts of the world, programmers were perfecting their code, students were writing the perfect essay and speechwriters were searching for outlines for their speeches, all using ChatGPT. The ethical use of electricity will be one of the many controversies for which 2022 will be remembered.

If there is a cost to be borne by all of us globally, how do we ensure that the benefits of such innovation are distributed just as broadly?

No one can put a cap on innovation, or its uses. However, it sits uneasily that such an energy-intensive innovation is expanding just as the objective is to save energy and shift to renewables. Price has proven an insufficient instrument for ensuring the equitable allocation of energy, particularly when the system is under duress. If these questions remain unanswered, AI will prove yet another hurdle in discussions about a fair and equitable energy transition.

 

*Maria Demertzis is a Senior Fellow at Bruegel and part-time Professor of Economic Policy at the School of Transnational Governance of the European University Institute in Florence. The article was originally posted by Bruegel. It is also posted on the blog of the Cyprus Economic Society.
