A UNESCO study presented at the AI Global Summit highlights the rapid rise in the energy consumption of artificial intelligence, which is currently doubling every 100 days. This exponential growth puts significant strain on global energy systems, water resources, and critical minerals, raising environmental and equity concerns.
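To put that rate in perspective, a doubling every 100 days compounds to roughly a twelvefold increase over a year. The short calculation below is only an annualised conversion of the study's 100-day figure, not a number taken from the report itself.

```python
# Rough annualised growth implied by a 100-day doubling period (the study's figure).
doubling_period_days = 100
annual_growth_factor = 2 ** (365 / doubling_period_days)
print(f"Implied growth in AI energy demand: ~{annual_growth_factor:.1f}x per year")  # ~12.6x
```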
The study proposes several measures to curb this energy consumption, including relying on smaller models tailored to specific tasks instead of general-purpose ones, keeping prompts and responses shorter, and compressing models through techniques such as quantization.
Implementing these changes could reduce AI energy consumption by up to 90% without compromising performance. The study points out that current large language models (like ChatGPT) are general-purpose and process vast amounts of information, which drives much of their high energy usage.
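Model compression is the most concrete of these measures at the code level. The sketch below illustrates the basic idea behind 8-bit quantization on a stand-in weight matrix; the matrix shape and the symmetric per-tensor scheme are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Hypothetical fp32 weight matrix standing in for one layer of a model.
rng = np.random.default_rng(0)
weights_fp32 = rng.standard_normal((3072, 768)).astype(np.float32)

# Symmetric 8-bit quantization: map floats to int8 with a single per-tensor scale.
scale = np.abs(weights_fp32).max() / 127.0
weights_int8 = np.clip(np.round(weights_fp32 / scale), -127, 127).astype(np.int8)

# Dequantize to check the approximation error introduced by compression.
weights_dequant = weights_int8.astype(np.float32) * scale
max_err = np.abs(weights_fp32 - weights_dequant).max()

print(f"fp32 size: {weights_fp32.nbytes / 1e6:.1f} MB")   # ~9.4 MB
print(f"int8 size: {weights_int8.nbytes / 1e6:.1f} MB")   # ~2.4 MB
print(f"max reconstruction error: {max_err:.4f}")
```

Production frameworks refine the same float-to-integer mapping (per-channel scales, calibration data), but the storage and compute savings come from this basic substitution of 8-bit integers for 32-bit floats.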
OpenAI CEO Sam Altman has said that each ChatGPT request consumes roughly 0.34 Wh of electricity, 10 to 70 times more than a Google search. At a billion requests per day, this adds up to the annual electricity consumption of three million Ethiopians.
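A back-of-envelope calculation gives a sense of the absolute scale; it simply takes the quoted 0.34 Wh per request and one billion daily requests at face value, without attempting any per-capita comparison.

```python
# Scale check using only the figures quoted above, taken at face value.
wh_per_request = 0.34      # Altman's per-request estimate
requests_per_day = 1e9     # one billion requests a day

daily_mwh = wh_per_request * requests_per_day / 1e6   # ≈ 340 MWh per day
annual_gwh = daily_mwh * 365 / 1e3                    # ≈ 124 GWh per year

print(f"Daily:  ~{daily_mwh:.0f} MWh")
print(f"Annual: ~{annual_gwh:.0f} GWh")
```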