Energy demand for AI doubles every 100 days - Ansa.it


Key Findings

A UNESCO study presented at the AI Global Summit highlights the rapidly increasing energy consumption of artificial intelligence, which is doubling every 100 days. This exponential growth puts significant strain on global energy systems, water resources, and critical minerals, raising environmental and equity concerns.

Solutions

The study proposes several solutions to curb this energy consumption. These include:

  • Shortening user prompts to reduce computational demand.
  • Using smaller, more specialized AI models.
  • Cutting typical chatbot prompts from around 300 to 150 words.

Implementing these changes could reduce AI energy consumption by up to 90% without compromising performance. The study notes that current large language models (such as ChatGPT) are general-purpose and process vast amounts of information, which contributes to their high energy use.
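To see how these measures could combine toward the quoted 90% figure, here is a minimal back-of-envelope sketch in Python. It assumes, purely for illustration, that per-request energy scales roughly linearly with prompt length and with model size, and that a specialized model needs about a fifth of the compute of a generic one; the request_energy_wh helper and those scaling factors are assumptions, not figures from the study (only the 0.34 Wh baseline and the 300-to-150-word prompt reduction come from the article).

    # Back-of-envelope sketch of how the proposed measures could combine.
    # Assumption (not from the study): per-request energy scales roughly
    # linearly with both prompt length and model size.

    def request_energy_wh(baseline_wh: float, prompt_words: int,
                          model_scale: float) -> float:
        """Estimate per-request energy relative to a 300-word prompt
        on a full-size general-purpose model (model_scale = 1.0)."""
        return baseline_wh * (prompt_words / 300) * model_scale

    baseline = request_energy_wh(0.34, prompt_words=300, model_scale=1.0)

    # Halve the prompt and switch to a smaller specialized model,
    # here assumed to need ~20% of the compute of the generic model.
    optimized = request_energy_wh(0.34, prompt_words=150, model_scale=0.2)

    saving = 1 - optimized / baseline
    print(f"estimated saving: {saving:.0%}")  # ~90% under these assumptions

Under these illustrative assumptions the two measures multiply (0.5 x 0.2 = 0.1 of the original energy), which is how a reduction on the order of the study's 90% figure can arise without any single change being that large.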

Illustrative Data

The CEO of OpenAI, Sam Altman, revealed that each ChatGPT request consumes roughly 0.34 Wh of electricity, 10 to 70 times more than a Google search. A billion daily requests amount to the annual electricity consumption of three million people in Ethiopia.
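As a sanity check on scale, the quoted per-request figure can be annualized directly. The sketch below only does unit arithmetic; the 0.34 Wh figure, the one-billion-requests-per-day volume, and the Ethiopia comparison are taken from the article as stated.

    # Worked arithmetic from the figures quoted in the article.
    per_request_wh = 0.34      # Sam Altman's per-request estimate
    requests_per_day = 1e9     # "a billion daily requests"

    daily_wh = per_request_wh * requests_per_day   # 3.4e8 Wh
    daily_mwh = daily_wh / 1e6                     # = 340 MWh per day
    annual_gwh = daily_mwh * 365 / 1e3             # ~ 124 GWh per year

    print(f"{daily_mwh:.0f} MWh per day, about {annual_gwh:.0f} GWh per year")
    # The article compares this annual total to the electricity used by
    # roughly three million people in Ethiopia over a year.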
