
ChatGPT and Energy Consumption: Myth or Reality?

ChatGPT and other AI models are often associated with high energy consumption, but according to new research, their energy use may not be as dramatic as previously assumed. An analysis by the nonprofit institute Epoch AI shows that ChatGPT’s energy consumption largely depends on the specific model and how it is used.

Is ChatGPT Really an Energy Hog?

For a long time, it was believed that ChatGPT required approximately 3 watt-hours of energy to answer a single query, roughly ten times the energy of a standard Google search. However, Epoch AI argues that this figure is an overestimate.

Using GPT-4o as a reference model, researchers found that the average ChatGPT query consumes only 0.3 watt-hours—less than many common household appliances. “The energy use is not a big deal compared to using regular appliances, heating or cooling your home, or driving a car,” said Joshua You, a data analyst at Epoch AI.
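To put the 0.3 watt-hour figure in perspective, here is a back-of-the-envelope calculation. The per-query estimates come from the article; the daily query count and the LED bulb wattage are illustrative assumptions, not figures from Epoch AI's analysis.

```python
# Rough comparison of ChatGPT query energy with everyday electricity use.
# WH_PER_QUERY is Epoch AI's GPT-4o estimate; OLD_ESTIMATE_WH is the
# widely cited older figure. queries_per_day and led_watts are assumptions.

WH_PER_QUERY = 0.3       # Epoch AI estimate for an average query (Wh)
OLD_ESTIMATE_WH = 3.0    # older, widely cited estimate (Wh)

queries_per_day = 15     # assumed usage pattern
annual_kwh = WH_PER_QUERY * queries_per_day * 365 / 1000
print(f"Annual energy at {queries_per_day} queries/day: {annual_kwh:.2f} kWh")
# → about 1.64 kWh per year

# One query expressed as runtime of a 10 W LED bulb
led_watts = 10
minutes = WH_PER_QUERY / led_watts * 60
print(f"One query ≈ a {led_watts} W LED bulb running for {minutes:.1f} minutes")
# → about 1.8 minutes

print(f"The old estimate was {OLD_ESTIMATE_WH / WH_PER_QUERY:.0f}x higher")
# → 10x
```

Even at a fairly heavy usage pattern, the annual total is a tiny fraction of typical household electricity consumption, which is the point You makes above.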

AI and Environmental Impact

The increasing popularity of AI and its expansion into various industries have raised concerns about its environmental impact. More than 100 organizations recently called on AI companies and regulators to prevent new AI data centers from depleting natural resources.

Epoch AI’s analysis was motivated by older research that they believe no longer reflects the current state of AI technologies. For example, the widely cited 3 watt-hour estimate was based on outdated and less efficient AI chips.

How Will AI's Energy Needs Evolve?

While ChatGPT’s current energy use is relatively low, researchers expect it to increase as AI models become more advanced and capable of handling increasingly complex tasks.

According to a Rand report, by 2027 AI data centers could require nearly as much power as California's entire 2022 electricity capacity (68 GW). By 2030, training a new AI model could demand as much power as eight nuclear reactors (8 GW).

OpenAI and other AI companies are investing billions of dollars in new data centers to support the growing demand for AI models. The biggest challenge will be developing efficient models that do not require excessive amounts of energy.

How to Reduce AI Energy Consumption?

According to You, users can minimize their AI energy footprint in several ways:

  • Use smaller AI models like GPT-4o-mini.
  • Limit queries that require extensive computational power.
  • Avoid using AI for unnecessarily complex tasks.

As artificial intelligence becomes an integral part of our lives, it is crucial not only to use it wisely but also to find ways to make it more energy-efficient.
