New Tool Estimates Energy Consumption of AI Messages

By Lisa Wong

Julien Delavande, an engineer at Hugging Face, has built a tool that estimates the energy consumed by messages sent to and from AI models. The calculator is designed to work with Chat UI, an open-source front end for models such as Meta’s Llama 3.3 70B and Google’s Gemma 3.

Delavande’s tool reports energy usage in real time, giving users estimates in either watt-hours or joules. The broader aim is to shed light on the energy and climate impacts of emerging AI technologies, whose energy demands are poised to grow rapidly as adoption accelerates.

Delavande’s tool turns those measurements into concrete, everyday comparisons. For example, it estimates that asking Llama 3.3 70B to write a typical email consumes about 0.1841 watt-hours, roughly 660 joules. That is less energy than a 1,000-watt microwave uses in a single second, and about what a toaster draws in well under a second.
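For readers who want to check the arithmetic, the conversion behind these comparisons is straightforward. The sketch below is purely illustrative: the 0.1841 Wh figure is the estimate quoted above, while the appliance wattages and the helper function are assumptions, not part of Delavande’s tool.

```python
# Illustrative unit conversions behind the appliance comparisons above.
# The 0.1841 Wh figure is the article's quoted estimate; the appliance
# wattages are assumed values, not outputs of the Hugging Face tool.

WH_TO_JOULES = 3600.0  # 1 watt-hour = 3,600 joules


def appliance_seconds(energy_wh: float, appliance_watts: float) -> float:
    """Seconds an appliance of the given power could run on this much energy."""
    return energy_wh * WH_TO_JOULES / appliance_watts


email_wh = 0.1841  # estimated energy for one Llama 3.3 70B email
print(f"{email_wh} Wh is about {email_wh * WH_TO_JOULES:.0f} J")
print(f"1,000 W microwave: {appliance_seconds(email_wh, 1000):.2f} s")
print(f"900 W toaster:     {appliance_seconds(email_wh, 900):.2f} s")
```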

Delavande believes this kind of transparency is essential, especially as AI technologies become more widely adopted and embedded in real-world applications. He noted that his team has been pushing for transparency in the open-source community through projects such as the AI energy score and broader research into AI’s energy footprint. One day, he suggests, energy use could be displayed as plainly as nutrition labels on food.

Delavande and his co-creators have released the tool on Hugging Face’s platform. Their goal, they say, is to increase transparency around the energy impact of AI use cases. They are quick to stress that the tool provides estimates rather than precise measurements.

Even modest per-query energy savings can add up when multiplied across millions of queries. Choices such as which model to use or how much output to generate can meaningfully change the environmental footprint, especially at scale.
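To make that scaling concrete, here is a back-of-the-envelope sketch; the query volume and per-query saving are hypothetical numbers chosen only for illustration, not figures from the tool.

```python
# Back-of-the-envelope scaling of per-query savings.
# Both inputs are hypothetical, chosen only to illustrate the point.

queries_per_day = 10_000_000   # assumed daily query volume
saving_per_query_wh = 0.05     # assumed saving from a smaller model or shorter outputs

daily_saving_kwh = queries_per_day * saving_per_query_wh / 1_000
yearly_saving_mwh = daily_saving_kwh * 365 / 1_000

print(f"Daily saving:  {daily_saving_kwh:,.0f} kWh")   # 500 kWh
print(f"Yearly saving: {yearly_saving_mwh:,.1f} MWh")  # ~182.5 MWh
```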

Delavande, who lives in Manhattan with his partner, a music therapist, is passionate about helping organizations understand the environmental impact of AI technologies. As demand grows, both developers and users will need a clearer picture of how much energy these systems consume.