This tool estimates how much electricity your chatbot messages consume

Ever wonder how much electricity you’re using when you prompt, or thank, an AI model? Hugging Face engineer Julien Delavande did, so he built a tool to help arrive at the answer.

AI models consume energy each time they’re run. They’re run on GPUs and specialized chips that need a lot of power to carry out the associated computational workloads at scale. It’s not easy to pin down model power consumption, but it’s widely expected that growing usage of AI technologies will drive electricity needs to new heights in the next couple of years.

The demand for more power to fuel AI has led some companies to pursue environmentally unfriendly strategies. Tools like Delavande’s aim to bring attention to this, and perhaps give some AI users pause.

“Even small energy savings can scale up across millions of queries – model choice or output length can lead to major environmental impact,” Delavande and the tool’s other creators wrote in a statement.

Delavande’s tool is designed to work with Chat UI, an open-source front end for models like Meta’s Llama 3.3 70B and Google’s Gemma 3. The tool estimates the energy consumption of messages sent to a model in real time, reporting consumption in watt-hours or joules. It also compares model energy usage to that of common household appliances, like microwaves and LEDs.
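To make those units concrete, here is a minimal sketch (in Python) of the kind of conversion such a tool performs: turning a rough energy estimate for a response into watt-hours, joules, and equivalent seconds of appliance use. The per-token energy figure and microwave wattage below are hypothetical placeholders for illustration, not values taken from Delavande’s tool or measurements of any real model.

```python
# Minimal sketch, not Delavande's actual implementation: convert a rough
# per-response energy estimate into the units the tool reports.
# ASSUMED_WH_PER_TOKEN and MICROWAVE_WATTS are hypothetical placeholders,
# not measured values for any specific model or appliance.

ASSUMED_WH_PER_TOKEN = 0.001  # hypothetical watt-hours per generated token
MICROWAVE_WATTS = 1000        # assumed power draw of a household microwave

def estimate_energy(output_tokens: int) -> dict:
    """Return an energy estimate in watt-hours, joules, and microwave-seconds."""
    watt_hours = output_tokens * ASSUMED_WH_PER_TOKEN
    joules = watt_hours * 3600                    # 1 Wh = 3,600 J
    microwave_seconds = joules / MICROWAVE_WATTS  # energy / power = time
    return {
        "watt_hours": round(watt_hours, 4),
        "joules": round(joules, 1),
        "microwave_seconds": round(microwave_seconds, 2),
    }

# Example: a ~300-token email draft
print(estimate_energy(300))
# {'watt_hours': 0.3, 'joules': 1080.0, 'microwave_seconds': 1.08}
```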

According to the tool, asking Llama 3.3 70B to write a typical email uses a small amount of energy, equivalent to no more than a few seconds of microwave use.

It’s worth remembering that the tool’s estimates are only that – estimates. Delavande makes no claim that they’re incredibly precise. Still, they serve as a reminder that everything – chatbots included – has a cost.

“With projects like the AI Energy Score and broader research on AI’s energy footprint, we’re pushing for transparency in the open source community. One day, energy usage could be as visible as nutrition labels on food!” Delavande and his co-creators wrote.
