Sustainability

AI's Carbon Footprint

By Stephen Bolen

Published on Mar 10, 2023   —   1 min read


Summary

“We’re talking about ChatGPT and we know nothing about it. It could be three raccoons in a trench coat.” - Sasha Luccioni

Bloomberg Green has an article about the energy consumption required to train AI models like ChatGPT:

AI uses more energy than other forms of computing, and training a single model can gobble up more electricity than 100 US homes use in an entire year. Yet the sector is growing so fast — and has such limited transparency — that no one knows exactly how much total electricity use and carbon emissions can be attributed to AI. The emissions could also vary widely depending on what type of power plants provide that electricity; a data center that draws its electricity from a coal or natural gas-fired plant will be responsible for much higher emissions than one that draws power from solar or wind farms.

The article goes on to report that training GPT-3, a single general-purpose AI program that is used to generate language, took 1.287 GWh. That's enough electricity to power 120 U.S. homes over the course of one year - or a city the size of Kangley, Illinois.
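That "120 homes" figure is easy to sanity-check. Here's a quick back-of-the-envelope sketch; the average-household number is an assumption based on the EIA's estimate of roughly 10,600 kWh per U.S. home per year, not a figure from the article:

```python
# Back-of-the-envelope check of the article's "120 homes" claim.
TRAINING_KWH = 1.287e6           # 1.287 GWh expressed in kWh
AVG_HOME_KWH_PER_YEAR = 10_600   # assumed average US household usage (EIA ballpark)

homes_powered = TRAINING_KWH / AVG_HOME_KWH_PER_YEAR
print(round(homes_powered))      # ~121, consistent with the article's ~120
```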


The size of the carbon footprint will vary based on the source of the electricity. To this end, many of the cloud computing providers hosting these AI models have made climate pledges: Microsoft has pledged to be carbon negative by 2030, Google aims to run on carbon-free energy by 2030, and Amazon has committed to net-zero carbon by 2040. These pledges have led Microsoft to increase its purchases of renewable energy while working on ways to make ChatGPT/OpenAI projects more efficient, in both training models and running them.
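To make the "source of the electricity" point concrete, here's a rough sketch of how emissions from the same 1.287 GWh training run would differ by grid mix. The intensity figures are assumptions on my part - typical lifecycle estimates in gCO2e per kWh - not numbers from the Bloomberg article:

```python
# Sketch: emissions from the same training run vary widely with the power source.
# Intensity values are assumed typical lifecycle estimates (gCO2e/kWh).
TRAINING_KWH = 1.287e6  # GPT-3 training energy, per the article

intensity_g_per_kwh = {
    "coal": 820,
    "natural gas": 490,
    "solar": 41,
    "wind": 11,
}

for source, g_per_kwh in intensity_g_per_kwh.items():
    tonnes = TRAINING_KWH * g_per_kwh / 1e6  # grams -> metric tonnes
    print(f"{source:12s} ~{tonnes:,.0f} t CO2e")
```

Under these assumptions, a coal-powered training run emits on the order of a thousand tonnes of CO2e, while a wind-powered one emits only a few dozen - roughly a 70x difference from the electricity source alone.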
