Bloomberg Green has an article about the energy consumption required to train AI models like ChatGPT:
AI uses more energy than other forms of computing, and training a single model can gobble up more electricity than 100 US homes use in an entire year. Yet the sector is growing so fast — and has such limited transparency — that no one knows exactly how much total electricity use and carbon emissions can be attributed to AI. The emissions could also vary widely depending on what type of power plants provide that electricity; a data center that draws its electricity from a coal or natural gas-fired plant will be responsible for much higher emissions than one that draws power from solar or wind farms.
The article goes on to report that training GPT-3, a single general-purpose AI program that is used to generate language, took 1.287 GWh. That's enough electricity to power 120 U.S. homes for an entire year, or a town the size of Kangley, Illinois.
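The homes-equivalent figure and the point about emissions varying by power source can be checked with some back-of-the-envelope arithmetic. In this sketch, the 1.287 GWh training figure comes from the article; the per-home consumption and the per-source emission factors are my own rough assumptions, not values from the source.

```python
# Rough estimate of GPT-3 training emissions under different grid mixes.
# The 1,287 MWh energy figure is from the article; everything else below
# is an assumed approximation for illustration.

TRAINING_ENERGY_KWH = 1_287_000       # 1.287 GWh, per the article
AVG_US_HOME_KWH_PER_YEAR = 10_600     # assumed average annual US home usage

# Assumed lifecycle emission factors, in kg CO2 per kWh.
EMISSION_FACTORS = {
    "coal": 1.0,
    "natural_gas": 0.45,
    "solar": 0.05,
    "wind": 0.01,
}

homes = TRAINING_ENERGY_KWH / AVG_US_HOME_KWH_PER_YEAR
print(f"Equivalent to the annual usage of ~{homes:.0f} US homes")

for source, factor in EMISSION_FACTORS.items():
    tonnes = TRAINING_ENERGY_KWH * factor / 1000  # kg -> metric tonnes
    print(f"{source:>12}: ~{tonnes:,.0f} tonnes CO2")
```

Under these assumptions the same training run produces roughly 20 times more CO2 on a coal-heavy grid than on a solar-heavy one, which is the article's point about why the power source matters so much.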
The size of the carbon footprint will vary with the source of the electricity. To this end, the major cloud providers hosting these AI models have all made climate pledges: Microsoft has committed to being carbon negative by 2030, Google aims to run entirely on carbon-free energy by 2030, and Amazon has pledged to reach net-zero carbon by 2040. These pledges have led Microsoft to increase its purchasing of renewable energy while working on ways to make ChatGPT and other OpenAI projects more efficient, in both training and deployment.