ChatGPT training emits 502 metric tons of carbon

Occurred: November 2022


GPT-3 released over 500 metric tons of carbon during its training, according to research from the open-source AI community Hugging Face.

The researchers calculated that GPT-3, the model that powers ChatGPT, emitted around 502 metric tons of carbon, far more than other large language models. GPT-3's high emissions are likely partly explained by the fact that it was trained on older, less efficient hardware, the researchers argued.

The researchers had first estimated the whole life cycle carbon emissions of BLOOM, their own large language model, calculating that its training had led to 25 metric tons of carbon dioxide emissions - a figure that doubled when the emissions produced by the manufacturing of the computer equipment used for training, the broader computing infrastructure, and the energy required to run BLOOM once it was trained were taken into account.
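The accounting described above can be sketched as a simple sum: operational training emissions (energy consumed times the carbon intensity of the electricity grid) plus embodied hardware, infrastructure, and deployment emissions. The function below is a minimal illustration of that arithmetic, not the researchers' actual methodology; the input figures in the usage example are round numbers chosen only to mirror the article's "25 tons, doubling to ~50" pattern, not BLOOM's measured values.

```python
def lifecycle_emissions_tons(training_energy_mwh: float,
                             grid_intensity_kg_per_kwh: float,
                             hardware_manufacturing_tons: float,
                             infrastructure_tons: float,
                             deployment_tons: float) -> float:
    """Estimate total life-cycle CO2 emissions (metric tons) for a model.

    Operational training emissions = energy used (MWh) * grid carbon
    intensity (kg CO2 per kWh); 1 MWh * 1 kg/kWh = 1 metric ton.
    The remaining terms cover emissions beyond training itself.
    """
    training_tons = training_energy_mwh * grid_intensity_kg_per_kwh
    return (training_tons
            + hardware_manufacturing_tons
            + infrastructure_tons
            + deployment_tons)


# Illustrative only: 500 MWh on a low-carbon grid (0.05 kg/kWh) gives
# 25 t from training; adding hypothetical embodied, infrastructure, and
# deployment emissions doubles the total, as the article describes.
total = lifecycle_emissions_tons(500, 0.05, 10, 5, 10)  # → 50.0
```

The point of the breakdown is that reporting only operational training emissions, as most model cards do, can understate the true footprint by a factor of two or more.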

Databank

Operator: Alexandra Sasha Luccioni, Sylvain Viguier, Anne-Laure Ligozat
Developer: OpenAI
Country: Global
Sector: Multiple
Purpose: Generate text
Technology: Chatbot; NLP/text analysis; Neural network; Deep learning; Machine learning; Reinforcement learning
Issue: Environment
Transparency: Governance