ChatGPT training estimated to emit 502 metric tonnes of carbon

Occurred: November 2022

GPT-3 emitted over 500 metric tonnes of carbon during its training, according to researchers, raising concerns about the environmental impacts of developing large language models.

Researchers at the open-source AI community Hugging Face calculated that GPT-3, the model that powers ChatGPT, emitted around 502 metric tonnes of carbon, far more than other large language models.

GPT-3's high emissions are likely partly explained by the fact that it was trained on older, less efficient hardware, the researchers argued.
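Estimates of this kind typically rest on the standard approximation: operational emissions ≈ GPU-hours × average power draw × data-centre PUE × grid carbon intensity. The sketch below illustrates how hardware efficiency and grid carbon intensity drive the result; every numeric value in it is an illustrative assumption, not a figure from the Hugging Face study.

```python
# Hedged sketch: back-of-the-envelope estimate of operational training emissions.
# Formula: tCO2e = GPU-hours * avg power (kW) * PUE * grid intensity (kgCO2e/kWh) / 1000
# All inputs below are illustrative assumptions, not the researchers' measured values.

def training_emissions_tco2e(gpu_hours: float,
                             avg_gpu_power_kw: float,
                             pue: float,
                             grid_intensity_kg_per_kwh: float) -> float:
    """Estimate operational training emissions in metric tonnes of CO2e."""
    energy_kwh = gpu_hours * avg_gpu_power_kw * pue   # electricity drawn from the grid
    return energy_kwh * grid_intensity_kg_per_kwh / 1000.0

# Older, less efficient hardware on a carbon-intensive grid (assumed values)
older_setup = training_emissions_tco2e(
    gpu_hours=3_000_000, avg_gpu_power_kw=0.30, pue=1.2,
    grid_intensity_kg_per_kwh=0.45)

# Newer hardware on a low-carbon grid (assumed values)
newer_setup = training_emissions_tco2e(
    gpu_hours=1_000_000, avg_gpu_power_kw=0.35, pue=1.1,
    grid_intensity_kg_per_kwh=0.06)

print(f"older setup: ~{older_setup:.0f} tCO2e, newer setup: ~{newer_setup:.0f} tCO2e")
```

With these assumed inputs the older setup lands in the hundreds of tonnes while the newer one stays in the tens, which is the kind of gap the researchers attribute to hardware and grid differences.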

The researchers had first estimated the whole life-cycle carbon emissions of BLOOM, Hugging Face's own large language model, calculating that its training had produced 25 metric tonnes of carbon dioxide emissions - a figure that doubled once the emissions from manufacturing the computer equipment used for training, from the broader computing infrastructure, and from the energy required to run BLOOM once it was trained were taken into account.
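As a rough illustration of that life-cycle accounting, the sketch below starts from the 25-tonne training figure reported in the article and adds the other categories the researchers describe. The split among the added categories is an assumption chosen only so the total comes to roughly double the training figure; it is not the paper's actual breakdown.

```python
# Hedged sketch of a life-cycle emissions breakdown in the spirit of the BLOOM estimate.
# Only the 25 t training figure and the "roughly doubled" total come from the article;
# the split across the remaining categories is an illustrative assumption.

lifecycle_tco2e = {
    "training (dynamic power draw)": 25.0,          # figure reported for BLOOM's training run
    "equipment manufacturing (embodied)": 11.0,     # assumed share of the remainder
    "broader computing infrastructure": 9.0,        # assumed share of the remainder
    "running BLOOM after training": 5.0,            # assumed share of the remainder
}

total = sum(lifecycle_tco2e.values())
for category, tonnes in lifecycle_tco2e.items():
    print(f"{category:<40} {tonnes:>5.1f} t  ({tonnes / total:.0%})")
print(f"{'total':<40} {total:>5.1f} t")
```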

Environmental impacts of artificial intelligence

The environmental impacts of artificial intelligence (AI) vary significantly across systems and applications. Many deep learning methods carry substantial carbon footprints and water usage.

Source: Wikipedia

System

Operator: Alexandra Sasha Luccioni, Sylvain Viguier, Anne-Laure Ligozat
Developer: OpenAI
Country: Global
Sector: Multiple
Purpose: Generate text
Technology: Chatbot; Generative AI; Machine learning
Issue: Environment