ChatGPT training estimated to emit 502 metric tonnes of carbon
Occurred: November 2022
According to researchers, GPT-3 emitted over 500 metric tonnes of carbon during its training, raising concerns about the environmental impacts of developing large language models.
Researchers at open-source AI community Hugging Face calculated that GPT-3, the model that powers ChatGPT, emitted around 502 metric tonnes of carbon, far more than other large language models.
GPT-3's vast emissions can likely be partly explained by the fact that it was trained on older, less efficient hardware, the researchers argued.
The researchers had first estimated the whole life-cycle carbon emissions of BLOOM, Hugging Face's own large language model, calculating that its training led to around 25 metric tonnes of carbon dioxide emissions. That figure roughly doubled once they also accounted for the emissions from manufacturing the computer equipment used for training, the broader computing infrastructure, and the energy required to run BLOOM once it was trained.
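To make the accounting concrete, below is a minimal sketch of the common "energy consumed times grid carbon intensity" approach to such estimates. The function name and the input numbers are illustrative placeholders chosen to land near BLOOM's roughly 25-tonne training figure, not values taken from the paper.

```python
def operational_emissions_tonnes(energy_kwh: float,
                                 grid_kg_co2_per_kwh: float) -> float:
    """CO2eq from the electricity consumed during training (kg converted to tonnes)."""
    return energy_kwh * grid_kg_co2_per_kwh / 1000.0


# Illustrative inputs: a training run drawing ~433 MWh on a low-carbon grid.
dynamic = operational_emissions_tonnes(energy_kwh=433_000,
                                       grid_kg_co2_per_kwh=0.057)

# Life-cycle view: add embodied emissions (manufacturing the servers and GPUs,
# amortised over the run), idle infrastructure, and post-training use.
# In the accounting described above these extras roughly matched the training
# figure, which is why the total about doubled; here they are a stand-in value.
embodied_idle_and_use = dynamic  # assumption for illustration only

total = dynamic + embodied_idle_and_use
print(f"training: {dynamic:.1f} t CO2eq, life-cycle: {total:.1f} t CO2eq")
```

The same arithmetic explains why models trained on older hardware or on carbon-intensive grids, such as GPT-3, come out far higher under this methodology.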
Environmental impacts of artificial intelligence
The environmental impacts of artificial intelligence (AI) vary widely across systems. Many deep learning methods have substantial carbon footprints and water usage.
Source: Wikipedia
Research, advocacy
Luccioni, A.S., Viguier, S., Ligozat, A.-L. (2022). Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model.
Page info
Type: Incident
Published: November 2023