BERT training consumes the energy of one person's transcontinental round-trip flight

Occurred: June 2019

Training Google's AI language model BERT consumed the energy equivalent of a round-trip transcontinental flight for one person.

Researchers at the University of Massachusetts, Amherst performed a life cycle assessment of training several common large AI models, including BERT (110 million parameters). They found that the most compute-intensive process studied, training a Transformer model with neural architecture search, can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car (including the manufacture of the car itself).

As part of the research, the team trained each AI model on a single GPU for up to a day to measure its power draw. They then used the number of training hours listed in each model's original paper to calculate the total energy consumed over the complete training process, and converted that figure into pounds of carbon dioxide equivalent based on the average energy mix in the US.
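The estimation method reduces to simple arithmetic: energy in kWh equals power draw times training hours, scaled by a datacenter efficiency factor and a grid emissions factor. A minimal Python sketch of this calculation, assuming a power usage effectiveness (PUE) of 1.58 and a US average emissions factor of roughly 0.954 lbs CO2e per kWh, as reported in the underlying paper; the power draw, GPU count, and training hours below are hypothetical placeholders, not the study's measurements:

    # Back-of-the-envelope CO2e estimate following the study's approach:
    # measured GPU power draw x published training hours x US grid carbon
    # intensity. All numeric inputs are illustrative, not the paper's data.

    PUE = 1.58                 # assumed datacenter power usage effectiveness
    LBS_CO2E_PER_KWH = 0.954   # approximate US average grid emissions factor

    def training_co2e_lbs(avg_power_watts, training_hours, num_gpus=1):
        """Estimate pounds of CO2-equivalent for one training run."""
        energy_kwh = avg_power_watts * num_gpus * training_hours / 1000.0
        return energy_kwh * PUE * LBS_CO2E_PER_KWH

    # Hypothetical run: 8 GPUs drawing ~250 W each for 96 hours.
    print(f"{training_co2e_lbs(250, 96, num_gpus=8):,.0f} lbs CO2e")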

They also found that the computational and environmental costs of training AI language models grew proportionally to model size and then exploded when additional tuning steps were used to increase the model’s final accuracy.

Separate research estimates that the carbon emissions of a single generative AI query are four to five times higher than those of a single search engine query.

Databank

Operator:  
Developer: Alphabet/Google
Country: USA
Sector: Multiple
Purpose: Train language models
Technology: NLP/text analysis; Neural network; Deep learning; Machine learning
Issue: Environment
Transparency: Governance