Authors
Alexandra Sasha Luccioni, Sylvain Viguier, Anne-Laure Ligozat
Publication date
2023
Journal
Journal of Machine Learning Research
Volume
24
Issue
253
Pages
1-15
Description
Progress in machine learning (ML) comes with a cost to the environment, given that training ML models requires computational resources, energy and materials. In the present article, we aim to quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle. We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption, and 50.5 tonnes if we account for all processes ranging from equipment manufacturing to energy-based operational consumption. We also carry out an empirical study to measure the energy requirements and carbon emissions of its deployment for inference via an API endpoint receiving user queries in real time. We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of ML models and future research directions that can contribute towards improving carbon emissions reporting.
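The dynamic-power estimate quoted above follows from a simple relation: operational emissions equal the energy consumed multiplied by the carbon intensity of the electricity grid. The sketch below illustrates this arithmetic; the input figures (roughly 433 MWh of dynamic consumption and a ~57 gCO2eq/kWh grid intensity for the French grid powering the Jean Zay cluster) are taken as assumptions consistent with the paper's reported 24.7-tonne result, not as authoritative values.

```python
# Sketch of the dynamic-power carbon estimate described in the abstract:
# emissions (tCO2eq) = energy consumed (kWh) * grid carbon intensity (kgCO2eq/kWh).
# Input figures are illustrative assumptions matching the paper's headline number.

def carbon_emissions_tonnes(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """Operational (dynamic-power) carbon emissions in tonnes of CO2eq."""
    return energy_kwh * intensity_kg_per_kwh / 1000.0  # kg -> tonnes

dynamic_energy_kwh = 433_196   # assumed GPU dynamic energy for BLOOM's training
grid_intensity = 0.057         # assumed kgCO2eq/kWh (low-carbon French grid)

print(round(carbon_emissions_tonnes(dynamic_energy_kwh, grid_intensity), 1))
```

The 50.5-tonne life-cycle figure adds terms of the same form for embodied (manufacturing) emissions and idle infrastructure consumption on top of this dynamic component.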
Total citations
2 (2022), 68 (2023), 106 (2024)
Scholar articles
AS Luccioni, S Viguier, AL Ligozat - Journal of Machine Learning Research, 2023
A Sasha Luccioni, S Viguier, AL Ligozat - arXiv e-prints, 2022