Artificial Intelligence Has an Enormous Carbon Footprint
AI’s energy usage is a hot topic, but it is more complex than a simple comparison suggests.
“Training artificial intelligence is an energy-intensive process. New estimates suggest that the carbon footprint of training a single AI is as much as 284 tonnes of carbon dioxide equivalent — five times the lifetime emissions of an average car. Language-processing AIs underpin the algorithms that power Google Translate as well as OpenAI’s GPT-2 text generator, which can convincingly pen fake news articles when given a few lines of text [1].”
Articles like this one abound across the internet. They have everything needed to draw clicks: a catchy title and an easy conclusion — artificial intelligence is dangerous and bad for the environment. The author even adds a little extra scare factor by associating AI with fake news, something unrelated to AI’s carbon footprint.
AI’s carbon footprint deserves scrutiny. In the paper “Energy and Policy Considerations for Deep Learning in NLP” [2], the authors quantified the CO2 emissions, power consumption, and cost of training several state-of-the-art NLP models. Training a transformer model (a large architecture used in language processing) on GPUs could produce an estimated 626,155 lbs of carbon dioxide emissions. They compared this to other benchmarks, including the lifetime carbon footprint of a car (126,000 lbs) and the average human per year (36,156 lbs). The authors also made actionable suggestions for reducing AI’s carbon dioxide emissions in the future. The entire paper is worth a few minutes to read.
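The headline ratios reduce to simple arithmetic. A minimal sketch using the figures quoted above (the variable names are my own):

```python
# Figures quoted from Strubell et al. (2019), in lbs of CO2-equivalent.
transformer_training = 626_155  # transformer training estimate
car_lifetime = 126_000          # average car, over its lifetime
human_per_year = 36_156         # average human, per year

# The "five cars" headline: one training run expressed in car lifetimes.
cars = transformer_training / car_lifetime
print(f"{cars:.1f} car lifetimes")       # ~5.0

# The same emissions expressed in average human-years.
human_years = transformer_training / human_per_year
print(f"{human_years:.1f} human-years")  # ~17.3
```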
This paper and others like it have prompted numerous articles about AI’s carbon footprint, particularly the claim that training a single model is worse than the lifetime impact of five cars. There is shock value in this, but it may be a poor comparison to make. By breaking the problem down further, the true impact of AI can be understood more holistically.
The comparison of one model to five cars is simple and relatively intuitive for readers to grasp. Many people own a car, and the carbon footprint of the combustion engine has been thoroughly impressed upon society. In the US there are 1.88 cars per household (2017), totaling over 100 million registered vehicles [3]. No one is training that many AI models. The model-to-car comparison is intuitive but lacks substance.
Beyond the comparison of cars to models, it is important to understand how the authors calculated CO2 emissions. The training energy estimates were based on GPUs, as no information was available for specialized tensor processing units (TPUs) at the time of publication. Google has since released data showing TPUs are somewhere between 30–80 times more efficient than traditional GPUs [4]. This alone greatly reduces the estimated energy consumption of these models. Google also acknowledges that traditional architectures require significantly more resources, and that energy consumption is a major concern for both profitability and the environment.
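To see how much the hardware assumption matters, one can rescale the paper's GPU-based estimate by Google's reported 30–80x efficiency range. This is a back-of-the-envelope sketch, assuming the efficiency gain translates directly into proportionally lower emissions (it ignores workload and data-center differences):

```python
gpu_estimate_lbs = 626_155    # GPU-based estimate from Strubell et al.
tpu_speedup = (30, 80)        # Google's reported TPU efficiency range

# Rescale: higher efficiency -> lower emissions for the same training run.
low = gpu_estimate_lbs / tpu_speedup[1]   # best case: 80x more efficient
high = gpu_estimate_lbs / tpu_speedup[0]  # worst case: 30x more efficient
print(f"TPU-adjusted estimate: {low:,.0f} - {high:,.0f} lbs CO2e")
# roughly 7,827 - 20,872 lbs, well below one car's 126,000 lb lifetime
```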
Another challenge in calculating the total CO2 production of these models is the variation in energy sources. Smaller groups are more likely to be at the mercy of their local power provider for renewable energy. The authors chose a power-source breakdown based on the US average, which is appropriate given the available information. However, large companies like Google, Facebook, Amazon, and Microsoft do not rely on the average US power mix. Instead, they have shifted to a much higher percentage of renewables, reducing their carbon footprint even further [5,6]. These companies know that renewables are the future and have aggressively adopted them as primary power sources.
Even if we conclude that the true energy cost of these models differs from what was suggested, we can still push to lower the energy required for model development and deployment. MIT researchers have explored reducing computational costs by sharing weights and architectures across models [7]. Another MIT development focuses on improving the deployment of these models and reducing the major energy costs normally required [8]. Google’s DeepMind has used machine learning to reduce cooling energy requirements in data centers by 40% [9]. These are just a small sample of the new research focused on reducing the carbon footprint of model development and deployment.
Finally, most assessments of the environmental impact of AI focus on training because it can be directly quantified. But once these models are deployed, we need to ask, “Do these developments improve other efficiencies?” Deployed AI has great potential to improve efficiency and reduce environmental impacts in the industries using it. The Capgemini Research Institute reports:
“Across sectors, AI-enabled use cases have helped organizations reduce GHG emissions by 13% and improve power efficiency by 11% in the last two years. AI use cases have also helped reduce waste and deadweight assets by improving their utilization by 12% [10].”
Boston Consulting Group also emphasizes the effect of AI implementation:
“In our experience with clients, using AI can achieve 5% to 10% of that needed reduction — between 2.6 and 5.3 gigatons of CO2e [11].”
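Working backwards, BCG's quoted range implies a total needed reduction of roughly 53 gigatons of CO2e (my inference from the quote, not a figure the source states directly). A quick consistency check:

```python
# Back-solve the implied total from the 10% end of BCG's range:
# 5.3 Gt is 10% of the needed reduction.
total_needed_gt = 5.3 / 0.10     # ~53 Gt CO2e (inferred for illustration)

low = 0.05 * total_needed_gt     # AI's share at 5%
high = 0.10 * total_needed_gt    # AI's share at 10%
print(f"AI-enabled reduction: {low:.2f}-{high:.2f} Gt CO2e")  # 2.65-5.30
```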
Additionally, the implementation of AI may shrink the environmental impact of the auto and airline industries. The very industries used to benchmark AI’s carbon footprint could have their own impact reduced. McKinsey & Company details how much the auto industry could benefit from AI, and companies like Palantir have partnered with Airbus to use AI and data to improve airline efficiency [12,13].
Artificial intelligence will likely remain a controversial technology. The carbon footprint of any industry should be questioned and reduced when possible. Simple comparisons paint a shallow and poor picture of AI, but the image looks much brighter in a complete context.
[1] D. Lu, Creating an AI can be five times worse for the planet than a car (2019), NewScientist
[2] Strubell et al., Energy and Policy Considerations for Deep Learning in NLP (2019), cs.CL
[3] I. Wagner, Number of vehicles per household in the United States from 2001 to 2017 (2021), statista.com
[4] N. Jouppi, Quantifying the performance of the TPU, our first machine learning chip (2017), Google
[5] Efficiency (2021), Google Sustainability
[6] I. Gheorghiu, Facebook meets 100% renewable energy goal with over 6 GW of wind, solar, 720 MW of storage (2021), Utility Dive
[7] Plummer et al., Neural Parameter Allocation Search (2021), cs.LG
[8] R. Matheson, Reducing the carbon footprint of artificial intelligence (2020), MIT News
[9] R. Evans, J. Gao, DeepMind AI reduces energy used for cooling Google data centers by 40% (2016), The Keyword (Google)
[10] Capgemini Research Institute, Climate AI (2020)
[11] C. Degot, Reduce Carbon and Costs with the Power of AI (2021), Boston Consulting Group
[12] M. Breunig, Building smarter cars with smarter factories: How AI will change the auto business (2017), McKinsey Digital
[13] Palantir, Skywise (2021)