It’s no surprise that AI has a carbon footprint, which refers to the amount of greenhouse gases (carbon dioxide and methane, primarily) that producing and consuming AI releases into the atmosphere.
In fact, training AI models requires so much computing power that some researchers have argued the environmental costs outweigh the benefits. However, I believe they've not only underestimated the benefits of AI, but also overlooked the many ways that model training is becoming more efficient.
Greenhouse gases are what economists refer to as an “externality” — a cost borne inadvertently by society at large, such as through the adverse impact of global warming, but inflicted on us all by private participants who have little incentive to refrain from the offending activity.
Typically, public utilities emit these gases when they burn fossil fuels in order to generate electricity that powers the data centres, server farms, and other computing platforms upon which AI runs.
Consider the downstream carbon offsets realised by AI apps
During the past few years, AI has been unfairly stigmatised as a major contributor to global warming, owing to what some observers regard as its inordinate consumption of energy in the process of model training.
Unfortunately, many AI industry observers contribute to this stigma by using an imbalanced formula for calculating AI's overall carbon footprint. For example, MIT Technology Review published an article a year ago in which University of Massachusetts researchers reported that training a single machine learning model could produce carbon dioxide emissions nearly five times the lifetime emissions of the average American car.
This manner of calculating AI's carbon footprint does the technology a huge disservice. At the risk of sounding pretentious, I'm reminded of Oscar Wilde's remark about a cynic being someone who "knows the price of everything and the value of nothing."
I'm not taking issue with the UMass researchers' finding on the carbon cost of AI training, or with the need to calculate and reduce that cost for this and other human activities. I am curious why the researchers didn't also discuss the value that AI provides downstream, often indirectly, in reducing human-generated greenhouse gas emissions.
If an AI model delivers a steady stream of genuinely actionable inferences over an application's life, it should generate beneficial, real-world outcomes. In other words, many AI apps help people and systems take better actions across myriad application scenarios.
Many of these AI-driven benefits may be carbon-offsetting, such as reducing the need for people to get in their cars, take business trips, occupy expensive office space, and otherwise engage in activities that consume fossil fuels.
Perhaps a quick “traveling salesman” thought experiment is in order. Let’s say that a manufacturing company has a national sales force of six people, and each has a company-provided car.
If the company implements a new AI-based sales force automation system that enables one of those individuals to do the work of the entire team—such as through improved lead prospecting and route optimisation—that organisation could conceivably dismiss the other five individuals, scrap their company cars, and close their respective branch offices.
So, in one fell swoop, the carbon footprint of the AI model at the heart of the sales force automation app would be entirely offset (and then some) by eliminating the greenhouse gas emissions of five cars, plus the electricity saved by closing those offices and retiring their equipment.
We might quibble over the feasibility of this particular example, but it's entirely plausible. The thought experiment highlights that AI's productivity, efficiency, and acceleration benefits often produce downstream efficiencies in energy utilisation.
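To make the arithmetic concrete, here is a minimal sketch of the offset calculation. Every figure in it is a hypothetical assumption chosen for illustration, not a number from the article or from any study: a one-off training footprint of 300 tonnes of CO2, roughly 4.6 tonnes of CO2 per company car per year, and about 10 tonnes per branch office per year in electricity-related emissions.

```python
# Hypothetical figures -- illustrative assumptions only, not sourced from any study.
TRAINING_EMISSIONS_T = 300.0        # one-off CO2 cost of training the model (tonnes)
CAR_EMISSIONS_T_PER_YEAR = 4.6      # assumed annual CO2 per company car (tonnes)
OFFICE_EMISSIONS_T_PER_YEAR = 10.0  # assumed annual CO2 per branch office (tonnes)

def years_to_break_even(cars_removed: int, offices_closed: int) -> float:
    """Years of recurring downstream savings needed to offset the one-off training cost."""
    annual_savings = (cars_removed * CAR_EMISSIONS_T_PER_YEAR
                      + offices_closed * OFFICE_EMISSIONS_T_PER_YEAR)
    return TRAINING_EMISSIONS_T / annual_savings

# Five cars scrapped and five branch offices closed, as in the thought experiment:
print(round(years_to_break_even(5, 5), 1))  # -> 4.1
```

Under these assumed numbers, the training footprint is paid back in roughly four years of operation; with different assumptions the breakeven shifts, which is exactly why the downstream side of the ledger deserves to be counted at all.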
I’m not going to argue that every AI application—or even most of them—has a substantial downstream impact on reducing carbon emissions.
But I do take issue with observers, such as an AI expert quoted in this recent Wall Street Journal article, who trivialise the productivity impact of AI with statements such as: "If people could see the true cost of these systems, I think we'd have a lot of harder questions about whether that convenience [of AI-based digital assistants, for example] is worth the planetary cost."
Sentiments such as these obscure the fact that (to use digital assistants as an example) many real-world AI use cases deliver “convenience” in the form of data-driven recommendations that help people buy the right product, take the optimal route to their destination, follow the best practice in managing their finances, and so on.
Many of these actionable recommendations may have an impact—large or small—on the energy that people use in their homes, offices, cars, and elsewhere.
Upstream AI training may drive greater downstream carbon offsets
Many AI apps have the potential to generate downstream carbon offsets that counterbalance the emissions associated with electricity needed to train the underlying models. If AI lets us do more work with only a fraction of the office space, meetings, and travel, the technology will be contributing mightily to the battle against global warming.
Consequently, achieving carbon-neutrality in AI apps may very well depend on intensively training the underlying models to be more effective at their assigned tasks. Like a capital investment, a well-trained AI model can be amortised over time through deployment into future applications.
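One way to picture that amortisation, again with purely hypothetical numbers: spread the one-off training emissions across every application that later reuses the pretrained model, so each deployment carries only a fraction of the upstream cost.

```python
# Hypothetical amortisation of a one-off training cost across model reuses.
# Figures are illustrative assumptions, not measurements.
def amortised_training_cost(training_emissions_t: float, deployments: int) -> float:
    """Per-deployment share of the one-off training emissions (tonnes CO2)."""
    if deployments < 1:
        raise ValueError("a model must be deployed at least once")
    return training_emissions_t / deployments

# An assumed 300-tonne training run reused by 20 downstream applications:
print(amortised_training_cost(300.0, 20))  # -> 15.0 tonnes per application
```

The more widely a pretrained model is reused, the smaller each application's share of the training footprint becomes, which is the logic behind the pretrained-model trend in the list below.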
Bear in mind that even as AI developers seek to improve their models’ accuracy, training is not necessarily the resource hog we’ve been led to believe. Consider the following trends that are reducing the carbon footprint of this and other AI pipeline workloads:
- AI server farms powered by renewable energy sources instead of fossil fuels
- More energy-efficient chipsets, servers, and cloud providers in AI platforms
- Less time and data needed to train AI models
- Greater adoption of pretrained AI models in real-world apps
- Comparisons of the energy efficiency of different models within the AI devops pipeline
- Development of AI on “once-for-all” neural networks that can be trained to run with maximum efficiency on many different kinds of processors
As these trends converge during the next several years, we’re likely to see dramatic drops in the carbon footprint associated with AI training. As that trend intensifies, AI pipelines will become the most environmentally sustainable platforms in the IT universe.