
Microsoft launches new tools to reveal the carbon footprint of AI

By Hope Reese

With new research and a customer-facing tool, the tech giant is taking steps to help clients understand and reduce the emissions associated with AI.

As CO2 emissions continue to spike and the Earth's temperature rises along with them, experts say our only real chance of keeping warming to 1.5 degrees Celsius above pre-industrial levels is to reach net-zero emissions by 2050.

And while artificial intelligence is not as notorious a greenhouse gas emitter as other burgeoning technologies such as cryptocurrencies, it still carries a carbon footprint. The large computational loads required to run complex models can drive up energy consumption, for instance.

Just how much energy AI consumes, however, has been difficult to measure, in part because the technology is so new and accurate measurement methods do not yet exist.

Microsoft is now officially trying to change this. Its machine learning platform now displays energy consumption metrics to lend some transparency to AI operations on its cloud platform, Azure. Its aim, the company says, is “to help customers understand the computational and energetic costs of their AI workloads across the machine learning lifecycle.”

When running AI models, energy consumption can vary by region and by time, down to the day and hour, depending on the grid's mix of "low carbon generation (wind, solar, hydro, nuclear, biomass) and conventional hydrocarbon generation," according to Microsoft.

Calculating these costs is what Microsoft calls "carbon-aware computing," and this knowledge can prompt users to cut consumption by rethinking geographic regions or choosing a better time for training.
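The core idea can be sketched in a few lines: given forecasts of grid carbon intensity for each region and hour, pick the slot with the lowest expected emissions before launching a training job. The region names, numbers, and `best_slot` helper below are illustrative assumptions, not Microsoft's actual tooling.

```python
# Illustrative sketch of carbon-aware scheduling: given hypothetical
# grid carbon-intensity forecasts (gCO2e per kWh) keyed by
# (region, hour-of-day), choose the lowest-emission slot.
# All names and numbers are made up for illustration.
forecast = {
    ("northeurope", 2): 110.0,
    ("northeurope", 14): 95.0,   # windy afternoon, more renewables
    ("westus", 2): 240.0,
    ("westus", 14): 180.0,       # solar online, but still higher
}

def best_slot(forecast):
    """Return the (region, hour) pair with the lowest forecast intensity."""
    return min(forecast, key=forecast.get)

region, hour = best_slot(forecast)
print(region, hour)  # northeurope 14
```

In practice a scheduler would also weigh cost, latency, and data-residency constraints against the carbon signal, not intensity alone.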

The Green Software Foundation is also offering tools to help: its specification for measuring Software Carbon Intensity (SCI) can measure the consumption of Azure AI workloads, the press release states, by multiplying a cloud workload's energy use by the carbon intensity of the grid powering it. SCI is now the subject of research by Microsoft and AI2, with Hebrew University, Carnegie Mellon University and Hugging Face, to "quantify the marginal change in emissions caused by decisions or interventions, or actions."
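The multiplication the release describes is the heart of the Green Software Foundation's SCI formula, which (in its published form) also allows a term for embodied hardware emissions and normalizes by a functional unit such as one training run. The numbers below are illustrative assumptions, not figures from the research.

```python
# Minimal sketch of the Green Software Foundation's SCI formula:
#   SCI = ((E * I) + M) per R
# where E is energy consumed (kWh), I is the grid's carbon intensity
# (gCO2e/kWh), M is embodied hardware emissions, and R is the
# functional unit (e.g. one training run). Inputs here are invented.

def sci_score(energy_kwh, intensity_g_per_kwh, embodied_g=0.0, units=1):
    """Carbon emitted per functional unit, in grams CO2-equivalent."""
    return (energy_kwh * intensity_g_per_kwh + embodied_g) / units

# A hypothetical 50 kWh training run on a 200 gCO2e/kWh grid:
print(sci_score(50, 200))  # 10000.0 gCO2e for the run
```

The same run on a 50 gCO2e/kWh grid would score 2500.0, which is the arithmetic behind the region-selection savings the researchers report.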

“We trained 11 different ML models, and inferred the carbon footprint for each, for a number of different Azure regions,” said William Buchanan, product manager for Azure Machine Learning at Microsoft. “Some regions had a larger footprint than others, and from there, we equated carbon into equivalencies (e.g. barrel car of coal).”

The authors then compared ways users could reduce their carbon footprint. Choosing the right geographic region came in first: it was found to reduce SCI by nearly 75%. Time of day was another big factor. “For shorter training runs, we find reductions greater than 30% in multiple regions, and up to 80% in regions that have high renewable energy intermittency,” the report stated, while the reduction was minimal for long workloads.

Additionally, “dynamically pausing” workloads during periods of high carbon intensity, and resuming them when intensity is low, could also yield energy savings.
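Dynamic pausing amounts to a simple control loop: poll the grid's carbon intensity and only advance the workload while it stays below a threshold. The `get_intensity` callback, the threshold, and the step-wise training interface below are stand-ins for illustration; a real implementation would checkpoint state and sleep while paused rather than busy-poll.

```python
# Sketch of "dynamic pausing": run training steps only when the grid's
# carbon intensity (gCO2e/kWh) is at or below a threshold. The
# threshold and the polling interface are illustrative assumptions.

THRESHOLD = 150.0  # pause whenever intensity exceeds this

def run_with_pausing(steps, get_intensity, do_step):
    """Complete `steps` units of work, skipping high-intensity readings."""
    done = 0
    while done < steps:
        if get_intensity() <= THRESHOLD:
            do_step(done)   # e.g. one training step, then checkpoint
            done += 1
        # else: stay paused; a real loop would sleep between polls
    return done

# Demo with a scripted sequence of intensity readings:
readings = iter([200.0, 120.0, 180.0, 90.0, 100.0])
completed = []
n = run_with_pausing(3, lambda: next(readings), completed.append)
print(n, completed)  # 3 [0, 1, 2]
```

In the demo, work pauses on the 200.0 and 180.0 readings and proceeds on the three low-intensity ones, which is the behavior the research credits with additional savings.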

All of these carbon savings, it should be noted, apply to a single training run of an ML model. Researchers still lack accurate figures for the carbon footprint of the full lifecycle, according to the release, which includes “the initial exploratory training phases all the way through hyperparameter tuning and deployment, and monitoring of the final model.”

Some companies that operate cloud platforms, Microsoft included, power their data centers with 100% carbon-neutral energy, relying on Renewable Energy Credits and Power Purchase Agreements. Microsoft notes that this is not exactly the same as running on clean energy. To move closer to that goal, Microsoft is scheduling Windows updates during the periods of lowest marginal carbon intensity.

Calculating AI’s carbon footprint is becoming “big in the industry,” as Buchanan put it.

For instance, “Google has published a blog on ‘datacenters work harder when the sun shines brighter’ as well as a region selector that helps you optimize for carbon, cost and latency. There are many companies taking advantage of this through the Green Software Foundation’s Carbon Aware SDK project.”

As these efforts among companies continue, the demand for centralized and interoperable tools, which can help scale these efforts, will grow.