From Structs and Lambdas to CO2 Emissions: Microsoft Software Development Gets Greener
As humanity plummets into what some might call a suicidal climate-change death spiral, Microsoft-centric software development is increasingly adopting "green" practices, such as lowering CO2 emissions, in order to slow the roll.
More and more, developers who previously worried about working with structs, lambdas and loops are now taking into account techniques to tune machine learning models and optimize network traffic to lessen environmental impacts.
Microsoft has undertaken myriad efforts that target sustainable software engineering and sustainable computing in general. In fact, the company is even taking the remarkable step of factoring progress on sustainability goals into executive pay. That's likely to speed things up (more on that below).
Another example: Microsoft developer blogs now have a "Sustainable Software" section, with multiple new posts published just in the last month.
The introductory blog post of that series, in August 2020, was written by Scott Chamberlin, Principal Software Engineering Lead, who said, "Green Software Engineering is an emerging discipline at the intersection of climate science, software practices and architecture, electricity markets, hardware and datacenter design." Chamberlin pointed to eight Principles of Green Software Engineering that help guide Microsoft's efforts. It's a personal project guided by Asim Hussain, Green Cloud Advocacy Lead at Microsoft, who also contributes to the Sustainable Software blog. The eight principles are:
- Carbon: Build applications that are carbon efficient.
- Electricity: Build applications that are energy efficient.
- Carbon Intensity: Consume electricity with the lowest carbon intensity.
- Embodied Carbon: Build applications that are hardware efficient.
- Energy Proportionality: Maximize the energy efficiency of hardware.
- Networking: Reduce the amount of data and distance it must travel across the network.
- Demand Shaping: Build carbon-aware applications.
- Measurement & Optimization: Focus on step-by-step optimizations that increase the overall carbon efficiency.
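To make a couple of those principles concrete, here is a purely illustrative sketch of the Demand Shaping and Carbon Intensity ideas: a carbon-aware scheduler that defers flexible work when grid carbon intensity is high. The function name, threshold and intensity figures are hypothetical, not drawn from any Microsoft tool.

```python
# Illustrative sketch of the "Demand Shaping" and "Carbon Intensity"
# principles: defer flexible work when grid carbon intensity is high.
# The threshold and intensity values below are hypothetical.

CARBON_THRESHOLD = 200  # gCO2eq/kWh; hypothetical cutoff for "clean enough"

def should_run_now(carbon_intensity: float, deadline_hours: float) -> bool:
    """Run immediately if energy is clean or the deadline is imminent;
    otherwise defer in hopes of a lower-carbon window."""
    if deadline_hours <= 1:
        return True  # no slack left; meeting the deadline beats carbon savings
    return carbon_intensity <= CARBON_THRESHOLD

# Example: a nightly batch job with 8 hours of scheduling slack
print(should_run_now(350, 8))  # high-carbon hour, plenty of slack -> False
print(should_run_now(120, 8))  # low-carbon hour -> True
```

The same shape of decision can be applied to any deferrable workload, such as batch analytics or ML training runs.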
Along with those principles, displayed on the Principles.Green site, are specific techniques for applying them to common architectures; for example, the site covers Web-Queue-Worker, N-tier and Microservices. Drilling down into those areas, you can find guidance such as optimizing network traffic, increasing compute utilization, optimizing databases and more.
Drilling down further into the Web-Queue-Worker area (handling HTTP requests and handling time- or processing-intensive operations), you can find network optimization techniques to reduce the amount of traffic an architecture creates per operation, along with the distance each request and response travels. Specifically, those include:
- Consider using caching headers, which allows browser caches and proxy caches to have enough information to confidently cache static assets. Caching static assets at the browser or proxy level allows future requests for those assets to be handled by those caches and reduces network traffic to your application.
- Consider using a CDN to distribute your application's static assets closer to the source of a request. This distribution of assets reduces the distance all requests for static assets have to travel over the network.
- Where possible, reduce the size of your bundles and static assets, and optimize them.
- Consider using compression and decompression for data you transmit over the network. Compression and decompression usually take less overall energy than transmitting uncompressed data over the network.
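Two of those techniques, caching headers and compression, can be sketched with nothing but the standard library. The helper names here are ours, and the max-age value is an arbitrary example:

```python
# Sketch of two network-optimization ideas from the list above:
# a long-lived Cache-Control header for static assets, and gzip
# compression of a response body before transmission.

import gzip

def static_asset_headers(max_age_seconds: int = 86400) -> dict:
    """Headers that let browser and proxy caches serve a static asset
    without re-contacting the origin until max_age expires."""
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}, immutable",
        "Vary": "Accept-Encoding",
    }

def compress_payload(body: bytes) -> bytes:
    """Gzip a response body; for text-heavy payloads this typically costs
    less energy than pushing the uncompressed bytes over the network."""
    return gzip.compress(body)

payload = b"<html>" + b"static content " * 500 + b"</html>"
compressed = compress_payload(payload)
print(static_asset_headers()["Cache-Control"])  # public, max-age=86400, immutable
print(len(compressed) < len(payload))  # True: repetitive text compresses well
```

In a real web framework these headers would be attached to static-file responses, and compression would usually be negotiated via the client's Accept-Encoding header rather than applied unconditionally.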
For hands-on learning of the above, Microsoft offers a course titled The Principles of Sustainable Software Engineering, consisting of 12 units that the company estimates will take 33 minutes to work through.
Following those eight principles and other guidance, Microsoft has approached sustainable software engineering on many fronts, including the Azure post A Visual Guide To Sustainable Software Engineering, published a few months ago.
While most Microsoft nuts-and-bolts software development posts don't say much about green techniques or sustainable software engineering, Microsoft's Bill Johnson points out there is a role for individual software engineers. Noting the broader carbon reduction efforts of major cloud providers, Johnson said, "it's not always easy to draw a line between the code we write and sustainability efforts like these."
Johnson, Principal Software Engineering Manager on the Azure SRE team, positions software engineers on the technical level of three levels of sustainable engineering that also include operational (DevOps/site reliability engineer) and environmental (sustainability engineer). He said sustainable software engineering "is about finding the balance between the technical, operational, and environmental aspects of a system to provide an optimal level of sustainability."
"Technical sustainability covers the direct decisions we make for the system to produce its desired results," Johnson said. "This includes both hardware decisions (CPUs, memory, networks) and software decisions (language, architecture, complexity) as well as things like latency in the system, testing requirements, or the scale up/out requirements. You can loosely think of this as 'traditional' software engineering."
Individual Microsoft software developers/engineers/data scientists are indeed doing their part. One close-to-home example is Dr. James McCaffrey of Microsoft Research, a pre-eminent data scientist who writes The Data Science Lab in his role as a senior technical editor for Visual Studio Magazine. He was recently featured in a Pure AI article about intelligent sampling of huge machine learning datasets to reduce costs and maintain model fairness.
McCaffrey combined with fellow researchers Ziqi Ma, Paul Mineiro and KC Tung for a technique that also had green connotations, explained under the heading "Energy Savings and CO2 Emissions." It read:
Large, deep neural network machine learning models, with millions or billions of trainable parameters, can require weeks of processing to train. This is costly in terms of money as well as in associated CO2 emissions. Training a very large natural language model, such as a BERT (Bidirectional Encoder Representations from Transformers) model can cost well over $1 million in cloud compute resources. Even a moderately sized machine learning model can cost thousands of dollars to train -- with no guarantee that the resulting model will be any good.
It's not unreasonable to assume a near-linear relationship between the size of a training dataset and the time required to train a machine learning model. Therefore, reducing the size of a training dataset by 90 percent will reduce the time required to train the model by approximately 90 percent. This in turn will reduce the amount of electrical energy required by about 90 percent, and significantly reduce the amount of associated CO2 emissions.
For example, a commercial airliner flying from New York to San Francisco will emit approximately 2,000 lbs. (one ton) of CO2 into the atmosphere -- per person on the plane. This is a scary statistic. And unfortunately, it has been estimated that the energy required to train a large BERT model releases approximately 600,000 lbs. of CO2 into the atmosphere. In short, reducing the size of machine learning training datasets can have a big positive impact on CO2 emissions and their effect on climate conditions.
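The quoted passage does not include the researchers' actual algorithm, but the general idea can be sketched with generic stratified sampling: shrink a training set while preserving each class's proportion, so the smaller set stays representative, then apply the near-linear energy arithmetic. All names and the energy figure below are hypothetical.

```python
# Illustrative sketch (not the researchers' published method): shrink a
# training set by stratified sampling so class proportions are preserved,
# then estimate the energy saving under the near-linear assumption.

import random
from collections import defaultdict

def stratified_sample(rows, label_of, fraction, seed=0):
    """Keep roughly `fraction` of rows, sampled per class label, so the
    smaller training set preserves the original class balance."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for row in rows:
        by_label[label_of(row)].append(row)
    sample = []
    for group in by_label.values():
        k = max(1, round(len(group) * fraction))
        sample.extend(rng.sample(group, k))
    return sample

# 1,000 rows with a 90/10 class imbalance
data = [("x", 0)] * 900 + [("y", 1)] * 100
small = stratified_sample(data, label_of=lambda r: r[1], fraction=0.10)
print(len(small))  # 100 rows, still split 90/10 between the classes

# Under the near-linear assumption, keeping 10% of the data cuts training
# time -- and therefore energy and CO2 -- by roughly 90%.
baseline_kwh = 1000.0  # hypothetical full-dataset training energy
print(baseline_kwh * len(small) / len(data))  # 100.0 kWh, ~90% saved
```

The fairness angle is why the sampling is stratified: a naive uniform sample of an imbalanced dataset can under-represent minority classes, degrading the model for exactly those groups.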
Those concerns were summarized by Tung, who said, "I was surprised to learn how much CO2 is released during machine learning model training. The current approach for building ML models is not sustainable and we will hit a ceiling soon, if not already."
For this article, McCaffrey shared his thoughts on the matter.
"Many of us at Microsoft watched as the company went all-in on cloud computing -- it was one of those key inflection points that large companies go through every 10 years or so," McCaffrey said. "I vividly remember seeing for the first time some photographs of one of Microsoft's datacenters -- with huge buildings and rack after rack of server machines. That image more or less galvanized many of us towards the reality of a new era of computer science where scales are gigantic -- quite a change from the days of early PCs.
"A natural consequence of large scale is large impact. One of the earliest projects that I worked with in my role as the director of the internal Microsoft AI School was an effort to use machine learning to reduce the energy used by a Microsoft datacenter in Quincy, Wash. There's a delicate balance between cooling the facility, which is very expensive, and saving cooling costs, which can result in increased hardware failures. This energy-saving effort, and others like it, naturally led to investigations of the impact on environmental factors, such as CO2 emissions and their impact on global climate. With large scale, a small improvement in efficiency can have a big impact."
Microsoft's software engineering greenness is part of a broader scope for all computing efforts that began well over a decade ago. A big milestone in that effort was a Jan. 16, 2020, post that said "Microsoft will be carbon negative by 2030." It started out: "The scientific consensus is clear. The world confronts an urgent carbon problem. The carbon in our atmosphere has created a blanket of gas that traps heat and is changing the world's climate. Already, the planet's temperature has risen by 1 degree centigrade. If we don't curb emissions, and temperatures continue to climb, science tells us that the results will be catastrophic."
Earlier this year, the company provided a one-year progress report, part of which listed these items:
- We forecast that in our first year we reduced Microsoft's carbon emissions by 6 percent, or roughly 730,000 metric tons.
- We have purchased the removal of 1.3 million metric tons of carbon from 26 projects around the world.
- We are committing to transparency by subjecting the data in our annual sustainability report to third-party review by the accounting firm Deloitte and to accountability by including progress on sustainability goals as a factor in determining executive pay, starting with our next fiscal year.
A key figure in Microsoft's efforts is Dr. Lucas Joppa, Chief Environmental Officer. As McCaffrey explained, "Lucas was very passionate about the importance of environmental issues and he quickly formed an organization within Microsoft to look at these issues."
In January Joppa commented on the one-year carbon negative progress report mentioned above: "As Microsoft's Chief Environmental Officer, I know it won't be easy to achieve these commitments. It will take the entire decade and it won't happen if we 'set it and forget it.' It will be the result of a decade of purposeful action to enact operational and systemic changes. But over the next decade we will act in accordance with what we think needs to be done today to create the world we need to be operating in by 2030."
About the Author
David Ramel is an editor and writer for Converge360.