

Why Your Organization Needs Sustainable Software

By Abhishek Gupta, Sylvain Duranton, Matthew Kropp, and Niels Freier

Software improves business efficiency and, in many cases, helps reduce energy waste and carbon emissions. But the ever-increasing energy needs of data centers underline that running software also causes carbon emissions. As businesses and other organizations turn to artificial intelligence, the situation could worsen because of AI’s need for processor-intensive “training.”
  • Many emissions are generated in cloud data centers and are therefore hidden in the company’s digital supply chain. Getting data on these emissions, even if that data is imprecise, is the first step to understanding the issue.
  • More efficient processing may make emissions rise, not fall, because of the so-called Jevons paradox: the efficiency gains from new hardware can be more than canceled out by greater software use.
  • The impetus should come from the top. Development teams are open to writing more efficient software but may need high-level backing for change.
Software emissions will rise up the agenda as companies work their way through other, more significant sources of emissions. Customers and other stakeholders have yet to wake up to this issue; taking action today means companies can have good answers ready when the problem of sustainable software becomes widely known.

Software drives today’s organizations but is a surprisingly large source of carbon emissions as well. It’s time for business leaders to act before the cost of fixing the problem rises sharply.

The issue is becoming more pressing as businesses and other organizations accelerate their deployment of artificial intelligence (AI). This brings substantial benefits but also a massive increase in the amount of processing power consumed. Unless companies improve the sustainability of their software today, they risk developing software that will run hot unnecessarily for many years. The cost could be substantial, in terms of bills to cloud providers as well as businesses’ ability to meet the emissions reduction targets many have set for themselves. It’s important to underline that the technology and communications sector uses a lot more power—and therefore emits a lot more CO2—than many realize.

Data Centers and the Thirst for Electricity

To see the power-hungry nature of today’s software, look at the ever-rising energy needs of data centers, where much modern software is run. Data centers worldwide used 220 to 320 terawatt-hours of electricity in 2021, according to the International Energy Agency (IEA)—more than the entire power consumption of, say, South Africa, Sweden, or Egypt. (This figure excludes the power consumed by cryptocurrency mining, which we don’t discuss in this article.)

Demand is growing so fast that popular locations such as Ireland and Singapore have paused new data center approvals because of the strain on their national power systems, according to the Irish Times and Channel News Asia, respectively. But these data centers are only part of the total emissions from the overall technology and communications sector. It’s not easy to assess the broader picture—and even harder to calculate the impact on emissions. Still, a 2018 study in the Journal of Cleaner Production estimated that the sector’s power consumption in 2040 would be double the 2020 level, which would make tech and communications responsible for 14% of total global carbon emissions.

The Hidden CO2 Emissions

Firms whose business is founded on information processing, such as some in financial services, should already be aware of the impact their operations have on emissions. But for many companies the issue is hidden, for a variety of reasons. If software is run in the cloud, for instance, any emissions become a supply chain issue. And some organizations may be so busy focusing on their own direct emissions right now that their indirect emissions are not a priority.

Although certain companies may recognize the issue yet simply have different priorities today, software emissions will become a more significant part of their total as they reduce emissions in other areas. Take, for example, a company with a vehicle fleet. The current priority is to electrify the fleet and buy renewable energy to charge it. Once that is done, though, the business may find that its AI-powered route optimization software is a substantial source of emissions.

That’s why action is needed now. By building sustainable software today, companies can avoid risky and complex software rework when the issue rises up the priority list.

AI is an area of particular focus owing to its potentially heavy computational load, and as it has become more valuable in real-world applications, the computing power consumed has increased exponentially. Researchers Dario Amodei and Danny Hernandez reported in a 2018 blog post that, by their calculations, the computing power used in the largest AI training runs had doubled every 3.4 months from 2012 to 2017. Here, a focus on computational efficiency connects well with the wider debate on the most ethical and effective ways to deploy AI, as articulated in “The Imperative for Sustainable AI,” published in 2021 in The Gradient. BCG GAMMA, for instance, already includes social and environmental impact as one of its “6+1” guiding principles for responsible AI.

Of course, digitization and AI remain, fundamentally, forces for good in the fight against climate change. AI helps today’s digital business become more efficient, whether it is decreasing the miles traveled by trucks or the energy needed to heat or cool offices. There is also a burgeoning industry in assessing AI’s “handprint”—the carbon emissions reduced through the use of AI. In this context, AI can be a game changer.

Why Measurement Is a Challenge

Unfortunately, assessing the amount of emissions generated by an organization’s software is difficult:

  • The move to cloud computing, while inherently more efficient, brings additional measurement problems. Companies must rely on the tools provided by their cloud providers to track how much CO2 their tasks have produced, but these tools may make favorable assumptions about energy use. For instance, a provider may claim to run on 100% renewable energy without matching its energy needs with renewable supply on an hour-by-hour basis. In reality, its data centers may still rely heavily on nonrenewable energy, and the software will still contribute to emissions. (A simple illustration of hour-by-hour accounting follows this list.)
  • Globally, emissions from network equipment are even higher than those from data centers, according to the IEA study. But network providers have been slow to offer calculators that help organizations measure the emissions generated by their activities.
  • Some company software may be run on hardware owned by customers or other users. The companies will have very little data on this equipment, so calculating emissions will be an even greater challenge.
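
To make the hour-by-hour point concrete, here is a minimal, illustrative sketch (in Python) of how a workload’s emissions could be estimated from hourly grid carbon-intensity data rather than from an annualized renewable-energy claim. All figures and names below are hypothetical placeholders, not measurements from any provider.

    # Illustrative only: estimate a workload's operational emissions hour by hour,
    # rather than relying on an annualized "100% renewable" claim.
    # All numbers below are hypothetical placeholders.

    hourly_energy_kwh = [12.0, 11.5, 14.2, 30.8]      # energy drawn by the workload, per hour
    hourly_grid_gco2_per_kwh = [420, 390, 180, 510]   # grid carbon intensity for the same hours

    def estimate_emissions_kg(energy_kwh, intensity_gco2_per_kwh):
        """Sum hour-by-hour emissions and convert grams to kilograms of CO2e."""
        grams = sum(e * i for e, i in zip(energy_kwh, intensity_gco2_per_kwh))
        return grams / 1000.0

    print(f"Estimated emissions: {estimate_emissions_kg(hourly_energy_kwh, hourly_grid_gco2_per_kwh):.1f} kg CO2e")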

Cloud computing is also the enabler of the hungriest, most processor-intensive task that most firms undertake: training AI models. This is where AI “learns,” discovering the hidden rules and connections inside a mass of data. A research team at the University of Massachusetts Amherst found that, at the upper bound, training one model could generate 626,155 pounds (about 280 metric tonnes) of CO2. That’s equivalent to 470 people taking return flights from New York to San Francisco, according to the International Civil Aviation Organization Carbon Emissions Calculator.
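
As a rough arithmetic check of those figures (our back-of-the-envelope calculation, not the study’s), converting pounds to metric tonnes and spreading the total across 470 passengers gives roughly 0.6 tonnes of CO2 per New York–San Francisco round trip:

    # Back-of-the-envelope check of the figures quoted above (our arithmetic, not the study's).
    pounds = 626_155
    tonnes = pounds * 0.453592 / 1000   # 1 lb = 0.453592 kg; 1,000 kg per metric tonne
    per_passenger = tonnes / 470        # spread across 470 round-trip passengers
    print(f"{tonnes:.0f} tonnes CO2 in total, about {per_passenger:.2f} tonnes per round trip")
    # Prints roughly 284 tonnes in total, consistent with the rounded figure above.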

In a sign of the complexity of measuring software emissions, Google contested the data in this study. The company said in 2022 that optimization and other techniques could produce equivalent training results with 0.00083% of the emissions.

The Impact as AI Becomes Widespread

Although there is a lot of activity assessing the sustainability of training AI, the balance of power consumption will tip as the technology becomes more common, shifting toward the inference stage, where the AI uses the rules discovered in training to provide valuable results. Google, a significant user of AI, estimated earlier in 2022 that three-fifths of its AI energy use went to inference and two-fifths to training. This shift will make the calculation of emissions even harder, as inference processing may be done on the user’s or customer’s device.

And don’t rely on cheaper, faster processing power, such as specialist processor chips that are highly efficient at AI work, to decrease emissions over time. The IT world may exhibit the Jevons effect, a paradox named after English economist William Stanley Jevons. In 1865, Jevons noted that as coal-fired machines became more efficient, the demand for coal increased, rather than falling as might have been expected. Suppose the Victorian economist were observing in 2022. He might discover the same effect with cloud computing: the improved efficiency has opened a new world of applications, which means total processing demand is soaring. The IEA estimates that energy use by large data centers has been increasing at an annual rate of 10% to 30% over the past several years.

A Plan for Effective Action

So who needs to make the first move on this issue? We argue that the chief executive, backed by the chief sustainability officer, should step up to the plate. They, after all, are responsible for the company’s environmental, social, and governance (ESG) profile. These leaders should take three initial actions.

Open a dialogue with the chief technology officer. CTOs are already under immense pressure to deliver on their company’s digital agenda. They need backing from above if they are to insert extra steps into the development process to reflect an additional focus on curbing emissions. Business units are demanding that their AI projects get delivered, and if CTOs are going to alter timelines to ensure that the software is sustainable, they need higher-level cover.

Many CTOs are already aware of this emerging issue. The nonprofit Green Software Foundation, helped by participation from some of the biggest names in the industry as well as influential users, has been working tirelessly to raise the topic. (Disclosures: one of the authors of this article chairs the Green Software Foundation’s working group on standards; BCG is a steering committee–level member.)

Empower the team. Our experience is that technologists generally support and understand the need to avoid wasteful software. They like creating technically elegant solutions the first time and don’t relish the idea of reworking software in 18 months to make it more efficient.

Once these professionals are set on the path, they will soon identify the right tools that integrate smoothly into the existing workflow and provide the emissions reductions required. This will ensure that the move to sustainability keeps productivity—and hence the digital agenda—on track.

Ask for numbers. For most companies, disclosure of emissions from the supply chain—such as cloud computing—is not yet compulsory under the Task Force on Climate-Related Financial Disclosures (TCFD) guidelines. Yet these recommendations are becoming the de facto world standard (including a version that may become mandatory for public companies in the US). Even if disclosure is not yet required, it’s time to start asking for proper numbers. The scope should be as wide as possible, including the impact of your software running on the hardware of users and clients. You also need to take a close look at whether your cloud or data center providers are greenwashing the emissions they create when your software is run.

You may find that your estimated emissions from software turn out to be fuzzy or get revised. That’s a general problem for emissions reporting but particularly so when calculating the emissions caused by software, which is a nascent domain. The problem is most acute when processing is done on shared hardware such as cloud or global networks, where there can be uncertainty about both the emissions generated and the percentage attributable to your company. It is more acute still if the processing is done on the phone or computer of users or clients; you may be able to glean basic information about their hardware but will certainly know nothing about the energy that is powering it.

Standards Would Help, but Don’t Wait for Them

In time, we expect industry initiatives will offer standardization and, better yet, verifiability that will build trust, drive broader adoption, and eliminate greenwashing. The Green Software Foundation has laid the groundwork for this with its Software Carbon Intensity specification. This life cycle approach aims to help consumers of software services make smart choices about the carbon impact of the software they deploy. But the numbers don’t have to be perfect for you to start making meaningful changes to the platforms that run your software.
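
As a rough illustration of the life cycle idea: the specification expresses carbon intensity as a rate per functional unit. On our reading, the rate takes the form SCI = ((E × I) + M) per R, where E is the energy consumed by the software, I is the carbon intensity of that energy, M is the embodied emissions of the hardware used, and R is the functional unit. The sketch below applies that form with entirely hypothetical numbers.

    # Illustrative Software Carbon Intensity (SCI)-style calculation, based on our
    # reading of the specification: SCI = ((E * I) + M) per R. All values are hypothetical.
    E = 120.0       # energy consumed by the software over the period (kWh)
    I = 450.0       # carbon intensity of that energy (gCO2e per kWh)
    M = 15_000.0    # embodied hardware emissions allocated to the software (gCO2e)
    R = 1_000_000   # functional unit, e.g. API calls served over the same period

    sci = ((E * I) + M) / R
    print(f"About {sci:.3f} gCO2e per API call")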

On metrics, it is vital to understand the difference between attributional and consequential accounting methods. (See the exhibit.) Attributional accounting measures all the emissions attributed to a company, including those from its supply chain; consequential accounting measures the total change in emissions caused by a decision without respect for organizational boundaries. This is an important distinction for all accounting of greenhouse gases but particularly for software. To see the difference, consider a change that slightly reduces the load on a central server but requires much more processing from millions of users. Attributional accounting would tally this as an improvement for the developers of the system, while consequential accounting (and the planet) would record this as a deterioration.
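
A hypothetical worked example makes the gap visible. Suppose a change saves 1,000 kg of CO2e a year on a company’s own servers but adds 2 grams a year on each of a million user devices (all figures invented for illustration):

    # Hypothetical numbers illustrating the server-versus-devices trade-off described above.
    server_saving_kg = 1_000.0        # annual saving on the company's own servers
    per_device_increase_kg = 0.002    # annual increase per user device (2 grams)
    devices = 1_000_000

    attributional_change = -server_saving_kg                                      # company's own footprint only
    consequential_change = -server_saving_kg + per_device_increase_kg * devices   # all affected emissions

    print(f"Attributional view: {attributional_change:+,.0f} kg CO2e (looks like an improvement)")
    print(f"Consequential view: {consequential_change:+,.0f} kg CO2e (a net increase)")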

Pressure and Incentives for Action

Although tackling software emissions may look like a chore, there are at least two returns on investment.

The first is helping combat climate change. As a species, we collectively suffer the grim impact, from California wildfires to catastrophic flooding and landslides in India. Playing a part in slowing climate change is a good return on investment and is the morally right thing to do.

The second is more mercantile. For years, consumers have been increasing the pressure on companies to address sustainability; this is now becoming commonplace for corporate buyers. A 2022 study by the MIT Center for Transportation & Logistics, for instance, showed investors, senior executives, and corporate buyers all demanding that companies improve sustainability in their supply chain. Firms whose entire business is driven by software, such as software as a service (SaaS) companies, are likely to be first in the firing line here. Companies need to be ready with good answers when they start getting asked tough questions about software emissions.

In time, there will likely be a software equivalent of the certifications used for sustainability in buildings, such as LEED in the US or BREEAM in many other countries. This may focus on AI or may be a broad hallmark applied to a firm’s entire IT operations, like the ISO 27001 standard for information security management.

Managers must not let sustainability concerns halt their AI deployment programs; the business benefits of AI are overwhelming and cannot be put on pause. If your AI project is ready, it should be implemented.


It’s time for organizations to look more closely at the emissions from their software. And if they find these emissions surprisingly high (or growing fast), companies need to act without delay. This applies in particular to firms rolling AI out at scale—which will soon be almost every business. Taking action today means companies will have software emissions under control when customers and other stakeholders start asking difficult questions tomorrow.
