Why the Technology Economy Matters

By Howard Rubin, Christophe Duthoit, Hrishi Hrishikesh, and Ralf Dreischmeier

This is the second in a series of articles on technology economics.

Despite technology’s starring role in business and everyday life, many observers openly question whether it has really had much of an impact on the global economy. Their skepticism is misplaced.

As we demonstrated in the first article in this series, technology plays a vital role in boosting company performance. (See “Why Technology Matters,” BCG article, September 2016.) In short, we’ve found that companies with high technology intensity have high gross margins. (Technology intensity is a proprietary metric that analyzes technology spending relative to a company’s and an industry’s revenues and to their operating expenses.)

But if technology is so important, many economists ask, why hasn’t the digital revolution generated the hoped-for increases in traditional macroeconomic metrics such as GDP and productivity? For example, annual productivity growth in the US from 2007 through 2015 hovered at a sluggish 1.3% average rate, half the rate from 2000 to 2007. The US economy experienced three consecutive quarters of falling productivity, from the fourth quarter of 2015 through the second quarter of 2016, the longest slide since the late 1970s.

Critics point to technology’s failure to deliver. The failure may be one of imagination rather than of technology itself, however. As we will show in this article, declines in technology investment are followed by startling drops in macroeconomic growth. In fact, you can see that the technology economy has close relationships with GDP, productivity, and other measures of economic health—if you look closely.

The Effect of Technology on Economic Growth

For years, economists have cast doubts on the importance of technology to economic growth. The apparent powerlessness of new technologies to improve productivity has become known as the Solow paradox, named after Nobel Prize–winning economist Robert Solow. “You can see the computer age everywhere but in the productivity statistics,” Solow said in 1987.

In his book The Rise and Fall of American Growth: The U.S. Standard of Living Since the Civil War, economist Robert Gordon argued that the new technologies of today are not as world changing as were, for example, electrification, cars, and wireless communications during the second Industrial Revolution. Others argue that information technology could be at a stage of development in which its potential impact has not yet revealed itself, just as early-20th-century inventions, such as electric lighting, failed to immediately lift the slow productivity growth that prevailed after their introductions.

John Fernald, a leading expert on productivity at the Federal Reserve Bank of San Francisco, has determined that the recent slowdown in productivity was not driven by factors such as housing, educational attainment, capital intensity, or the Great Recession that began in late 2007. Technology itself was the cause. As industries reorganized after the internet explosion that began in the mid-1990s, the potential for transformative gains from technology shrank dramatically. Fernald recently asserted that, in general, measurement errors are not to blame and that free internet services have a negligible impact on the economy.

Erik Brynjolfsson and Andrew McAfee, codirectors of the MIT Initiative on the Digital Economy, argue more optimistically that productivity increases associated with new technologies happen only after a long period of time, when technologies become powerful and cheap enough for their truly transformative powers to kick in. Others say that we’re only just now beginning to see the transformative potential from recent innovations such as big data, artificial intelligence, advanced robotics, nanotechnology, and biotechnology. Robert Atkinson, of the Information Technology & Innovation Foundation, argues that over the next few decades, the US may see productivity rise to perhaps 3.0% to 3.5% per year—as much as a percentage point higher than the relatively rapid pace of 1995 through 2007—once transformative technologies such as these come into wide use.

In general, we agree with a more optimistic line of reasoning about technology, but we have reached a different conclusion. We think that in many cases, traditional measures of economic growth don’t take into account important benefits of technology and are less relevant to prosperity than they were in a mass-production world. For example, GDP, an important factor in the calculation of productivity, fails to capture many technology-generated improvements in living standards. These benefits include the greater convenience and better customer experience provided by digital services and the vast amount of information—such as online maps, search results, and social media—available for free and with zero marginal distribution cost. Measurement flaws such as these could partially explain why productivity growth has been so slow over the past few decades, at least according to current metrics.

Rather than seeing technology as having a marginal effect on productivity, we have found a strong relationship between technology spending and economic growth as measured by productivity and GDP. For example, executives can predict with some accuracy the impact on the overall economy of a decline in technology spending. Whenever companies cut back on discretionary spending in order to shore up profits during a downturn, they slash their investments in technology. Soon afterward, GDP falls dramatically, and, within a few years, labor productivity across the economy falls. (Remember that technological innovation is an important component of productivity.)

The drop in technology intensity that results from a decline in technology spending causes the labor force to shrink, which shows up in productivity up to three years later because productivity is a “stickier” measure. Exhibit 1 shows the relationship between technology intensity and GDP. (A similar pattern exists for productivity.)

The global economy is showing other signs of this effect, as Mary Meeker, a general partner at Kleiner Perkins Caufield & Byers, recently highlighted in her influential 2016 Internet Trends report. Global GDP growth has been lower than the 20-year average in six of the past eight years. As GDP comes under pressure, global growth in the use of technologies such as the internet and smartphones has slowed. This downward cycle reduces new opportunities for productivity and GDP growth.

One likely explanation for the past decade’s slowdown in productivity, as reflected in the official statistics, is that economists and business leaders are not tracking the metrics that make the impact of technology most evident. To see the economic lift from technology, which centers on digital information, we may need to look beyond the traditional economy, which centers on the physical, Industrial Age world.

Measures such as the price of cloud storage, data processing rates, broadband speed, and 21st-century skill development could be more relevant. That requires a shift in thinking about how we invest in technology and how we measure its macroeconomic effects.

Investing for Critical Mass

In addition to arguing that existing metrics have failed, we maintain that the slowdown in productivity also signals a failure to reach a critical mass of technology. Despite rapid growth in spending and technology’s significant impact, the level of technology intensity at the world’s companies fell steadily from 2005 through 2015. Yes, you read that right: despite record spending on technology, technology intensity is plummeting.

While technology expenses are rising faster than the revenues that result from those investments, operating expenses are rising even faster. (The higher ratio of technology spending to revenues in the calculation of technology intensity is offset by a disproportionately lower ratio of technology spending to operating expenses. See the first article in this series for a discussion of how technology intensity is calculated.) In effect, companies are getting less and less for their significant investment in technology.
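A stylized example makes this arithmetic concrete. The actual technology intensity formula is proprietary, so the sketch below assumes a simplified proxy: the average of technology spending as a share of revenue and technology spending as a share of operating expenses. With invented figures, it shows how the proxy can fall even as technology spending grows.

```python
# Illustrative only: the real technology intensity metric is proprietary.
# We assume a simplified proxy that averages technology spending as a share
# of revenue and as a share of operating expenses.

def tech_intensity_proxy(tech_spend: float, revenue: float, opex: float) -> float:
    """Hypothetical, simplified proxy for technology intensity."""
    return 0.5 * (tech_spend / revenue) + 0.5 * (tech_spend / opex)

# Year 1 baseline (figures in $ millions, invented for illustration).
year1 = tech_intensity_proxy(tech_spend=50, revenue=1000, opex=600)

# Year 2: technology spending grows 10% and revenue only 5%, but operating
# expenses grow 25% -- faster than technology spending.
year2 = tech_intensity_proxy(tech_spend=55, revenue=1050, opex=750)

print(f"Year 1 proxy: {year1:.4f}")  # 0.0667
print(f"Year 2 proxy: {year2:.4f}")  # 0.0629 -- lower, despite record spending
```

In this illustration the ratio of technology spending to revenue rises, but the ratio to operating expenses falls faster, dragging the composite down.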

This counterintuitive trend—companies are spending more but getting worse results—is the paradoxical result of some companies’ failure to spend enough on technology. Most companies spend about 5% of revenues on technology, which is not a staggering amount relative to other important expenses. In fact, the pendulum is once again swinging back toward a major slowdown in technology spending. Major banks, normally heavy spenders on IT, have announced 25% to 30% cuts in technology expenditures. IDC forecasts that global spending on IT is set to grow by only 2%, after growth of 5% to 6% over the past five years.

The situation is akin to delivering a vaccine that works for only 10% of the population because the company didn’t test enough variations of the vaccine. The company saved money, but that choice had negative repercussions.

What if, when times were tough during the Industrial Revolution, companies had cut back on machines, automation, waterways, or electricity? Today, many companies are scaling back a critical investment that could power the next wave of growth. In many cases, that investment could create huge leverage (lowering other expenses through automation, for example) much more quickly than technology spending rises. But that can happen only if companies manage their technology spending well. (We will explore this topic in greater detail in the next article in the series.)

New Ways to Measure Success in the Global Economy

If we’re looking in the wrong places and, paradoxically, not spending enough on technology, how can we gain a better understanding of the technology economy? We maintain that businesses can learn to think about economic growth in new ways, as well as develop new macroeconomic measures that highlight the impact of the technology economy.

A more nuanced way to think about productivity involves focusing on technology’s ability to increase reach and generate leverage. For instance, the internet enables companies to reach millions of potential customers, magnifying the results of their investments. Social networking services such as Twitter and Facebook change the productivity of reach: the incremental cost of reaching 3 million instead of 1 million people is zero. In addition, automation allows companies to replace labor-intensive manual processes with algorithms.

One day, executives will be able to measure the rise in productivity resulting from innovations such as self-driving vehicles and nanorobots. In the more immediate arena of IT-enabled health care, we are already starting to measure technology’s contribution to health care productivity. Better-trained physicians are able to make diagnoses more quickly and accurately. In other words, they are increasing their labor productivity, and this improvement—even if it doesn’t currently show up in official productivity statistics—leads to better health outcomes. For example, thanks in part to recent efforts to increase health care efficiency and institute value-based health care, inflation-adjusted Medicare spending per beneficiary has declined over the past few years, after years of rapid increases.

To keep bending the cost curve downward and thereby improving the productivity of health care overall, we need to take a fresh look at increasing efficiency. The Dell Medical School at the University of Texas at Austin is at the forefront of such efforts to improve health care productivity and outcomes: its curriculum aims at training doctors to navigate a collaborative, data-driven, results-oriented world.

Another area of productivity-related inquiry focuses on metrics of global labor costs. Thanks to the globalization of manufacturing and many other industries, companies have circled the globe looking for rapidly developing economies with low average wages. Now they are discovering that in terms of output per dollar of wages and other new measures, a low-wage country may not have an advantage over a high-wage country with high levels of automation. (See The Shifting Economics of Global Manufacturing, BCG report, August 2014.) Technology, among other factors that affect output, can greatly multiply the economic impact of each dollar paid in wages. As a result, output per dollar of wages is a much more revealing metric for decision makers than a country’s average wages.
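A simple comparison, using invented figures, shows why output per dollar of wages can favor the high-wage, highly automated plant:

```python
# Hypothetical plants; figures are invented for illustration only.

def output_per_wage_dollar(annual_output: float, annual_wage_bill: float) -> float:
    """Output generated per dollar of wages paid."""
    return annual_output / annual_wage_bill

# Low-wage, lightly automated plant: $40M of output on a $10M wage bill.
low_wage_plant = output_per_wage_dollar(40_000_000, 10_000_000)    # 4.0

# High-wage, highly automated plant: $120M of output on a $24M wage bill.
high_wage_plant = output_per_wage_dollar(120_000_000, 24_000_000)  # 5.0

print(low_wage_plant, high_wage_plant)  # the high-wage plant wins on this metric
```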

In addition to productivity, executives need to watch a macroeconomic measure that shows “flows” in the technology economy: the technology balance of trade, or the technology services exported per dollar imported. India, for example, exports $8.86 in technology services per dollar imported, while the US exports only $0.84 in technology services per dollar imported. (See Exhibit 2.) Understanding flows such as these helps companies identify promising markets and do a better job of predicting economic growth in the technology economy.
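The flow itself is simple arithmetic: technology services exported divided by technology services imported. A minimal sketch, with hypothetical dollar volumes chosen only to reproduce the ratios cited above:

```python
def tech_balance_of_trade(exports: float, imports: float) -> float:
    """Technology services exported per dollar of technology services imported."""
    return exports / imports

# Hypothetical volumes (in billions of dollars); a ratio above 1 marks a net
# exporter of technology services, a ratio below 1 a net importer.
print(tech_balance_of_trade(exports=88.6, imports=10.0))  # 8.86, the ratio cited for India
print(tech_balance_of_trade(exports=42.0, imports=50.0))  # 0.84, the ratio cited for the US
```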

Ultimately, these new ways of thinking about and measuring economic growth point to the need for new ways to discern whether companies are successfully navigating the technology economy. The first article in this series describes critical company-level metrics that measure the state of the digital world. Executives will also need to create, measure, and track virtual macroeconomic measures—and do that just as carefully as they work with metrics about the physical world. And they must adapt to changes in these indicators in near-real time. But to truly succeed, senior leaders must understand where they stand in relation to competitors—and act on that knowledge.

In the next article in this series, “How to Reach the Technology Economics Frontier,” we explore a new way to gauge success in the technology economy.