Philip Evans, a BCG Fellow and senior partner in the firm’s Boston office, focuses on the strategic implications of changing information economics. He is a coauthor of Blown to Bits, one of the handful of books from the dot-com era that are still worth reading today.
You’ve devoted much of your career to exploring how the evolution of digital technologies is disrupting the rules of business strategy. What is the connection between technology and strategy?
The relevance of technology to high-tech companies is obvious. But even for low-tech companies, strategy ideas rest on assumptions about technology.
The discipline of business strategy was founded by two intellectual giants: BCG’s founder, Bruce Henderson, and Harvard Business School’s Michael Porter. Henderson posited increasing returns to experience and scale, which led to the “Napoleonic” concept of concentrating mass against competitors’ weakness. These economics of mass depend on fixed-versus-variable costs, learning, and network effects, which are all grounded in technology. The experience curve, for example, was an early and more general statement of Moore’s law.
Porter pictured a business as a vertically integrated value chain of sequential and heterogeneous activities bonded together to economize on transaction costs: the costs of coordinating and aligning. Since transaction costs depend on communication economics, standards, uncertainty, and asymmetries of information, they, too, are grounded in technology.
So how is technology changing strategy?
At the broadest level, technology is driving down the costs of information processing, storage, and communication—the three big exponentials. In the Internet era, this has translated into three waves in the transformation of business strategy.
In the first wave, which started around 1995, technology drove down transaction costs, breaking up value chains and driving disintermediation and the collapse of some distribution-dependent businesses. Strategists could no longer take their value chains as given: they had to make hard choices about which pieces to protect and which to abandon. Suddenly they found that they could repurpose their advantaged assets to attack previously unrelated businesses.
That was “deconstruction” as you and Tom Wurster described it in Blown to Bits. Isn’t that an old story?
Yes and no. I remember giving a speech to the Newspaper Association of America in 2000 predicting the deconstruction of the newspaper industry. I envisioned the rise of focused competitors that would fundamentally disrupt the economics of the newspaper business by picking off the most profitable lines of business, such as classifieds. A famous newspaper magnate came up to me afterward and essentially told me that I was “full of it” (his exact phrase is not publishable). His counterargument was that the industry was enjoying its best year ever precisely because of the ad revenues coming from the dot-com boom. With regard to the short term, he was clearly right, and I was overstating my argument. But today we have all seen the devastating impact of craigslist.org, for example, on the newspaper industry’s profitability and viability.
As Bill Gates famously commented, we tend to overestimate the speed but underestimate the ultimate impact of disruptive technologies. The deconstruction story is far from over.
What are the other two waves?
The other two relate to Henderson’s economics of mass—which are polarizing. In some contexts, activities that used to be part of an integrated value chain exhibit increasingly negative returns: “small is beautiful.” Individuals connected by cheap communication technology and simple social platforms prove that they can collectively do things better, or at least cheaper, than conventional corporations. Self-organization, community collaboration, and a mixture of profit and nonprofit motivations substitute for the conventional corporate business model—in the purest cases, Linux, Firefox, Second Life, and Wikipedia. Narrower, but no less potent, is the exploitation of user- and consumer-created comments, reviews, and hacks by Amazon, Facebook, Netflix, and SAP.
Essentially Web 2.0...
Exactly. The second generation of the commercial Internet was about the polarization of economics of mass in favor of the very small, resulting in the fragmentation of key layers of the traditional value chain. Individuals took the place of organizations, consumers became producers, and one-way broadcast was supplanted by two-way dialogue. Linear value chains began, in many cases, to look more like networks. And people weren’t just chatting; they were doing stuff. Some companies—such as Microsoft, and those in the music industry—fought this, at least initially, with every available weapon. Others embraced it: IBM started investing in Linux, for example. Google published hundreds of application programming interfaces, making Google Maps, for instance, freely available for every small business’s website. Consumer goods companies and retailers discovered that their best marketers are their own customers.
But in addition—and marking, perhaps, the cusp of a third generation—some economics of mass have polarized in the opposite direction, toward massively increasing returns: “big is beautiful.” The most obvious example is big data. Not only is the world’s stock of information exploding, but we are increasingly able to connect the pieces together. There have always been increasing returns for statistical inference: larger databases yield more robust estimates and finer discrimination. Until now, the problem was that we couldn’t gather the data, array it, and do the sums. But thanks to the Internet of things, IP networks, and the hardware and algorithms of cloud computing, each of those three constraints is receding. So inference can now scale. The key strategic implication is that data, and the inference based on data, now scale beyond the traditional business definition of the individual corporation. Just as user creation drives the fragmentation of some activities, so big data drives the consolidation of others. The strategic question then becomes, what combination of data gathering, data alliances, cooperative data sharing, and trusting relationships with data subjects will create an advantage in inference?
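The claim that inference enjoys increasing returns to scale can be made concrete with a textbook statistic. The sketch below (illustrative only; it assumes the simplest case of estimating a mean from independent observations with known spread) shows how the standard error of an estimate keeps shrinking as the dataset grows:

```python
import math

def standard_error(n: int, sigma: float = 1.0) -> float:
    """Standard error of a sample mean: sigma / sqrt(n).
    Ten times the data buys roughly three times the precision,
    and the error keeps falling as n grows -- increasing returns
    to scale for statistical inference."""
    return sigma / math.sqrt(n)

for n in (100, 10_000, 1_000_000):
    print(f"n={n:>9,}  standard error={standard_error(n):.4f}")
```

Running it shows the error dropping from 0.1000 at a hundred observations to 0.0010 at a million—the quantitative sense in which “larger databases yield more robust estimates and finer discrimination.”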
And we’re not talking about just data: there are many other domains with the same characteristics, where the technology is scaling beyond the current business model.
But stay with big data for a moment. Isn’t the real challenge how companies can exploit it within their own four walls? Isn’t the challenge operational?
Of course—today. Technology always starts by looking like, well, technique. Cars start by looking like horseless carriages. The strategic implications become manifest only after enough users have mastered the practical challenges and refocused on gaining real advantage. That will happen with big data after the (brief) interlude it will take for staff to master Alteryx and Hadoop.
Consider genomics. Within a couple of years, we will be able to map a person’s genome for $100. In effect, genomics goes retail. It will shift from a research activity, in which researchers map representative genomes, to a clinical one, in which doctors map individual patients’ genomes. Medicine will be revolutionized, and the revolution—as Bill Gates might observe—will be bigger and slower than people expect. Big-data techniques will be used to see fine-grained patterns among individual genomic data, medical records, outcomes, real-time data from body sensors, and ambient data from the environment. Medicine will advance by decoding immense, cheap, noisy data sets instead of the small, expensive, clean data sets generated by clinical trials and laboratory experiments.
But...who owns the data? And how can it be aggregated when providers, insurers, device companies, pharmaceutical companies, Google, patients, and governments not only possess different pieces of the data elephant but guard them jealously and compete on their information advantage? How are privacy and patient rights going to be protected when the very specificity and granularity of the data make reidentification mathematically trivial?
We are going to need different kinds of institutions—trusted, neutral data repositories that together serve as a common large-scale infrastructure for the health care industry. It is already happening. For institutional and political reasons, it will happen more quickly in some jurisdictions than in others, but with medical inflation swallowing the entire economic growth of advanced economies, the process cannot be stopped merely because it is institutionally inconvenient. The data of big data will become infrastructure and will have to be managed as such. Responding to that is strategy.
So how do the pieces fit together?
Well, falling transaction costs permit value chains to break up. Polarizing economics of mass, however, require them to break up. Functions displaying diminishing returns fragment, quite possibly to the limiting case of self-organizing communities of user-producers. Functions displaying increasing returns, however, consolidate, quite possibly to the limiting case of an infrastructure monopoly.
There are huge potential gains for the business system because user communities are advantaged in certain kinds of distributed innovation and monopolies are advantaged in the efficient utilization of fixed assets. Both flourish in a symbiotic relationship, so the business system can finesse the traditional trade-off between static and dynamic virtues, between efficiency and innovation. Instead of oligopolistic competitors with similar value chains, we have an architecture of horizontal layers. Each layer operates in its own particular mix of markets, hierarchies, and communities, and of competition and collaboration. I call the collection of layers that compose any particular economic system a stack.
Isn’t the term “stack” common parlance in technology?
Yes, of course. A technical stack is a modular, layered, highly interoperable architecture, in which lower layers provide services to upper layers, but the reverse is not the case. Lower layers are designed for generality, simplicity, and efficiency. Upper layers are designed for customization, and they permit experimentation and innovation. Stacks have long been recognized as the technical solution to the problem of designing very complex systems that need to be simultaneously efficient and adaptable.
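The defining property of a stack—lower layers serve upper layers, never the reverse—can be sketched in a few lines. The class and method names here are illustrative, not from the interview; the point is only the one-way dependency:

```python
class Distribution:
    """Lower layer: general, simple, efficient.
    Knows nothing about the layers above it."""
    def deliver(self, payload: str) -> str:
        return f"delivered:{payload}"

class Service:
    """Upper layer: customized, free to experiment and innovate.
    Depends only downward, on the layer beneath it."""
    def __init__(self, lower: Distribution):
        self.lower = lower
    def handle(self, request: str) -> str:
        # Innovation happens up here without touching the layer below.
        return self.lower.deliver(request.upper())

stack = Service(Distribution())
print(stack.handle("power"))  # delivered:POWER
```

Because the dependency runs one way, the lower layer can be optimized for efficiency while upper layers are swapped, customized, and experimented with independently—the simultaneous efficiency and adaptability the interview describes.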
My point is that technology architectures are templates for business architectures. So whole industries will get stacked—often in contexts far removed from high tech.
Can you give us a low tech example?
The electric-power industry used to be a vertically integrated, regulated, municipal or regional business characterized by a one-way value chain whose stages were energy generation, long-distance transmission, local distribution, and retail consumption. Prices were set once a year by a regulatory authority. In many jurisdictions, generation was spun off as a separate, competitive business, but that was merely a prelude. In the next decades, distributed generation from renewable sources will increasingly be done by consumers themselves. And because renewable sources are intermittent, the timing of consumption and the economics of storage and of backup power sources all become key.
Consumers will need smart devices in smart homes to manage consumption. Their electric cars will become a major factor in storing power and then selling excess kilowatt-hours back to the grid. Massively expanded long-distance transmission networks will be needed to arbitrage supply and demand geographically. The power system has to become two-way, switched, and interoperable, less like the traditional municipal value chain and more like the Internet.
But all this requires pricing mechanisms that correctly reflect the long-term marginal costs of each service in the system: power, whether bought or sold, needs to be priced at its instantaneous wholesale price; local distribution services need to be priced on the basis of the capacity provisioned, not the kilowatt hours consumed; and so forth. With the right pricing mechanisms in place—which is not the case today—these become attractive investments. Smart homes and smart grids are worth investing in when prices fluctuate to reflect the true instantaneous cost of power. The electric car comes close to being cost-competitive when it incurs a negative cost of fuel—when the consumer can buy cheap power in the middle of the night and resell what is not used at high prices during the peak hours.
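The “negative cost of fuel” is simple arithmetic once time-of-use prices exist. A toy calculation, with entirely hypothetical prices and ignoring battery-cycling costs and round-trip charging losses:

```python
def net_fuel_cost(kwh_bought: float, kwh_driven: float,
                  night_price: float, peak_price: float) -> float:
    """Net fuel cost for an electric car under time-of-use pricing:
    charge at the cheap overnight rate, drive what you need, and
    resell the surplus at the daytime peak rate. A negative result
    means the driver is effectively paid to refuel."""
    surplus = kwh_bought - kwh_driven
    return kwh_bought * night_price - surplus * peak_price

# Hypothetical numbers: buy 50 kWh at $0.05/kWh overnight, drive on
# 20 kWh, and sell the remaining 30 kWh back at a $0.20/kWh peak price.
print(net_fuel_cost(50, 20, 0.05, 0.20))  # -3.5
```

Here the car pays its owner $3.50 for the day’s fuel—but only if prices actually fluctuate to reflect the instantaneous cost of power, which is exactly the pricing mechanism the interview says is missing today.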
So, much of power generation and storage will fragment and be done by consumers, transmission will consolidate into continentwide cooperative networks, and local distribution will remain a natural monopoly defined by cables and conduits. New business layers will emerge: unregulated markets for wholesale power, information services for managing the smart home, capacity for providing backup power, and financial instruments for shifting and mitigating the risk. These layers will have very different risk profiles and will therefore be financed in very different ways. A vertically integrated, regulated, local business will simultaneously fragment and consolidate: different layers will variously pursue decentralized innovation and customization on the one hand, and centralized efficiency and scale on the other. The industry will become layered and horizontal.
But this sounds like industrial policy rather than business strategy!
I think the boundary between the two is breaking down. When strategy is about curating the layered architecture of interoperable business activities, then public policy, such as regulatory rate setting, becomes just a special kind of strategy—not much different from the way Google curates Android’s ecosystem of device makers and application developers. Bear in mind that over the next decade or so, the transformation of the energy industry I describe warrants a half-trillion-dollar investment for a two-trillion-dollar return. Done right, this is a huge business opportunity.
Regulators and business leaders will need to master this new and common concept of strategy. But on balance, I suggest that a stacked industry will be less regulated than today’s industry is.
Is there a larger theme here?
Capitalism is in crisis. Everyone craves growth, but nobody is investing. That is not because people are timid or dumb; the returns, as they see them, simply are not there. But if and where the logic I describe does apply, then the returns are there—they are merely locked in market failures and misaligned industry structures. Whether or not that is indeed the case is an empirical, industry-specific question. Why, if electric cars have negative fuel costs, don’t more of us drive them? Why, if broadband is fundamental to economic competitiveness, is investment in fiber optic networks slowing down? Why, if we are on the cusp of a genomic revolution, do big-pharma companies talk about the collapse of R&D productivity and spend their profits acquiring one another?
The full answers to these questions are complex, of course, but the starting point is that we need to set ideology aside and look at how the industry actually works. In a world of oligopolies beset by polarizing scale and falling transaction costs, we cannot assume that an “invisible hand” will do the hard work for us. This emerging kind of capitalism is not necessarily self-correcting. Though when you think about it, if capitalism were entirely self-correcting, there would be no need for business strategy.
At a Glance
Born in Plymouth, England, in 1950
Graduated with double first-class honors in economics from Cambridge University, winning two University prizes; earned an MBA with honors from Harvard Business School; and was a Harkness Fellow in the economics department at Harvard University
Authored or coauthored many publications, including Blown to Bits—the best-selling book on technology and strategy in 2000—and four Harvard Business Review articles; “Strategy and the New Economics of Information” won a McKinsey Prize
Board member of the Oxford Internet Institute and member of the British–North American Committee