
By Massimo Russo, Anant Thaker, and Suhare Adam

Quantum computing is not a replacement for the binary classical computing that has become a staple of modern life. But to paraphrase Nobel laureate Richard Feynman, because quantum computers use quantum physics to emulate the physical world, they can solve problems that today’s computers will never have the power to tackle.

Not everybody needs this capability, but the use of quantum computers to model physical systems has immediate applications in industries such as pharmaceuticals, chemicals, and energy. Algorithms using quantum math can unlock value by vastly speeding up data-intensive applications in such fields as search, cryptography, and machine learning. In the future, hybrid systems consisting of classical computers that call on their quantum cousins for assistance will solve problems that are intractable today.

There’s a long way to go: a quantum simulation needs about 150 logical qubits, each of which consists of anywhere from ten to thousands of “physical” qubits, which are required for error correction and stability. As John Preskill of the California Institute of Technology pointed out earlier this year, today’s quantum processors use noisy physical qubits with limited capability and a penchant for errors. They should be regarded as a stepping stone to more capable technologies; their primary near-term value is to provide a platform for the development of quantum algorithms and applications that will be useful over the long term.^{1} More comprehensive metrics to measure quantum processing’s progress are also needed. One example is IBM’s “quantum volume,” which provides a good assessment of a quantum computer’s processing capability until such systems grow to demonstrate multiple logical qubits.

Major players are on the case. IBM recently announced a 20-qubit quantum processor and a simulator that can emulate up to 49 qubits, only to be outdone by Google a few months later with its Bristlecone chip, a 72-qubit processor. Other big tech companies and research institutions, including Intel, Microsoft, MIT, Yale, and Oxford, are active in the field.

Here we offer a guide for how business executives can think about quantum computing and its applications. We explore a likely timetable for development, several high-potential early applications, the current state of the technology and potential models for adoption, and the steps companies can take now to prepare for the advent of quantum computing.

Notes:

1. John Preskill, “Quantum Computing in the NISQ era and beyond,” January 27, 2018.

There are two primary prerequisites to practical business applications for quantum computing: processors with enough qubits to run quantum simulations and quantum algorithms that solve the mathematical problem underlying the application. Several such algorithms, in fields such as cryptography and machine learning, already exist. The processors are under active development, and announcements of increasingly capable processors come at an accelerating pace.

**Significant Speed Advantage.** Because classical computers work sequentially, they are impractical for tackling very large or complex problems. For example, there is no known efficient classical algorithm for factoring large numbers into their primes; computers simply have to guess through trial and error, and the number of tries grows exponentially with the number of digits. Quantum computers, in contrast, attack problems concurrently; in effect, they consider all possible solutions at once and discard the ones that don’t work. For certain problems, the solution run time for a quantum processor grows polynomially rather than exponentially with problem size, which creates an enormous speed advantage. Prime factorization of a large number is one such exponentially hard problem that can be solved in practical amounts of time using a quantum approach (and a specific quantum algorithm known as Shor’s algorithm).
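The exponential cost of classical trial-and-error factoring can be sketched in a few lines of Python (an illustrative toy, not a production factoring routine):

```python
import math

def trial_division(n):
    """Classical factoring by trial and error: test divisors up to sqrt(n)."""
    tries = 0
    d = 2
    while d * d <= n:
        tries += 1
        if n % d == 0:
            return d, n // d, tries  # found a factor pair
        d += 1
    return n, 1, tries  # n is prime

# The work scales like sqrt(n), i.e. roughly 10^(digits/2) divisions --
# adding two digits multiplies the effort by about ten.
for digits in (6, 10, 14):
    print(f"{digits}-digit number: ~{int(math.sqrt(10**digits)):,} trial divisions")
```

Shor’s algorithm, by contrast, runs in time polynomial in the number of digits, which is what makes factoring tractable on a sufficiently large quantum machine.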

One application for this significant speed advantage with substantial near-term market potential is in pharmaceutical and chemical R&D: simulating the interactions among molecules, a problem whose solution complexity grows exponentially with molecule size, much as factoring does with the number of digits. Consistent with Richard Feynman’s vision, quantum processors can consider all possible interactions at once and arrive at a molecule’s lowest energy state, which determines how molecules interact. We estimate that quantum simulations could represent an addressable market in pharmaceuticals of up to $20 billion by 2030, with another $7 billion coming from chemicals and other materials science–intensive industries.

**Moderate Speed Advantage.** The time it takes to solve problems involving unstructured search, including many critical to machine learning applications, grows linearly with the size of the search space—which itself can grow exponentially with the number of variables. Quantum algorithms such as Grover’s algorithm promise a moderate speed advantage for unstructured search: run time proportional to the square root of the problem size rather than to the problem size itself. Today, large-scale search and machine learning problems are addressed with massive arrays of parallel, specialized graphics processing units, or GPUs, produced by companies such as Nvidia. We expect a market of more than $20 billion in search and machine learning applications to develop as quantum computing methods displace GPU-based platforms. This potential is likely behind Google’s and IBM’s interest in search-optimizing quantum computing platforms.

**Uncertain Speed Advantage.** Classical computing approaches today adequately address problems involving complex operations or networks—for example, route optimization in transportation and logistics. Quantum computing methods could offer a speed advantage beyond a certain threshold of problem size, but the companies we talked with in our research consider current computing methods to be sufficient. It is not clear today if quantum computing could unlock significant new value in the future.

The technological challenge, in a nutshell, is this: solving specific problems using quantum algorithms requires sufficient scale of quantum computational power. This is represented by the number of logical qubits (which are loosely equivalent to the number of bits and memory in a traditional processor) and the much higher number of physical qubits needed for error correction (more on this below).

On the basis of these assumptions and the current starting points, we expect the quantum computing market to evolve over three generations. During the first, which covers 2018 to 2028, engineers will develop nonuniversal quantum computers designed for applications such as low-complexity simulations. Much of the development of these computers will take place in the next few years, and they will be in use until the second generation arrives.

The second generation (2028–2039) will be the period in which quantum computers scale up to 50 logical qubits and achieve “quantum supremacy” over classical computing—meaning that they will be able to perform certain algorithms faster in specific applications. This second generation of quantum computing will focus on such problems as molecular simulation, R&D, and software development. During this period, usable applications will come to market, creating significant value. At the same time, quantum information processing will develop further as a field, and companies will become more familiar with quantum simulation methods.

In the third generation (2031–2042), quantum computers will achieve the scale to perform advanced commercial applications in simulation, search, and optimization with significant advantages over classical methods. Because classical computing continues to advance along Moore’s law, and because the threshold at which quantum computing overtakes binary computing varies by application, there is considerable overlap between the second and third generations. As a general trajectory, we expect a decade of steady progress in quantum computing followed by a significant acceleration after about 2030.

While the biggest potential for quantum computing is more than a decade away, business leaders should monitor the first generation of development—particularly the next few years. During this time, we expect companies in industries such as chemicals to experiment with limited quantum computing applications in the modeling of relatively simple molecules and in specialized optimizations. These companies will familiarize themselves with quantum computing methods and tools through hands-on use. IBM and Microsoft are both developing quantum computing communities, quantum computing simulators, and easy-to-use tools that expose developers to quantum processing techniques. As quantum algorithms, programming languages, and quantum-processors-as-a-cloud services become available, developers will gradually incorporate them in software solutions. Hybrid computing systems combining classical and quantum approaches will emerge. This period of learning will be critical to increasing awareness and experience so that once quantum supremacy is achieved in certain fields, adoption can proceed quickly.

In the US pharmaceutical sector alone, if quantum simulations of complex atomic processes were available today, and 10% of companies were willing to pay for the capability, quantum computing would represent a $15 billion to $30 billion addressable market opportunity. Currently, the market for all high-performance computing globally is $10 billion.

There are other practical applications. Quantum computing can be applied to accelerate search and machine learning algorithms used with exponentially growing datasets. This will become increasingly important to unlocking the value of data, as the tens of billions of devices in the Internet of Things, for example, drive the volume of available data into the stratosphere.

For some classes of problems, finding a solution requires trial and error—testing potential solutions one by one—while verifying a candidate is easy. Imagine an archipelago of thousands of islands connected by bridges and the need to find a path that crosses each island exactly once. The number of possible paths rises exponentially with the number of islands, but checking that a given path satisfies the single-visit constraint is straightforward. If our hypothetical island puzzle had 1 million possible solutions, a binary computer would require an average of 500,000 tries to find the right one. A quantum processor running Grover’s algorithm could solve the problem in roughly 1,000 attempts—500 times faster.
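The island-puzzle arithmetic can be checked directly: classical unstructured search examines about half the candidates on average, while Grover’s algorithm needs on the order of the square root of the search-space size.

```python
import math

N = 1_000_000                    # possible paths in the island puzzle
classical_avg = N // 2           # classical search checks ~N/2 candidates on average
grover_queries = math.isqrt(N)   # Grover's algorithm needs on the order of sqrt(N) queries

print(classical_avg, grover_queries, classical_avg // grover_queries)
# 500000 1000 500
```

Because the advantage is quadratic rather than exponential, it compounds: a search space of a trillion candidates would take a classical machine 500 billion tries on average but Grover’s algorithm on the order of a million.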

Quantum computing’s power comes from the fact that it is a fundamentally distinct technology from the binary, Boolean logic–based computing we are all used to. There are three essential differences. The first has to do with the bits. Binary computers use binary bits: everything is based on 1s and 0s—or, as some like to think about it, on or off. Picture a light switch, which has only two positions. Qubits, on the other hand, can inhabit states of 1 and 0, or on and off, at the same time. A qubit is less a light switch than a dimmer with a theoretically infinite number of settings. Qubits are about probabilities rather than black-or-white certainties, and this is simultaneously a big enabler and a substantial problem (more on this below).

The second difference is that binary computers keep all those 1s and 0s separate; they have to in order to run their calculations. Quantum computing works on the purposeful entanglement of qubits; by manipulating one, you simultaneously manipulate all of its entangled mates. Adjusting one light dimmer affects all the others in the room—and all the others in the house. This is what gives quantum computing its calculating prowess.

The third difference lies in the way that quantum computers do their work. While binary computers conduct massive numbers of arithmetic calculations sequentially, a quantum computer calculates all possible outcomes concurrently and settles on a likely correct answer through interference: the amplitudes of right answers reinforce one another constructively, while destructive interference “cancels out” the wrong ones. In the example of the bridges connecting the islands, a quantum computer would simultaneously consider all potential routes and settle on one that is probabilistically “correct.”
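These three ideas—superposition, entanglement, and probabilistic outcomes—can be made concrete with a toy two-qubit statevector simulation in plain Python (an illustrative sketch of the math; real quantum hardware does not multiply arrays, but its behavior is described by exactly this arithmetic):

```python
import math

# Two-qubit state as four amplitudes for |00>, |01>, |10>, |11>.
state = [1.0, 0.0, 0.0, 0.0]  # start deterministically in |00>

def hadamard_on_qubit0(s):
    """Put the first qubit into an equal superposition of 0 and 1 (the 'dimmer')."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """Flip the second qubit when the first is 1 -- this entangles the pair."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_on_qubit0(state))
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]
```

The result is a Bell state: measurement yields 00 or 11, each with probability one-half, and never 01 or 10—adjusting one “dimmer” has fixed the other, which is the entanglement the article describes.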

From a practical engineering standpoint, quantum computers have real constraints. Quantum circuits work best at very low temperatures—near absolute zero. Quantum states are highly unstable; any outside influence increases the chance of error, which is why the circuits must be supercooled and isolated. Qubit stability, or coherence, and error correction are major issues—indeed, as machines get big enough to do useful simulations, the ratio of physical qubits (required for control and correction) to the logical qubits doing the actual work can be as high as three thousand to one. For these reasons, quantum computers require significant surrounding infrastructure and resemble old-style mainframes in large, climate-controlled data centers (just a lot colder!) far more than they do today’s laptops or smartphones.
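Combining the roughly 150 logical qubits a useful quantum simulation needs with this error-correction overhead gives a sense of the hardware gap (simple arithmetic on the figures cited in this article):

```python
logical_needed = 150              # logical qubits for a useful quantum simulation
ratio_low, ratio_high = 10, 3000  # physical qubits per logical qubit (error correction)

low = logical_needed * ratio_low
high = logical_needed * ratio_high
print(f"{low:,} to {high:,} physical qubits")
# 1,500 to 450,000 physical qubits
```

Against today’s largest announced processor of 72 physical qubits, the upper end of that range implies several more orders of magnitude of scaling before commercial-grade simulation arrives.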

Today’s quantum computers are in the very early stages of invention, not unlike classical computing in the late 1940s and early 1950s, when John Bardeen, Walter Brattain, and William Shockley of Bell Labs invented the solid-state transistor that replaced the vacuum tubes powering the earliest computers and set the tech industry off on the pursuit of ever-more-minute and powerful processors that continues to this day.

We expect the market for quantum computing to be distributed across a technology stack with multiple layers—but we don’t expect to see the hardware-to-software value migration that classical computing has experienced.

The hardware layer will include components and qubit arrays using potentially competing technologies (such as trapped ion and superconductor). Intermediary layers will include compilers and, critically, quantum error correction, and an input-output information processing system for stable computing and inferring quantum computing results.

The software layer will include software development kits and application programming interfaces (APIs) to provide quantum processing functionality. We expect software solutions (chemical process modeling, for example) to be a hybrid of quantum and classical computing solutions, with quantum computing services used in a manner similar to that of specialized coprocessors today, which address specific computing loads.

In classical computing, value migrated from hardware to modular components and software as processing power grew, devices shrank, and the cost of critical components such as memory plummeted. We believe quantum computing is less susceptible to modularization because of the narrow computing applications, the hardware’s need for cooling and isolation, and the highly integrated architecture that is necessary for error correction and control. Most companies will likely access quantum capabilities through a cloud-based infrastructure-as-a-service model and APIs specific to quantum algorithms.

Multiple business models will emerge in quantum computing, each involving different layers and combinations of the stack. The main models will include the following:

**Integrated Hardware as a Service.** System manufacturers offer computing capacity through infrastructure as a service, accessed through APIs and quantum computing programming interfaces. IBM and Microsoft are currently pursuing this model.

**Hardware Unit Sales and Service.** Manufacturers sell fully integrated systems to major end users (such as government agencies and large pharmaceutical companies), potentially coupled with ongoing managed-services contracts. D-Wave is pursuing this model.

**Hardware-Agnostic Software Applications.** Developers create applications for end users (pharma and materials science companies, for example), and users access hardware through hardware-as-a-service providers.

**High-Value Components and Services.** Manufacturers offer critical components that enable the physical environment for, and control of, quantum computers, possibly to support multiple qubit technologies.

Quantum computing won’t be for everybody. But if your company is in a data-intensive field or an industry in which the ability to run simulations of complex real-world functions and interactions in a practical amount of time advances R&D, you’ll want to start engaging with this advanced technology. Already, BASF, VW, Airbus, and other companies are investing in building quantum computing capabilities. A good first step is to launch an initiative to build an understanding of quantum algorithms and gain experience using quantum computing platforms and tools provided by IBM, Microsoft, and others. Emerging software development and consulting companies such as QxBranch, QC Ware, and 1Qbit are working in multiple industries to develop quantum applications. Companies may also want to consider sponsoring academic research in quantum applications. IBM, for example, is working with MIT on an initiative on AI and quantum computing.

Pharmaceutical companies and others dependent on materials science innovation should begin to explore molecule simulation using quantum processors. (IBM has accurately modeled the largest molecule to date, beryllium hydride, or BeH2, using a scalable method on a quantum computer.) They should also challenge their R&D organizations to follow quantum computing breakthroughs, especially as they accelerate. Companies leveraging search, neural networks, and optimization algorithms should encourage their data scientists to learn quantum algorithms and approaches and to study how quantum processors could significantly accelerate their capabilities. As with other advanced technologies, such as AI and machine learning, the companies that position themselves to take advantage of quantum computing early will establish a significant advantage.

One note of caution: quantum computing has potentially significant implications for cryptography and encryption. Because current encryption methods often rely on the prime factorization of large numbers, quantum computing’s ability to factor these numbers within practical time frames is a potential (if long-term) threat to keeping messages secure. While the number of logical qubits required (more than 1,000) suggests that quantum encryption-cracking computers will not be practical before about 2040, companies should watch emerging quantum-proof encryption methods and be ready to shift away from a dependence on prime factorization methods, especially for critical applications. Already countries such as China and the US are investing heavily in quantum research for secure communication, with China launching the first satellite dedicated to implementing a quantum communication channel.

Quantum computing is moving quickly from research labs to real-world applications. It has the potential to unlock significant value for companies in the next decade. Executives need to start watching now for key milestones indicating that quantum computers are approaching supremacy, and companies that want to capitalize need to start building internal capabilities to take full advantage of the strange but awesome computing powers quantum processors can provide.
