
Right now, demand for the advanced chips needed to power AI is skyrocketing. In response, many leading technology companies are unveiling new models of AI chips.

This month, Intel, Meta, and Google have all announced their latest AI chips.

And Nvidia showcased its new AI chip at its annual developer conference in March, promising super-fast processing speeds. The semiconductor company is a leading manufacturer of the graphics processing unit (GPU) chips needed for large language models, and this year it became the third-largest public company in the world by market capitalization.

The So What

“Chips are the building block of a new type of economy. No matter what industry you are in—consumer goods, health care, shipping—the business processes are becoming AI-enabled,” says BCG Managing Director and Partner Suchi Srinivasan, an expert in AI in the enterprise.

“Everyone needs a supply of AI chips. And the opportunity to supply them is vastly larger than any one company can realistically fulfill. So there’s plenty of room for new players, and the ecosystem is evolving rapidly.”

Many large technology players and cloud providers have already locked in supply from chip providers. They are also ramping up in-house efforts to create their own chips, giving them more control over the value chain and more room to innovate.

  • Working from standardized chips makes it harder to differentiate.
  • Relying on a single source for AI chips leaves tech players exposed on price and supply.

The innovation cycle around AI chips is only just beginning, and GPUs are just the start, Srinivasan says. As AI workloads mature, more specialized, and therefore more efficient, architectures will emerge to address new workloads.

And the pressure on cost will increase further as enterprises move from model development to widespread deployment, increasing the need for alternative supplies to help keep prices in check.

Now What

These are some of the factors to consider about future market dynamics for AI chips.

In the short term:

Many of the big technology players have already pre-bought a supply of AI chips for the next few years. And chips are becoming more efficient, meaning that fewer are needed.

But there is also likely to be a mismatch between supply and demand. Some software companies have pre-ordered hardware before fully developing the applications to utilize it, while others may find themselves unable to secure the GPUs they need if their applications scale and require more chips than expected.

Pricing will be inflexible while the choice of suppliers is limited. In time, new pricing structures and licenses are likely to emerge, as buyers move away from legacy token-based systems that can be difficult to navigate.

In the longer term:

Many countries, including the US, the EU nations, and India, are investing heavily in new semiconductor facilities to build the wafers that form the base for advanced chips.

However, those facilities will take many years to come onstream.

And the rapidly changing market for AI presents new opportunities but could also create bottlenecks:

  • Advancing Chip Technologies. These include advanced packaging techniques that stack multiple chips on top of each other to accelerate processing, as well as high-bandwidth memory that allows for faster transfer of data within chips.
  • Deployment Challenges. Choke points could also emerge in chip deployment rather than creation; for example, the pace of building the new data centers needed to deliver the computing power and data storage for AI applications. Availability of power will also be key to the expansion of those data centers.
  • Shifting Requirements of AI Builds. As the rollout of AI use cases continues, demand will shift from the AI chips used to train LLMs on data to those needed for inference, that is, making predictions based on live data.

“The market is hungry for viable and competitive options to supply AI chips, but it remains to be seen how fast alternative sources of supply will be able to offer the desired price, sophistication, and scale,” says Srinivasan.

“However, enterprises can’t afford to wait for greater certainty about the chips market before experimenting with AI. Companies need to jump in and start iterating now in order not to be left behind in the race to create value.”