Many steel manufacturers haven’t yet embraced AI, but that doesn’t mean they can’t. All that’s required is an openness to experimentation and a willingness to change.
A wide range of industrial companies are creating value by integrating AI into their operations, but steel manufacturers still lag in adopting the technology. That is a missed opportunity, and one that will keep widening for the metals and mining industry. BCG GAMMA recently worked with steel companies in two key global markets and found that by connecting assets through data and by generating insights to change processing instructions (throughput speed, time in furnace, temperatures, and other factors) and raw material input mixes, companies could achieve significant gains in cost and yield.
To capture this opportunity, steel companies do not need to invent a solution on their own. Rather, they can apply lessons learned in other industries to overcome challenges in areas such as organizational culture and data. Moreover, companies need to realize that implementing AI is a journey in which capabilities and experience are built over time. Steel remains a largely analog industry, but forward-looking organizations can take steps to change that—and separate themselves from the pack.
The steel industry’s slow adoption of AI is understandable, given that some mills have been operating for a century or more, leading to deeply entrenched cultures and processes. The talent in steel companies still consists largely of equipment operators and metallurgists who develop recipes through experience and trial and error.
Data is a challenge as well. Older plant equipment may not capture the right data—or any at all. Newer machinery may not be connected in ways that enable the organization to track data accurately and consistently across an entire process. Some front-end processes, such as melting, milling, and annealing, can be analyzed and improved without connected data, but doing so for back-end finishing operations can be particularly tricky. Most capital investments have not factored in data collection, and most plants—even new builds—do not comprehensively capture strategic data or include the necessary governance to manage and deploy it effectively.
Both of these issues—culture and data—are exacerbated by the fact that steel manufacturing involves a series of complex steps that are rarely perfectly consistent. Many batches are split off into different pieces, making it hard to track the lineage of any individual part.
Yet those challenges are not insurmountable. In fact, the relative lack of progress in AI so far means that steel manufacturers have a good opportunity to start generating benefits in terms of cost and yield. All that’s required is a mindset of experimentation, a commitment to starting small, and a willingness to change.
The ultimate goal of any AI initiative is to have a largely automated and unsupervised process, with fully linked production assets and software that can autonomously adjust variables to generate a specific outcome—and then to improve performance over time. For most manufacturers, that reality remains in the distant future. Given their low level of AI maturity, companies need to walk before they can run. Based on our experience working with steel manufacturers, we believe long-term success requires a three-step transformation journey. (See Exhibit 1.)
1: Conduct Initial Pilots with Supervised AI (Months 1–6)
The first step is to develop basic AI capabilities and a better understanding of the company’s underlying process. A best practice at this stage is to focus on a specific product (or product family) with good data and sufficient sample size (run history) and track its production from end to end, rather than analyze a discrete process. In parallel, pilots in this first step will allow a company to understand how its data is currently being captured and deployed.
From there, you can run supervised AI for the designated product alongside the current metallurgy team. The goal is to look at the variables across the process to better understand which factors have the biggest impact in reaching a target outcome and how well the software can predict that outcome. If the AI software can’t accurately predict outcomes—highly likely at first—you’ll need to determine where you’re missing data and resolve those gaps. (Notably, the software required is not complex; it relies primarily on well-established statistical techniques like multiple regression.)
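As a rough illustration of what that supervised step can look like, the sketch below fits a multiple regression on synthetic process data. The variable names, units, and effect sizes are all invented for the example; standardized coefficients make the relative impact of each factor comparable, and a low R² is the signal that critical data is missing.

```python
import numpy as np

# Illustrative only: synthetic process data standing in for real mill records.
rng = np.random.default_rng(0)
n = 200
furnace_temp = rng.normal(1500, 25, n)    # degrees C (hypothetical)
time_in_furnace = rng.normal(45, 5, n)    # minutes (hypothetical)
throughput = rng.normal(120, 10, n)       # tons/hour (hypothetical)

# Assume yield is driven mostly by temperature, less by time, barely by speed.
yield_pct = (0.04 * furnace_temp + 0.2 * time_in_furnace
             - 0.01 * throughput + rng.normal(0, 1.0, n))

# Standardize inputs so coefficient magnitudes are comparable across units.
X = np.column_stack([furnace_temp, time_in_furnace, throughput])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([np.ones(n), Xz])

# Ordinary least squares: the "supervised AI" here is plain multiple regression.
coef, *_ = np.linalg.lstsq(A, yield_pct, rcond=None)

pred = A @ coef
ss_res = np.sum((yield_pct - pred) ** 2)
ss_tot = np.sum((yield_pct - yield_pct.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

for name, c in zip(["furnace_temp", "time_in_furnace", "throughput"], coef[1:]):
    print(f"{name}: standardized effect {c:+.2f}")
print(f"R^2 = {r2:.2f}")  # a low R^2 would point to missing data to chase down
```

In practice the ranking of standardized effects is what the metallurgy team reviews alongside its own experience, and gaps between predicted and actual outcomes point to where data collection needs to improve.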
In this first step, we see three common issues that steel manufacturers experience: an inability to trace parts and pieces across the entire production process, a failure to collect or store data on critical variables, and cultural resistance due to the perception that AI is cumbersome. (See Exhibit 2.)
However, all three of these challenges can be resolved with the right approach. For example, a US steel manufacturer saw a significant opportunity to improve product yields but was operating a disconnected set of assets and had never before been able to view end-to-end production process data. The company had tried to improve quality through interventions in specific processes, but its approach was slow and unresponsive because it didn’t have visibility into the broader whole. Changing a single process required several months to gather and analyze data and conduct a trial run, and even then it could only assess a small number of factors.
To improve, the company tested a new AI-based approach for several products. It started by creating a first-ever data lake to link as much end-to-end process data as possible for each production run, consolidated in a single data set for analysis. A machine-learning model evaluated parameters across end-to-end production processes and incorporated the results of previous runs so that the algorithm could improve over time. The new system was in place in less than eight weeks, and despite substantial gaps in the operational data, it generated meaningful insights for the metallurgy team.
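A minimal sketch of that kind of consolidation, assuming hypothetical per-stage tables keyed by run id (the stage and field names are invented for illustration; a real data lake involves far more plumbing):

```python
# Hypothetical record layouts; a real mill would capture many more fields.
melting = {"run-001": {"tap_temp_c": 1620, "melt_min": 52},
           "run-002": {"tap_temp_c": 1605, "melt_min": 49}}
rolling = {"run-001": {"mill_speed": 118}, "run-002": {"mill_speed": 124}}
finishing = {"run-001": {"yield_pct": 91.2}, "run-002": {"yield_pct": 93.5}}

def consolidate(*stages):
    """Join per-stage records on run id into one flat row per production run.
    Missing stage data is tolerated, which matters when older assets are
    disconnected and some processes report nothing at all."""
    runs = {}
    for stage in stages:
        for run_id, fields in stage.items():
            runs.setdefault(run_id, {"run_id": run_id}).update(fields)
    return list(runs.values())

dataset = consolidate(melting, rolling, finishing)
for row in dataset:
    print(row)
```

Each resulting row spans the end-to-end process for one run, which is the shape of data a model needs in order to relate upstream parameters to downstream yield.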
Overall, using this first model improved yield by 15% for the targeted product family while also reducing variability, leading to approximately $500,000 in annual value on a single product line after eight weeks of work and three months of validating the model results in production.
While this pilot paid for itself multiple times over, it also drove confidence and excitement across the management team and highlighted key data, infrastructure, and cultural issues that would need to be solved for broader scaling of—and savings generated by—AI.
2: Initiate the “Snowball Effect” to Increase ROI and Fund Future Investments (Months 6–24)
Once you’ve built some basic AI capabilities on a single product and learned about the challenges that must be addressed for AI to scale, the next step is to deploy multiple waves of new models, building comfort and capabilities, while investing in parallel to close IT and data gaps. Many companies make the mistake of thinking they need to close every IT gap before deploying AI across an entire process. That is not true. Companies can make faster progress by prioritizing areas that require a low investment in technology and offer potentially high returns, particularly in critical processes such as melting or milling. That kind of deliberate, sequential approach builds rapid momentum and accelerating returns on future AI initiatives, starting a snowball effect. This in turn justifies bigger, bolder investments in major IT gaps and enables higher-ROI applications. (See Exhibit 3.)
The basic methodology is similar to the first step: conduct pilots to evaluate how well AI can optimize parameters at key steps in the process, and gradually expand to higher-ROI applications. In addition, you can start to invest in tools and technologies to capture data across production assets, based on the lessons learned in the first step. The objective is to track product information in greater detail throughout a mill while automating the data capture, with the eventual goal of creating a sufficient IT and data infrastructure that can attach critical data to a single piece of steel and store it in real time.
During this process, it’s important to understand the limits of AI in terms of predictability. Some processes still won’t have sufficient data to allow for accurate decision making, such as products that haven’t run enough batches to test different parameters. For this reason, it’s important to keep running the AI alongside metallurgists and operators, with the software making recommendations and the human teams deciding whether to implement them. This parallel approach also helps companies prepare for a fully AI-augmented future, when the technology will run in a more automated manner. And it builds the comfort and partnership that data science and metallurgy teams need to achieve the ultimate goal: fully integrated AI.
3: Embed Fully Integrated AI to Support Autonomous Decision Making (Months 24 and Beyond)
The final step, and the real objective of the three-step approach, is fully integrated AI that runs autonomously within operational systems. The outcome is a factory that continuously optimizes and refines its end-to-end process steps, organically improving yields, throughput, and costs along the way. For most routine tasks, the AI can run autonomously, either making operational adjustments in-flight or prompting operators to do so by suggesting the correct running parameters.
To get to this point, you need to combine the outcomes of the previous steps: a more-nuanced understanding of which key variables drive yield and throughput, a database that collects those critical variables and attributes them to a part, and comfort from the technical team with the approach and tools. Once these elements are in place, companies will be able to use integrated AI to sense and optimize processing parameters in-flight. Over time, and as more data is collected on key processing inputs, machine-learning algorithms will build on the experience from each batch so that the system becomes smarter. Importantly, metallurgists and operators will always be required, to deal with exceptions and outlier situations where the algorithms fall short (for example, if a machine goes down).
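To make the division of labor concrete, here is a deliberately simplified sketch of an in-flight adjustment loop with a human-escalation path. The model, limits, and parameter names are assumptions for illustration, not a description of any real system:

```python
# Illustrative sketch: automatic setpoint nudging with operator escalation.
TARGET_YIELD = 92.0
TEMP_LIMITS = (1480, 1560)  # outside this band, hand control to an operator

def predict_yield(furnace_temp_c):
    # Stand-in for a trained model: in this toy example, yield rises with temperature.
    return 80.0 + 0.008 * furnace_temp_c

def adjust(furnace_temp_c):
    """Nudge the setpoint toward the target yield; escalate on exceptions,
    since outlier situations remain the job of metallurgists and operators."""
    if not TEMP_LIMITS[0] <= furnace_temp_c <= TEMP_LIMITS[1]:
        return furnace_temp_c, "escalate_to_operator"
    error = TARGET_YIELD - predict_yield(furnace_temp_c)
    step = max(min(error / 0.008, 5.0), -5.0)  # capped correction per cycle
    return furnace_temp_c + step, "auto_adjusted"

print(adjust(1510))   # in range: capped automatic nudge toward target
print(adjust(1600))   # out of range: routed to the human team
```

The capped step and the hard escalation band reflect the point above: the system gets smarter batch by batch, but people stay in the loop for the exceptions where the algorithms fall short.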
BCG has developed a solution called PHOSA that is already delivering this level of autonomous AI in other process industries. The software gets embedded into supervisory control and data acquisition (SCADA) systems, allowing companies to precisely control the production process in response to the material coming in and take specific steps to optimize it in real time. PHOSA has already delivered sizable improvements in a range of companies and has a strong track record of enabling operators with embedded AI to optimize process outcomes.
AI is advancing rapidly, but most steel manufacturers have yet to take advantage of it. Implementing AI requires overcoming some clear challenges in terms of data and organizational culture, but it does not require starting from scratch. The investment needed to launch pilots and to begin developing capabilities is relatively low—and it can often be recovered through yield increases. AI is a readily available tool. Steel companies need to start using it.