A collaborative approach to scenario modeling across the ecosystem can improve investment decisions, lower costs, and smooth the journey to net zero.
The challenges of network planning are intensifying for energy distribution networks worldwide, as various trends—including the rise in the number of electric vehicles, the growth in behind-the-meter generation, and the increase in demand from new energy-hungry industries—create big shifts in supply and demand as well as a whole new level of complexity.
These challenges are compounded by the need for distribution networks and other ecosystem players, such as energy retailers, small-scale renewables developers, and EV charging networks, to continue making major decisions about asset replacement and investment. Those decisions will, in turn, further shape the patterns of supply and demand on the distribution network.
Clearly, the planning efforts of all players in the distribution ecosystem would benefit from better coordination and transparency. But the traditional planning method—energy networks forecasting demand in an area and building capacity after coordinating with local planners and developers—is not suitable for this challenge. What’s needed is next-generation planning: an ecosystem-wide collaborative planning effort that facilitates scenario-based decision making that considers the implications for low-voltage networks.
Next-generation network planning leverages advanced analytics and modular technology platforms to facilitate better multiparty collaboration and scenario-based decision making. This planning involves three successive sets of analytical activity: visualizing the current state, simulating future scenarios, and optimizing investment and ecosystem decisions.
To achieve the required business outcome, the technology must be designed for and integrated into the way people work from day one. It’s critical to consider upfront how human decision makers will interact with the algorithm and supplement it with their expertise. In our experience, a customized platform-based solution with a transparent built-for-purpose algorithm facilitates the best outcomes.
Although there are no off-the-shelf solutions on the market yet, the technology required to implement a bespoke solution is mature. Cloud-based infrastructure means computational capacity is within easy reach; AI, simulation, and optimization algorithms are significantly faster and more flexible than in the past decade; and there has been considerable development in easily implemented human interfaces.
So, the primary focus should not be on selecting technology but on integrating the technology and algorithms with human decision making. The most advanced energy networks are using a bionic approach, with a cross-functional team to perform each set of analytical activity and to configure the output to facilitate business engagement and adoption along the way.
Energy networks should begin by creating a visualization that shows a network’s current economics, performance, and constraints in one place. When creating this visualization, it’s important to consider what types of decisions are being made, what is being solved, and the types of data that are needed.
Rather than trying to build a visualization that addresses every possible business problem, companies should start with a few of the most important use cases. That way, it will be possible to create value with the first minimum viable product.
The use cases will differ depending on the business context of a particular network. Some networks may want to focus first on significantly accelerating the approval process for new customers so their service can be connected more quickly. Others may look to identify the priority feeders to replace with standalone power systems. And still others may want to develop a platform for the network-retailer-developer ecosystem to coinvest in community batteries, EV chargers, or microgrid developments.
The ideation team that is assigned to work on the visualization solution should be a cross-functional group, with representatives from asset management, the control room, network operations, distributed energy resources management, the new-connections team, and finance and regulatory departments. The ideation team should also include relevant external stakeholders, such as energy retailers or renewables developers, depending on the use case.
The ideation team should brainstorm to identify what matters most for the business outcome, the decisions that could be made differently, the key variables and constraints, and, therefore, what could be valuable to visualize.
A cross-functional product team should then develop digital tools that enable users to interact with the data. Among other things, this work identifies what is required to give users confidence in the validity of the analytics, and it provides the foundation for the technical design of the visualization and user interface prototypes.
It is also at this point that the hunt for data should begin. Companies often wait to start developing a visualization until the data is perfect, but there is more value in understanding the implications of data quality issues and incorporating this knowledge into the design and use of the analytics.
To turn the visualization into a solution that can be used in the future, networks need to understand what their physical constraints and their economics will be at a granular level in a variety of future situations. We recommend simulating multiple scenarios using a variety of factors (such as growth in demand, an uptick in rooftop solar photovoltaic panels and storage devices, and changes in EV penetration and users’ charging behaviors). Physical simulations of a network provide a sense of the technical constraints, while economic simulations provide insight into the network’s economics and the implications for broader societal outcomes.
These kinds of simulations require greater sophistication from both the technology and the people using it. Digital twin simulation technology is key: it enables business users to intuitively build a digital representation of the network and interact with the scenario models. A scenario model and its interface should be flexible enough for users to adapt the constraints to changing technologies and regulations. At this stage, a scenario model does not include algorithms that can prevent or minimize breaches of these constraints; any violations are therefore highlighted rather than resolved. This combination of flexibility and visibility facilitates iterative improvement of a scenario model.
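The highlight-rather-than-resolve behavior can be sketched in a few lines of Python. Everything here is an invented assumption for illustration: the feeder data, the load multipliers, and the scenario parameters are placeholders, not engineering values.

```python
from dataclasses import dataclass

# Hypothetical feeder data for illustration only.
@dataclass
class Feeder:
    name: str
    capacity_mva: float   # thermal limit
    base_load_mva: float  # today's peak load

def simulate_peak_load(feeder, ev_uptake, solar_uptake):
    """Crude peak-load model: EVs add load, rooftop solar offsets some of it.

    The 1.5 and 0.4 multipliers are illustrative assumptions, not
    engineering constants.
    """
    return feeder.base_load_mva * (1 + 1.5 * ev_uptake - 0.4 * solar_uptake)

def flag_violations(feeders, scenarios):
    """Highlight, rather than resolve, any capacity breaches per scenario."""
    report = []
    for label, ev, solar in scenarios:
        for f in feeders:
            load = simulate_peak_load(f, ev, solar)
            if load > f.capacity_mva:
                report.append((label, f.name, round(load, 2)))
    return report

feeders = [Feeder("F1", capacity_mva=10.0, base_load_mva=8.0),
           Feeder("F2", capacity_mva=5.0, base_load_mva=3.0)]
scenarios = [("low EV", 0.1, 0.2), ("high EV", 0.5, 0.2)]
print(flag_violations(feeders, scenarios))
```

Because the model only reports breaches, planners remain free to decide whether a violation should be met with reinforcement, tariffs, or a revised constraint, which is what makes the iterative refinement described above possible.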
From a product development perspective, the focus needs to shift from engineering precision to what matters for decision making. It’s important to keep in mind that the most detailed level of information should not automatically be used. The key is to identify the minimum level of detail that will enable decisions to be made differently and with sufficient confidence.
It’s also important for a scenario model to be able to quantify the impact (for example, the risk) of various constraints and other changes consistently across a broad variety of scenarios. That makes it possible to identify the chief ways to influence the outcomes and choose the ways worth investing in. By ensuring scenario management and governance, the analysis behind investment decisions can be repeated and audited.
To enable multiple players in the ecosystem to participate in scenario modeling, the digital twin should be built on a platform that facilitates collaboration and that has specific interfaces customized for different users. Given that the platform will be used by external ecosystem players, it’s extremely important to ensure that it is cyber resilient.
After a variety of future scenarios have been simulated, it may be beneficial to add an optimization engine to the scenario models to enhance the human decision making. The beauty of adding a mathematical optimization engine is that it can compare multiple combinations of the various pricing and tariff levers that are available to the network and the broader ecosystem. By bringing the best potential solutions to the attention of the human decision makers, an optimization engine helps them select and iteratively refine the most-suitable solutions.
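A minimal white-box sketch of this idea: enumerate combinations of a few hypothetical pricing and investment levers, score each with a transparent cost function, and surface the cheapest candidates for human review. All lever names, values, and cost coefficients are assumptions made up for this sketch.

```python
from itertools import product

# Illustrative levers; the values and units are invented assumptions.
TARIFF_LEVERS = {
    "ev_offpeak_discount": [0.0, 0.2, 0.4],   # share of EV charging shifted off-peak
    "solar_export_tariff": [0.0, 0.05, 0.10], # $/kWh paid for exports
    "network_augmentation_mva": [0, 2, 5],    # capacity added
}

def score(combo, base_peak_mva=12.0, capacity_mva=10.0):
    """White-box objective: augmentation cost plus a penalty for any
    remaining peak above capacity. All coefficients are assumptions."""
    peak = base_peak_mva * (1 - 0.3 * combo["ev_offpeak_discount"])
    capacity = capacity_mva + combo["network_augmentation_mva"]
    capex = 1.0 * combo["network_augmentation_mva"]
    tariff_cost = 50 * combo["solar_export_tariff"]
    overload_penalty = 10.0 * max(0.0, peak - capacity)
    return capex + tariff_cost + overload_penalty

def top_candidates(n=3):
    keys = list(TARIFF_LEVERS)
    combos = [dict(zip(keys, values)) for values in product(*TARIFF_LEVERS.values())]
    return sorted(combos, key=score)[:n]

for combo in top_candidates():
    print(round(score(combo), 2), combo)
```

Note that this toy example surfaces several equally cheap candidates; that is the point of the approach. The engine narrows the field, and human decision makers differentiate between the finalists using judgment and criteria that sit outside the model.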
Networks (and investors) will need to shift their focus. Increasing the regulated asset base can no longer be depended on to generate returns. Companies need to widen their lens, optimizing investments across the ecosystem to achieve good outcomes for both themselves and society at large.
We recommend that networks deploy a white-box optimization engine, which makes outputs transparent to ecosystem players. A white-box engine also makes it possible to evaluate the potential design tradeoffs of various scenarios. The amount of detail, particularly for constraints, that’s built into a scenario model can be weighed against the amount of time it takes to generate a scenario.
It’s also important for users to be able to modify the objective function, or goal. That way, users can compare optimization solutions that each focus on one of three goals: the lowest cost to serve, the greatest level of distributed energy resources that the grid can integrate, or the fewest interruptions for each customer. These solutions can then be compared with options that weight all three outcomes equally.
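A modifiable objective function of this kind might look like the following sketch, which ranks three hypothetical candidate plans under each single-goal focus and under equal weighting. The plans, their metrics, and the normalization constants are all invented for illustration.

```python
# Hypothetical candidate plans scored on the three goals from the text.
# Lower is better for cost and interruptions; higher is better for the
# share of distributed energy resources (DER) the grid can host.
PLANS = {
    "A": {"cost": 100, "der_hosting": 0.6, "interruptions": 1.2},
    "B": {"cost": 130, "der_hosting": 0.9, "interruptions": 0.8},
    "C": {"cost": 105, "der_hosting": 0.8, "interruptions": 0.85},
}

def objective(plan, w_cost=0.0, w_der=0.0, w_rel=0.0):
    """Weighted objective (lower is better). The /100 normalization is an
    assumption chosen so the three terms are roughly comparable."""
    return (w_cost * plan["cost"] / 100
            - w_der * plan["der_hosting"]
            + w_rel * plan["interruptions"])

def best_plan(**weights):
    return min(PLANS, key=lambda name: objective(PLANS[name], **weights))

print(best_plan(w_cost=1.0))                        # least-cost focus
print(best_plan(w_der=1.0))                         # DER-hosting focus
print(best_plan(w_rel=1.0))                         # reliability focus
print(best_plan(w_cost=1/3, w_der=1/3, w_rel=1/3))  # equal weighting
```

In this toy setup the equal-weighting run selects a compromise plan that no single-goal run would choose, which is exactly the kind of tradeoff an adjustable objective function makes visible to users.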
Collaborative ways of working are as important in investment optimization as they are in scenario modeling. The insights generated by the analytics should be used to support decision making by the network, its customers, energy retailers, developers, and other ecosystem participants. This approach opens the door to innovative solutions and helps build a shared vision among stakeholders.
Developing a bionic approach to next-generation network planning is not a one-off exercise, nor should it be a long journey before a network sees results. Networks that deploy an agile approach, follow the principles of lean product development, and unlock value quickly have an opportunity to take a leading role in shaping the future of the distribution grid. But only by harnessing the power of data and technology in combination with collaboration can they make this a reality. This approach will promote better outcomes—cleaner, reliable energy at a lower cost for customers; better investment decisions for the networks and other ecosystem players; and a more coordinated energy transition for all.
The authors thank their former colleague Nasim Pour for her contributions to this article.