Agile Works—but Are You Measuring the Impact?

Related Expertise: Digital, Technology, and Data; People Strategy; Agile at Scale

By Matthew Aliber, Peter Hildebrandt, Mehran Islam, Andrew Jennings, Erik Lenhard, David Ritter, and Filippo Scognamiglio

Agile isn’t the new kid on the block anymore. But for most companies, an agile transformation still feels like a leap of faith. These transformations require companies to redesign organizational structures, alter operating models, build new capabilities, and change long-established ways of working.

And the expectations are sky-high—agile promises to double speed to market, boost customer satisfaction and quality, increase efficiency, and reduce costs. Evangelists speak of large organizations unlocking tens to hundreds of millions of dollars in value. Can these claims be backed up with real numbers?

The answer is yes. Generally speaking, there are two ways to measure the impact of agile: measure how agile teams deliver value and improve over time, or compare the value agile teams create with the value delivered by past waterfall work. When agile transformations are under way, we strongly encourage the former approach, since it is forward-looking and leads to actionable insights. However, companies that are just starting on an agile journey or do not yet have broad buy-in from the executive leadership team often find themselves challenged to prove that agile works in their organization. In this case, the latter approach is useful.

Here we touch on some of the ways that measuring the impact of agile can go wrong, highlight an approach that has worked well, illustrate how companies have used this approach—and showcase the astonishing results they have seen.

What Doesn’t Work

Trying to measure the results of an agile transformation can lead companies astray in many ways. Metrics used to track success in a waterfall context rarely translate to an agile environment, and some actually encourage behaviors that run counter to agile values and principles. Here are just a few approaches to avoid:

  • The Assembly Line Approach. If you try to measure the impact of agile in the same way an automaker calculates productivity on an assembly line, you’ve already lost. Agile is about encouraging experimentation, not eking out every ounce of efficiency. Focusing on unit productivity or lines of code per developer hour, for example, is a mistake. In agile, we seek to measure outcomes, not just output—and teams can pivot when they recognize work that does not contribute to outcomes and key results.

  • The Hypothetical Approach. Some companies measure agile projects against previously created plans. The problem is that these plans are hypothetical and often change during implementation, as evidenced by the usual flurry of change requests and revisions. For a fair comparison, what’s needed is a baseline of actual projects, not hypothetical plans.

  • The Sanitized Approach. If broad metrics are summarized or translated for consumption by leaders, the real story can be diluted or obscured. Without a single source of truth, trust between teams and leaders can break down. Some measures can be aggregated to help management get a big-picture view, but the data should be rolled up directly from the actual numbers, not reinterpreted.

  • The “One Metric That Matters” Approach. Focusing on a small number of metrics can be helpful, but one is seldom enough to maintain proper balance in priorities. For example, teams that solely track customer loyalty scores might ignore cost, sustainability, or other key factors.

What Does Work

Effective agile organizations establish a balanced set of measures to track the business value that teams create and the health and welfare of those teams. The teams themselves also need these measures in order to improve continuously.

Business value metrics are typically specific to a team’s work. For example, if a team is automating a business process, we’d expect it to be held accountable for reduced errors, increased throughput, or lower cost of execution—outcomes that require the process to be implemented, not just delivered.

The health and welfare of a team is typically shown by metrics that track quality (escaped defects), predictability (meeting sprint commitments), frequency of delivery (reduced lead or cycle time), and engagement of team members (through regular pulse checks).
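To make these measures concrete, here is a minimal sketch of how a team might roll up the four health-and-welfare dimensions from sprint-level data. It is written in Python; the record structure, field names, and the 0-to-1 pulse scale are illustrative assumptions, not prescriptions from the companies discussed here.

```python
from dataclasses import dataclass

@dataclass
class SprintRecord:
    committed_points: int       # story points committed at sprint planning
    completed_points: int       # story points actually delivered
    escaped_defects: int        # defects that surfaced after release
    cycle_time_days: float      # average time from start of work to delivery
    pulse_score: float          # engagement pulse check, scaled 0 to 1 (assumed)

def health_summary(sprints: list[SprintRecord]) -> dict[str, float]:
    """Roll up quality, predictability, delivery frequency, and engagement."""
    n = len(sprints)
    return {
        # Predictability: share of committed work the team actually delivered.
        "predictability": sum(s.completed_points for s in sprints)
                          / sum(s.committed_points for s in sprints),
        # Quality: escaped defects per sprint (lower is better).
        "escaped_defects_per_sprint": sum(s.escaped_defects for s in sprints) / n,
        # Frequency of delivery: average cycle time in days (lower is better).
        "avg_cycle_time_days": sum(s.cycle_time_days for s in sprints) / n,
        # Engagement: average pulse-check score (higher is better).
        "avg_pulse_score": sum(s.pulse_score for s in sprints) / n,
    }
```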

Regardless of the metrics, it’s essential to establish meaningful baselines. If you can’t determine where you started, you can’t measure how far you’ve come. Many organizations track the progress of teams only once they’ve begun sprinting. After working together for a few sprints, most teams will begin to show measurable improvements in value delivery, quality, velocity, and cycle time, all key indicators that the agile model is working.

To prove the merits of agile relative to the company’s traditional way of working, it’s generally necessary to undertake a challenging analytical process to establish the historical baseline. This requires capturing the costs and timing of past waterfall projects. But you cannot simply download data on all prior waterfall projects and use that as a baseline for comparison with agile projects. A simple data dump will yield very little of value because the comparisons that result will not be apples-to-apples.

For a meaningful comparison of past waterfall projects and agile projects, companies need an approach akin to the appraisal method for valuing real estate. Here’s how it works.

One of our clients had data on 2,000 prior waterfall projects, but not all the data points were comparable to the data from the agile projects the company wanted to assess. Using a quantitative approach, the company winnowed this list down to a much smaller set—around 100 projects—that were similar in size, scope, cost, and timeline to the agile projects in question. This is not unlike the process realtors go through when searching for “comps” in a specific city, town, or neighborhood as they prepare to put a house on the market.
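The specific screening criteria are not published here, but a minimal sketch of such a comps search might look like the following. The column names (team_size, cost, duration_months) and the 25% tolerance band are assumptions for illustration; any quantitative similarity screen over size, scope, cost, and timeline would follow the same shape.

```python
import pandas as pd

# Illustrative "comps" search: keep only waterfall projects whose profile
# falls within a tolerance band of at least one agile project on every
# comparison feature. Column names and the 25% band are assumptions, not
# details from the engagement described above.
FEATURES = ["team_size", "cost", "duration_months"]
TOLERANCE = 0.25

def find_comps(waterfall: pd.DataFrame, agile: pd.DataFrame) -> pd.DataFrame:
    def is_comp(project: pd.Series) -> bool:
        # True if some agile project matches this waterfall project
        # within +/- TOLERANCE on every feature.
        close = pd.Series(True, index=agile.index)
        for f in FEATURES:
            close &= (agile[f] - project[f]).abs() <= TOLERANCE * project[f]
        return bool(close.any())
    return waterfall[waterfall.apply(is_comp, axis=1)]

# Usage: comps = find_comps(waterfall_df, agile_df)  # ~2,000 rows in, ~100 out
```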

The next step was to enlist experts within the company to manually review these roughly 100 projects and eliminate outliers and projects with special circumstances; for example, a team might have performed poorly because the leader became ill and was out for several weeks during crunch time. Or a team might have performed heroically because it worked overtime during a holiday break. When the manual filtering process was complete, 30 to 40 projects were left as a basis for comparison. In the same way, a real estate appraiser will eliminate houses that offer an unfair basis for comparison, such as those that are located near high-voltage power lines or have fallen into serious disrepair.

Metrics such as quality, speed, and employee engagement from this refined baseline and from the agile projects could then be compared directly. However, to achieve a reliable cost comparison, further adjustments were required to account for the variances between the projects. The client factored in project complexity and the need for internal coordination, much as a real estate appraiser arrives at a property value by factoring in square footage, number of bedrooms, or lot size. The client identified and quantified key drivers of difference through multivariate regression, reviewed the results with experts to ensure that the findings were plausible, and adjusted the comparisons in the baseline accordingly. This allowed for an apples-to-apples comparison of project cost and enabled the client to quantify the productivity increase achieved through agile ways of working.
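For illustration, a regression adjustment of this general shape can be sketched in a few lines of Python with statsmodels. The column names (cost, complexity, n_departments, is_agile) and the data file are hypothetical stand-ins; the client’s actual specification and drivers are not published here.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per project (refined waterfall baseline plus agile projects), with
# hypothetical columns: cost, a complexity score, the number of departments
# involved (internal coordination), and a 0/1 flag for agile delivery.
projects = pd.read_csv("projects.csv")  # placeholder data source

# Regress cost on the drivers of difference plus the agile flag. Controlling
# for complexity and coordination plays the role of the appraiser adjusting
# for square footage or lot size.
model = smf.ols("cost ~ complexity + n_departments + is_agile",
                data=projects).fit()
print(model.summary())

# A negative, statistically significant coefficient on is_agile is the
# like-for-like cost difference attributable to agile ways of working.
agile_cost_effect = model.params["is_agile"]
```

As in the engagement described above, the fitted coefficients should be reviewed with in-house experts for plausibility before the adjusted comparison is trusted.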

How Companies Are Measuring Agile Success

Companies using this filtering method have gained valuable—and quantifiable—insights into the impact of their agile transformation. Here are just a few examples.

A 55% Acceleration in Time to Market. A top-three North American financial institution used the filtering method to quantify the speed of execution of agile projects and waterfall projects. The company first identified three phases that apply to both project types—up-front preparation until the start of development, development and testing, and final integration testing and preparation for release—and then compared each phase across the two. The agile projects achieved a 55% reduction in time to market, driven largely by a 65% decrease in up-front preparation time and a 60% decrease in development cycle time. Final integration testing and preparation for release saw a 20% reduction because there were no late-stage issues and changes.

Threefold to Fourfold Increase in Speed to Revenue. A large global wealth and asset management company dramatically accelerated time to value when launching an innovative product in the market. In the past, product launches of similar size and complexity using traditional ways of working had taken 15 to 18 months. Using agile, the company was able to go from product idea to national launch in just five months. As one top executive said, “This speed to customer value is unheard of in our organization and in the industry. We beat our top competitors—and just imagine the P&L acceleration.”

Cost Decreases of 25%. A global financial services company used this method to quantify how agile affects efficiency as measured by project costs. First, the company analyzed the drivers of project costs and found that the main driver was the number of departments involved—that is, how many groups had to work together to achieve the objective. When compared with the company’s past waterfall projects, the agile pilots, which were organized around collaborative teams empowered to make independent decisions and move quickly, had led to cost savings, but the company wanted a finer-grained view. After curating the waterfall project data, aligning and refining comparable dimensions, and adjusting for complexity, the company found that agile projects were lowering total project costs by as much as 25%.

A 50% Increase in Quality. A major North American financial institution used this method to analyze agile’s effect on the defect rate. Using a baseline of comparable projects, the company discovered that products created with waterfall had 65 defects of severity 1 or higher after release, but products created with agile had just 34. The company also found that 15% of defects in waterfall projects were the result of misunderstood requirements, compared with just 1% to 2% for agile projects.

Improved Predictability. A global media and entertainment company looked at the schedule slippage and budget overruns of both waterfall and agile teams. The average schedule slippage was 67% less for agile teams than for waterfall teams—just over three months compared with almost nine—and the agile teams consistently delivered at or under budget. By comparison, waterfall teams had an average budget overrun of 45%.

A 30% Boost in Employee Engagement. A major bank measured the improvement in employee engagement by anonymously polling teams, asking whether team members were “excited about the work” they were doing, rated on a scale from “strongly agree” to “strongly disagree.” Before the agile transformation, around 60% “strongly agreed,” but the scores increased to more than 90% after 14 weeks of working in an agile fashion. Interestingly, the results for agile projects fluctuated during that period. Scores were sky-high in the first four weeks but then dipped for several weeks to less than 50% (in other words, below the initial readings) as employees came to grips with the challenges inherent in new ways of working. Once employees found their footing and began to appreciate the benefits of agile, engagement levels shot back up, stabilizing at approximately 90%.


As more and more companies embark on agile transformations, boards, shareholders, and regulators will demand more transparency from their executive teams. They will want to know the return on investment for the transformations and to understand how ROI is measured. By adopting the filtering approach, companies can establish quantifiable proof points that illustrate the ways agile is fulfilling its promise—and pave the way to scale it across the organization.
