How Does One Measure Agile-Lean Product Development Progress or Results?

Summary:

Every project measures progress using metrics (or at least should). Agile is no exception, and powerful techniques exist within agile to track and measure your project’s progress. However, agile goes a step further by regularly using metrics to adapt and improve, always asking how we can be better today.

“If you do not change direction, you may end up where you are heading.”

-  Lao-Tzu

Every project measures progress using metrics (or at least should). Agile is no exception, and powerful techniques exist within agile to track and measure your project’s progress. However, agile goes a step further by regularly using metrics to adapt and improve, always asking how we can be better today. An understanding of how to gather relevant information and interpret that information becomes essential for agile continuous improvement.

Knowledge Work and Knowledge Workers
More than fifty years ago, Peter Drucker, in his 1959 book Landmarks of Tomorrow, outlined the challenges he saw for future managers and executives. He concluded that learning how to manage “knowledge work” and “knowledge workers” would be the key management challenge of the next century. He described knowledge work as work that is done in the worker’s head rather than with the hands, and he concluded that knowledge work would be the most critical and highest-valued form of labor. How prophetic he was.

Agile product development is knowledge work. Jurgen Appelo, in his book Management 3.0: Leading Agile Developers, Developing Agile Leaders (Appelo, 2011), identifies developers, designers, architects, business analysts, testers, and all other types of Agile product (system-software) creators as knowledge workers.

Along the same lines, Mary and Tom Poppendieck, in their book Leading Lean Software Development: Results Are Not the Point (Poppendieck, 2009), insightfully tell us: “In knowledge work, success comes entirely from people and the system within which they work. Results are not the point. Developing the people and the system so that together they are capable of achieving successful results is the point.”

Since Agile product development is knowledge work done by knowledge workers, how does one measure the performance of agile product development teams?

The Measure of a Man (or Woman)
The distinction between measurements and metrics seems straightforward but deserves some discussion. These terms are often used interchangeably, yet they have very different applications. Generally, a measure is an operation for assigning a number to something. A metric is our interpretation of the assigned number.

For example, when I was a child my mom would have me take off my shoes and stand with my back flat against the back side of an open door, head level and eyes looking straight ahead. She would use a ruler to mark my current height on the doorframe with my initials and the current date. We would then interpret the number at that mark as my height. Height in this case is a metric. We did this periodically, and I could compare one measurement and metric to another and see just how much taller I was getting. We did the same with my siblings and compared our heights to one another. Oh, the simple little things that used to bring us fun and pleasure as children. I went on to apply this measure and metric with my children and now with my grandchildren.

So what are good measures and metrics to use to evaluate your adoption of Agile product development?  How can you measure and compare “how much taller” you’re getting (i.e. your progress)?  Let’s start by looking at the value of measuring.

The Value of Measuring
Ron Jeffries, one of the founding fathers of Agile, wrote an excellent summary of what he considers a truly valuable metric, RTF or Running Tested Features. Ron also wrote the following when asked about using metrics to produce agility.

What is the Point of the Project?
I'm just guessing, but I think the point of most software development projects is software that works, and that has the most features possible per dollar of investment. I call that notion Running Tested [Features], and in fact it can be measured, to a degree.

Imagine the following definition of RTF:

1. The desired software is broken down into named features (requirements, stories) which are part of what it means to deliver the desired system.

2. For each named feature, there are one or more automated acceptance tests which, when they work, will show that the feature in question is implemented.

3. The RTF metric shows, at every moment in the project, how many features are passing all their acceptance tests.

How many customer-defined features are known, through independently-defined testing, to be working? Now there's a metric I could live with.
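
To make the idea concrete, here is a minimal sketch of how RTF could be tallied; the Feature class, its fields, and the sample data are illustrative assumptions, not part of Jeffries’ definition.

    # A minimal sketch of computing RTF (Running Tested Features).
    # The data model below is a made-up illustration: each feature carries
    # the current pass/fail results of its automated acceptance tests.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Feature:
        name: str
        acceptance_test_results: List[bool] = field(default_factory=list)

        def is_running_tested(self) -> bool:
            # A feature counts toward RTF only if it has at least one
            # automated acceptance test and all of them currently pass.
            return bool(self.acceptance_test_results) and all(self.acceptance_test_results)

    def rtf(features: List[Feature]) -> int:
        """How many features are passing all their acceptance tests right now."""
        return sum(1 for f in features if f.is_running_tested())

    features = [
        Feature("login", [True, True]),
        Feature("search", [True, False]),  # a failing test: not counted
        Feature("export", []),             # no tests yet: not counted
    ]
    print(rtf(features))  # -> 1

Tracked at every moment in the project, as Jeffries suggests, this count should climb steadily; a flat or falling line is an early warning.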

Peter Hundermark suggests that the number of running automated tests is one such measure:

“Within limits, the more running (i.e. passing) automated tests a team has in place is a positive measure of quality. Beyond a certain level, this will cease to be true, but we have not yet met a team that has reached this point. (We hope to!).”

In addition, Hundermark observes the following about work in progress (WIP):

“Items (stories) in-progress is a productivity metric. It seeks to help a team track whether they are working collaboratively or not. The idea in an Agile team is for the whole team, as far as is reasonably possible, to collaborate on a single work item until it is ‘done’. This increases the rate of output, quality and cross-learning. It decreases the risk of unfinished items at the end of the Sprint, which results in waste.”

“Simply by tracking on a daily basis how many items the team has in-progress will make visible the extent to which they are collaborating. The chart tracks stories in-progress against days. It is agnostic of Sprint boundaries. It should trend towards 1 over time. Any value higher than 2 is cause for action by the Scrum Master.”
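
As a rough illustration of the daily tracking Hundermark describes, the sketch below records how many stories are in progress each day and flags any day above the suggested action threshold of 2; the counts are invented for the example.

    # A minimal sketch of daily work-in-progress tracking.
    # The counts below are made-up sample data.
    daily_wip = {1: 4, 2: 3, 3: 3, 4: 2, 5: 1}  # day of sprint -> stories in progress

    THRESHOLD = 2  # "any value higher than 2 is cause for action"

    for day, wip in sorted(daily_wip.items()):
        flag = "  <-- cause for action" if wip > THRESHOLD else ""
        print(f"day {day}: {wip} in progress{flag}")

    average_wip = sum(daily_wip.values()) / len(daily_wip)
    print(f"average WIP: {average_wip:.1f} (should trend toward 1 over time)")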

Performance measurement is an essential tool to help us evaluate our progress and diagnose where improvements are needed.

Using Velocity as a Measure
Velocity is the classic measure for an Agile product development team: how much work did the team complete in the last Sprint?

Back in school we learned that velocity is a measure of how far something travels in a given direction over a given time, hence Speed = Distance/Time (v = d/t, where d = distance and t = time).

Agile borrows the term velocity and uses it to measure the rate at which teams consistently deliver business value. To calculate velocity, simply add up the estimates of the features (user stories) successfully delivered in a Sprint.
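
As a simple illustration (the story names, point values, and data structure are invented for the example), the calculation amounts to the following:

    # A minimal sketch of calculating velocity: sum the story-point estimates
    # of the stories actually accepted as done in the sprint.
    sprint_stories = [
        {"story": "user login",      "points": 5,  "done": True},
        {"story": "password reset",  "points": 3,  "done": True},
        {"story": "audit reporting", "points": 13, "done": False},  # carried over
    ]

    velocity = sum(s["points"] for s in sprint_stories if s["done"])
    print(velocity)  # -> 8

    # The useful signal is the trend over several sprints, not any single number.
    history = [8, 11, 10, 12]
    print(sum(history) / len(history))  # rolling average, e.g. 10.25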

So the question is frequently asked, “Can we use Velocity to measure the productivity of a team?” Velocity, based on story points completed per iteration/sprint, is a very team-specific metric, partly because it is highly dependent on which developer is doing the work and thus on the developer’s level of domain-specific knowledge and technical skill. Additionally, Velocity should not be used to compare one team’s rate of getting stories done against another team’s rate.

Teams may also have a reason to artificially inflate their story point estimates. For example: are the story points for a specific story an 8 or a 13? Because this lies in the judgment of the team, if the team feels pressure to deliver as many story points as possible because they know their performance is measured by how many points they get done, then more than likely they will assign 13 story points.

A team’s Velocity from iteration/sprint to iteration/sprint can effectively be used to highlight the trend in the team’s ability to deliver stories over time and the point at which commercial or operational value is delivered, but other measures should also be used.

Measuring Your Alignment to Agile
Another way to look at measurement is the team’s understanding of, and adherence to, the Agile values and principles and to Scrum. Following are some standard measures, grouped by focus area, to assess your Agile product development and performance.

These are used at three defined intervals (the start of an iteration/sprint, the end of an iteration/sprint, and the end of a product release): at each one, the team answers the following questions (measures) by selecting one of five possible answers (metric).

Story Acceptance Criteria

  • The team understands the value of acceptance criteria?
  • Detailed acceptance criteria are developed for most of the User Stories?
  • User Stories are not completed until the acceptance criteria have been met?
  • Detailed acceptance criteria are available to development before coding begins?
  • Bugs are the direct result of failed acceptance criteria?

Concurrent Testing

  • The team understands the value of automating unit/functional tests?
  • Unit test cases are written prior to coding?
  • Unit test coverage improves with each iteration/sprint?
  • Unit test cases are automated?
  • Functional tests are created and automated from User Stories prior to coding? 

Continuous Integration

  • A single source repository is maintained?
  • Builds are automated?
  • Builds are self-tested?
  • Everyone commits to the mainline every day?
  • Every commit builds the mainline on an integration machine?
  • It is easy for anyone to get the latest executable?
  • Deployment to multiple environments is automated?

Managing Software Debt

  • The team gathers precise data on the defects they inject and remove?
  • Technical debt is controlled and goals are established to reduce it?

Iterative Planning

  • Within a release, iterations/sprints are planned for and User Stories are selected for developers to work on?
  • The iteration/sprint deadline is fixed and remaining User Stories are rolled into a future iteration/sprint?
  • The definition of “done” is agreed upon prior to iteration/sprint planning?
  • User Stories are testable, estimable, demonstrable, commercially or operationally value adding, cohesive, loosely coupled and fit into a single iteration/sprint?
  • Iterations/sprints are between one and three weeks in duration?
  • Stand-up meetings occur daily, and they are used to bring obstacles to light, not as a status meeting?
  • Developers break User Stories down into tasks based on the agreed upon definition of done?
  • The extended team, including the customer, does release planning?
  • The Product Owner prioritizes User Stories?
  • A retrospective meeting occurs at the end of each iteration/sprint?
  • A product review meeting occurs at the end of each iteration/sprint?

Learning & Adapting

  • Problems and needed improvements are identified in stand-up meetings?
  • Members of the team are committed to learning, via online learning, technical workshops, certifications, etc.?
  • The team displays a willingness to try new ideas that may be uncomfortable at first?
  • A root cause is identified for reported bugs, resulting in a corrective goal or action plan?
  • Members of the team help coach and support other teams looking for ways to improve?

Possible Answers to Use as Metric

  1. Disagree Completely / Hardly Ever
  2. Disagree Somewhat / Very Rarely
  3. Neutral / Some of the time
  4. Agree Somewhat / Most of the time
  5. Agree Completely / All of the time

The team can then interpret how they are doing and determine where they need to improve by periodically tracking the data trend.
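
One simple way to track that trend is to average the 1-5 answers per focus area at each interval and compare the intervals. The sketch below assumes the scores are held in an in-memory dictionary; the numbers themselves are made up for illustration.

    # A minimal sketch of interpreting the self-assessment: average the 1-5
    # answers per focus area at each interval. Focus-area names come from the
    # article; the scores are invented sample data.
    from statistics import mean

    assessments = {
        "sprint 1 start": {
            "Story Acceptance Criteria": [2, 3, 2, 2, 3],
            "Concurrent Testing":        [3, 2, 2, 3, 2],
        },
        "sprint 1 end": {
            "Story Acceptance Criteria": [3, 3, 3, 4, 3],
            "Concurrent Testing":        [3, 3, 2, 3, 3],
        },
    }

    for interval, areas in assessments.items():
        for area, scores in areas.items():
            print(f"{interval:>14} | {area}: {mean(scores):.1f}")

    # A rising average across intervals suggests improvement in that focus area;
    # a flat or falling trend points to where the next retrospective should look.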

Delivering Business Value
Fundamental to the adoption of agile-lean product development is delivering value early and often to the business and customer. The focus on using a value-based metric is built around one very simple idea: the primary objective of a project and the product is to deliver value. The more value we deliver, the more successful the project or product. Therefore, Velocity, story points, throughput, cycle time, capacity, code quality, etc., are useful only if they help us discover more effective ways of delivering commercial or operational value to the business and customer.

Business and customer value can be measured in many different ways; here are just a few:

  • Earned Business Value (EBV) - answers the business’s question “How much value is being provided?” by measuring how “done” we are from a business perspective
  • Earned Value Management (EVM) - a project management technique for measuring project progress in an objective manner. EVM combines measurements of scope, schedule, and cost in a single integrated system. When properly applied, EVM provides an early warning of performance problems. Additionally, EVM promises to improve the definition of project scope, prevent scope creep, communicate objective progress to stakeholders, and keep the project team focused on achieving progress
  • Net Present Value (NPV) - the difference between the present value of cash inflows and the present value of cash outflows. NPV is used in capital budgeting to analyze the profitability of an investment or project (see the sketch after this list)
  • Customer Satisfaction - a measure of how well products and services supplied by a company meet or surpass customer expectations. It is seen as a key performance indicator within business, focused on measures that answer the question “How do customers view us?”
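
For NPV, the arithmetic is simply each period’s expected cash flow discounted back to today and summed. Here is a minimal sketch; the investment, cash flows, and 10% discount rate are made-up figures for illustration.

    # A minimal sketch of Net Present Value: NPV = sum of CF_t / (1 + r)^t.
    def npv(rate, cash_flows):
        """cash_flows[0] is the initial outlay (negative); later entries are inflows."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    # Invest 100,000 now and expect 40,000 per year for four years, discounted at 10%.
    print(round(npv(0.10, [-100_000, 40_000, 40_000, 40_000, 40_000])))  # -> 26795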

Based on empirical knowledge, your measures and metrics should first and foremost help verify and validate the commercial and operational value delivered to the business and customer.

Delivering commercial and operational value early and often is at the heart of Agile product development. As a result, the business gets the best opportunity to beat the competition to market, satisfy the customer, realize revenue early, and discover insights to continuously improve development processes and the product.

Final Thought
General Dwight D. Eisenhower is often quoted as saying, “Plans are useless, but planning is indispensable.” Eisenhower also said, “If you fail to plan, you plan to fail.”

As you plan, if you do not know where you are, it is very difficult to get to where you want to go. As you plan your steps, continuous improvement in self, team, enterprise, process, and product should be part of your plan. Measurements and metrics help us determine where we are and guide us to where we want to go. It is important, though, to measure the things that matter and to refrain from measuring things that don't matter.

Deborah Hartman Preuss and Robin Dymond, in their article Appropriate Agile Measurement: Using Metrics and Diagnostics to Deliver Business Value, remind us that when designing measures and metrics we should consider not only when to use them, but also when to stop using them and how they can be gamed. That is a principle we should all remember.

Recommended Reading
The Business Value of Agile Software Methods: Maximizing ROI With Just-in-time Processes and Documentation, ISBN-13: 978-1604270310, Addison-Wesley Professional (October 6, 2009)

  • A comprehensive quantification of the benefits, from a business perspective, for decision makers who are looking to adopt these methods.

Applied Software Measurement: Global Analysis of Productivity and Quality, McGraw-Hill Osborne Media, 3rd edition (April 11, 2008), ISBN-13: 978-0071502443

  • Describes how to accurately size, estimate, and administer software projects with real-world guidance from an industry expert. Fully updated to cover the latest tools and techniques.

Calculating Earned Business Value for an Agile Project, Dan Rawsthorne, http://www.Agilejournal.com/articles/columns/column-articles/54-calculat...

  • Describes how to use a functional Work Breakdown Structure (WBS) as a framework for calculating business metrics (BV, EBV).

Management 3.0 – Leading Agile Developers, Developing Agile Leaders, Jurgen Appelo, ISBN-13: 978-0321712479, Addison-Wesley Professional, 1 edition (January 7, 2011)

  • This book is very pragmatic, combining solid research with Appelo’s insights grounded in modern complex systems theory and reflecting the complexity of modern software development.

About the Author Russell Pannone is the Founder of We Be Agile and the Agile Lean Phoenix User Group, as well as the Agile-Lean Adoption Lead and Coach at US Airways and Editor-In-Chief of the 
With almost thirty years of system-software development and delivery experience, my focus is on working side by side with folks on real projects, helping them deliver valuable system-software to production early and often, and giving those I collaborate with the best opportunity to beat the competition to market, realize revenue, and discover insights we can use to help us improve.
