Speaking to Your Business Using Measurements


Test Velocity
Test velocity is how fast testing is moving along. I have a hard time wrapping my head around this one. There are a few ways people like to measure it, such as test cases run per period of time or stories completed over a period of time, but testing is so tightly woven into the development process that I have a hard time thinking in terms of test time alone. Folks into lean sometimes think of test velocity with the takt concept. Here are some problems with measuring test velocity: tests take different amounts of time to run, so velocity isn't a consistent measure, and the measure is further skewed by activities like data setup, bug investigation, and reporting.
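To make that inconsistency concrete, here is a minimal sketch with made-up numbers: the same team spends the same hours each sprint, but the "velocity" figure swings wildly simply because the tests differ in size. All sprint names and values are hypothetical, for illustration only.

```python
# Hypothetical data: same team, same 80 hours of effort each sprint.
# Only the granularity and duration of the tests differ.
sprints = {
    "sprint_1": {"tests_run": 40, "hours_spent": 80},  # mostly quick checks
    "sprint_2": {"tests_run": 12, "hours_spent": 80},  # long end-to-end tests
    "sprint_3": {"tests_run": 25, "hours_spent": 80},  # mix, plus data setup
}

for name, s in sprints.items():
    # "Velocity" as tests per hour -- the number changes even though
    # the actual testing effort is identical across sprints.
    velocity = s["tests_run"] / s["hours_spent"]
    print(f"{name}: {velocity:.2f} tests/hour")
```

Run it and the "velocity" ranges from 0.15 to 0.50 tests per hour for identical effort, which is exactly the consistency problem described above.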

Can you share a few more ways to find validity problems with test velocity?

The term reliability describes how consistent the results of a measurement are. This book categorizes reliability into three types: quixotic, diachronic, and synchronic. Quixotic reliability applies readily to test measures. Measures with quixotic reliability are unvaryingly consistent, but trivial and misleading.

Here are a couple of metrics that suffer from reliability issues:

Number of Test Cases
The number of test cases is a measure with lots of reliability problems. Here are a couple to think about: test case counts can be gamed to inflate the numbers, and how do you count tests that aren't documented in a traditional way? What other ways might counting test cases be unreliable?
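The gaming problem can be shown in a few lines. This is a hypothetical illustration, not a real test suite: the same login check is reported once as a single test case and once as three, with identical coverage either way.

```python
def count_as_one():
    # One test case that exercises three login inputs.
    return [("login", ["valid", "invalid", "locked-out"])]

def count_as_three():
    # The very same inputs, reported as three separate test cases.
    return [
        ("login-valid", ["valid"]),
        ("login-invalid", ["invalid"]),
        ("login-locked", ["locked-out"]),
    ]

# Identical testing, very different "number of test cases".
print(len(count_as_one()))    # 1
print(len(count_as_three()))  # 3
```

Nothing about the product or the testing changed between the two reports; only the bookkeeping did, which is why the raw count is so easy to inflate.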

You probably noticed that most of the solutions I like for measurement problems aren't actually measurement. A man named Taiichi Ohno had huge success with this approach at a company he worked for in post-World War II Japan.

Ohno spent a considerable amount of time talking and working side by side with factory workers and customers. This helped him quickly learn what was and wasn't working and make immediate changes. His work reshaped how the manufacturing world thought about business.

You may have heard of that company—it’s called Toyota. I'd love for you to try some of these ideas out and tell me about how they work for you!


User Comments

Allan Mosak

Number of test cases and Number of test cases passed are both meaningless metrics.  What is key is how effective the test cases are - in other words, what percentage of the application's requirements are being tested, and what percentage of the application's requirements have been shown to function properly.  As often as not, testers perform thousands of automated tests, most of which pass, without having any idea what percentage of requirements they have covered.  In most cases, coverage is pitifully low - usually less than 50%.

March 19, 2014 - 5:17pm
Justin Rohrman

Can't coverage be equally meaningless though? As far as I understand, coverage is only meaningful when you are talking about what you are covering. Requirements could be an example of that, method coverage could be another. The problem I see with these is that they tell you very little about the testing that was done. In measuring requirements coverage you know that a requirement was tested somehow but you don't know that the testing was meaningful or useful.

I do like using coverage models to show that something important may have been missed. 

March 19, 2014 - 9:43pm

About the author

Justin Rohrman

Justin Rohrman is a software tester living in Nashville, TN. He currently serves on the board of Nashville Association of Software Quality Professionals (NASQP), instructs BBST classes for the Association for Software Testing, and facilitates sessions for Weekend Testing Americas. You can get in touch with him via email at rohrmanj@gmail.com, on twitter @justinrohrman, or on his website at http://justinrohrman.com.
