Hidden Messages

Summary:

A defect management system contains data such as how many defects have been raised, the priority and severity of individual defects, and even who is raising them. This information is regularly used by program and test management to guide decision making. In this article, Dan Minkin shows that an experienced test manager can gather useful information by looking beyond the defect management system's raw numbers.

As a defect manager, or more specifically, as a test manager in charge of defects, I was never so popular as when I started to use a whiteboard to keep up to date with the latest found and fixed defects. Suddenly senior program management knew who I was and what I did. Every day they'd ask how many new defects had been raised, what the priorities were, which ones were critical, and even who was raising them, all data I could readily provide from my whiteboard. As the numbers went up, there would be more concern; as they came down, less. But as time passed, I realized that a lot of the useful information found in the defect management system couldn't be reported using metrics. These were the hidden messages that were important to decipher.

Defect Description and Developer Comments
Much of what turned out to be useful information was found in the defect descriptions and the subsequent developer comments. What mattered was how it was written, not what the information was. While some testers would report defects in step-by-step detail, including preconditions and all relevant data, others might simply state that the system had crashed or that a calculation was wrong. Everybody, it seemed, developed their own style.

The large variety in styles used to report a defect told me a number of things. It told me who the experienced and confident testers were and who was less so. It told me which technical areas caused frustration, which became apparent in the language ("surely the developers can see...", "this is the third time...") and, in so doing, added to the weight of evidence that certain functional areas were high risk.

The developers' replies and the ensuing debate also told me something about the relationship between the developers and testers. Again, the language defined much of this, providing clear evidence that relations weren't as good as they could have been. The number of replies, often three, four, or five from each side, indicated either that the defect resolution process wasn't working and that improved contact between development and the test team was needed, or that there was some confusion about what the requirements were. As it happened, both were true.

I had always been quite happy that program management was committed to using a test management tool. Management had invested a sizeable amount of money in buying the tool and extra defect module licenses. But was this commitment reflected by the testers and developers? I asked myself whether the defects were being kept up to date, whether descriptions and replies were too technical or, conversely, too business focused, and whether the suggested approach to defect description was actually being followed or merely given lip service. Were all the fields correctly filled in, or were the defaults just being taken? What became clear was that some people revelled in the use of the tool and used it to its fullest capabilities, while others, used to different ways of working, needed coaching and encouragement. This was hardly a groundbreaking revelation, but it was a useful one nonetheless. It subsequently led to the discovery that those who didn't relate to the tool often struggled with other elements of the formal testing process, such as test specifications.
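Some of these checks could, in principle, be scripted against an export of the defect records. The Python sketch below is purely illustrative and is not something I used at the time; the field names, default values, and reply threshold are all assumptions to be adapted to whatever your own defect management tool exports.

# Hypothetical sketch: scan exported defect records for two of the
# "hidden message" signals discussed above. Field names, default values,
# and the threshold are invented for illustration.

DEFAULT_VALUES = {
    "severity": "3 - Medium",        # assumed tool defaults
    "component": "Unassigned",
    "environment": "Not specified",
}
REPLY_THRESHOLD = 4  # long back-and-forth suggested trouble above

def flag_defect(defect):
    """Return human-readable warnings for one defect record (a dict)."""
    warnings = []
    defaulted = [field for field, default in DEFAULT_VALUES.items()
                 if defect.get(field) == default]
    if defaulted:
        warnings.append("fields left at defaults: " + ", ".join(defaulted))
    replies = defect.get("replies", [])
    if len(replies) >= REPLY_THRESHOLD:
        warnings.append("%d replies - possible developer/tester friction "
                        "or unclear requirements" % len(replies))
    return warnings

if __name__ == "__main__":
    sample = {
        "id": "DEF-101",
        "severity": "3 - Medium",
        "component": "Billing",
        "environment": "Not specified",
        "replies": ["dev", "test", "dev", "test", "dev"],
    }
    for warning in flag_defect(sample):
        print(sample["id"] + ": " + warning)

The point, of course, is that such a script would only surface candidates; reading the language in the flagged defects is still the manager's job.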

Defect History
The history of a defect sometimes gave me interesting information. Some defects had extensive histories detailing status and ownership changes, and some had been open for several months. Once, when trying to track down why a given defect had been closed months before (with no explanation),

About the author


Dan Minkin is a testing consultant with Certeco in the United Kingdom. A test and project manager for the past ten years, he focuses on the practical and pragmatic application of testing and test management.
