Looking for What's Not There

Summary:

This column asks the all-important question, "What isn't there that should be?" The same idea for spotting black holes also applies to spotting "holes in designs and requirements." For example, there are often connections between the quantity of bugs filed against an area and whether the area is thoroughly tested. There can also be holes in what KINDS of bugs have been reported. Hendrickson lays out the territory for the search and goes on to suggest how to "look for where there's a lot of nothing."

"Dad, how do they find black holes?" I asked this question many years ago while walking with my father on a clear winter night, admiring the stars. Dad looked down at me, hands stuffed into his pockets, our breath visible in the cold air, and smiled. "Well, kiddo, since they can't see what's not there, they look for where there's a lot of nothing."

The idea stuck: find something by watching for nothing. It's how I look for holes in designs and requirements. If nothing in the design says anything about security, no one is thinking about it yet. If there are no requirements from a given stakeholder, chances are that stakeholder has not been represented in the requirements process.

It's also one way that I monitor the testing effort. If an area has no bugs filed against it, it's an indication that it hasn't been tested. Of course, just because a lot of bugs have been filed about a given area doesn't mean that it is well tested. So I also look at the kinds of bugs that have been filed. Do they get to the heart of the functionality or are they all superficial? Do any bugs involve bad input data, buffer overruns, or long path names? I'm looking for an indication that there's a hole in our testing—kinds of bugs that haven't been filed yet.

Looking for the absence of information isn't easy. You have to know what you expect to see before you can notice that it is missing. That means that you need a mental list of bug categories you might expect to find in the software under test: simultaneous user problems, data corruption bugs, timing issues, etc. Your list depends on what your software does.

You also need to know, in detail, what has already been seen. If you survey the bugs that have been found to date—reading them rather than counting—you can look for patterns in the testing that led to finding the bugs. Just counting isn't enough. What's missing doesn't always fit neatly into a drop-down list in the bug tracking system.

As the project progresses, looking for patterns of missing bugs becomes more difficult. The more bugs that have been filed, the more difficult it is to spot the areas with a dearth of bugs. And yet this is the time when it is critical. You're almost done. Soon, someone along Executive Row will start asking why you haven't shipped yet. The next thing you know, the software is released and/or posted to the Web. That's not the time to find out that no one tried searching the catalog in two different Web browsers simultaneously.

One way to look for testing holes is to tabulate bugs on a matrix as they're filed in the bug tracking system. Along one side, divide the software under test into areas. Then list categories of tests that apply to all areas along the top. For example, if you are testing an editing program, the areas along the side might include Draw Tool, Text Tool, Insert Picture, Printing, etc.
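To make the matrix idea concrete, here is a minimal sketch in Python (the column doesn't prescribe any tool, so the function name, the data shape, and the sample lists below are my own illustration). It assumes you can export bugs from your bug tracking system as records with an "area" and a test "category" field; the areas come from the editing-program example above, and the categories echo the kinds of bugs mentioned earlier. Adapt both lists to your own product.

```python
from collections import Counter

# Areas of the software under test (rows of the matrix) and test
# categories that apply to every area (columns). Both lists are
# examples only; replace them with your own.
AREAS = ["Draw Tool", "Text Tool", "Insert Picture", "Printing"]
CATEGORIES = ["Bad input data", "Data corruption", "Timing", "Simultaneous users"]


def find_testing_holes(bugs):
    """Tabulate filed bugs by (area, category) and report empty cells.

    `bugs` is assumed to be an iterable of dicts with "area" and
    "category" keys, e.g. exported from the bug tracking system.
    """
    counts = Counter((bug["area"], bug["category"]) for bug in bugs)
    holes = [
        (area, cat)
        for area in AREAS
        for cat in CATEGORIES
        if counts[(area, cat)] == 0  # no bugs filed in this cell yet
    ]
    return counts, holes


if __name__ == "__main__":
    # A few hypothetical filed bugs, purely for illustration.
    filed = [
        {"area": "Draw Tool", "category": "Bad input data"},
        {"area": "Printing", "category": "Timing"},
    ]
    counts, holes = find_testing_holes(filed)
    for area, cat in holes:
        print(f"No bugs filed yet: {area} / {cat}  <- possible testing hole")
```

An empty cell doesn't prove the area is untested, of course; it just tells you where to go look for a lot of nothing.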

About the author

Elisabeth Hendrickson

The founder and president of Quality Tree Software, Inc., Elisabeth Hendrickson wrote her first line of code in 1980. Moments later, she found her first bug. Since then Elisabeth has held positions as a tester, developer, manager, and quality engineering director in companies ranging from small startups to multi-national enterprises. A member of the agile community since 2003, Elisabeth has served on the board of directors of the Agile Alliance and is a co-organizer of the Agile Alliance Functional Testing Tools program. She now splits her time between teaching, speaking, writing, and working on agile teams with test-infected programmers who value her obsession with testing. Elisabeth blogs at testobsessed.com and can be found on Twitter as @testobsessed.
