Do Your Inspections Work?

Summary:
Software inspections are meant to uncover defects and save considerable project effort and cost. But how do you know if your inspections are cost-effective compared to testing and other quality activities? Can you even tell if inspections pay for themselves? In this week's column, Karl Wiegers outlines three ways to measure your inspection efforts.

Inspections are a great way to find defects in software work products, but how well do your inspections work? Do they actually reveal defects? Are they cost-effective, compared to testing or other quality approaches? Do they pay for themselves? The only way to answer such questions is to collect and analyze some data from your inspections.

Assessing Inspection Impact
Three measures of inspection impact are effectiveness (the percentage of the defects present in a work product that your inspections discover), efficiency (the average effort required to find a defect), and return on investment (the time your organization saves from inspections compared to their cost).

Calculating effectiveness requires that you count the defects found by inspection, those discovered in later development or testing stages, and those that customers report. If you know your inspection effectiveness, you can estimate how many defects remain in a deliverable following inspection.
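As a sketch (the function and variable names here are illustrative, not from the article), effectiveness is simply the fraction of all known defects that your inspections caught:

```python
def inspection_effectiveness(found_by_inspection, found_later, found_by_customers):
    """Fraction of all known defects that inspections discovered.

    found_later: defects found in subsequent development or testing stages.
    found_by_customers: defects reported from the field.
    """
    total_defects = found_by_inspection + found_later + found_by_customers
    return found_by_inspection / total_defects

# Example: 45 defects found by inspection, 12 in testing, 3 by customers.
print(inspection_effectiveness(45, 12, 3))  # 0.75
```

Note that this number is always an estimate: defects still latent in the product have not been counted yet, so effectiveness looks better early and degrades as field reports arrive.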

Efficient inspections discover many defects per hour of effort. There's a paradox here, though. A successful inspection program leads to process improvements that reduce the number of mistakes that developers make. So, as your product quality improves, the cost of discovering each defect will increase. You need to judge whether decreasing inspection efficiency metrics truly indicate higher product quality or if they mean that your inspections are not working as well as they should.

Return on investment is the net savings from your inspections divided by the detection cost. The net savings are the costs that your team avoided by finding defects early, rather than late. If you know the average cost of finding and fixing a defect during system test or in operation, you can estimate the potential savings from each defect found by inspection. Detection cost includes the effort spent on each inspection plus the overhead costs of your organization's inspection program.
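A minimal sketch of this calculation, assuming savings are estimated as the number of defects found times the average cost of a late fix (the parameter names are mine, not a standard):

```python
def inspection_roi(defects_found, avg_late_fix_cost, detection_cost):
    """ROI of an inspection, in the article's sense: savings / detection cost.

    avg_late_fix_cost: average cost (e.g., labor hours) of finding and fixing
        one defect during system test or in operation.
    detection_cost: inspection effort plus program overhead, same units.
    A result above 1.0 means the inspection saved more than it cost.
    """
    savings = defects_found * avg_late_fix_cost
    return savings / detection_cost

# Example: 10 defects found; a late fix averages 8 hours;
# the inspection consumed 20 hours in total.
print(inspection_roi(10, 8, 20))  # 4.0
```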

Inspection Metrics
The basic dimensions of software measurement are size, time, effort, and quality. The following data items will help you assess your inspection effectiveness, efficiency, and ROI.

Size.Planned: lines of code or document pages planned for inspection
Size.Actual: lines of code or document pages actually inspected
Time.Meeting: duration of the inspection meeting
Effort.Planning: total labor hours spent on planning, scheduling meetings, assembling the inspection package, and the like
Effort.Overview: total labor hours spent on the overview stage
Effort.Preparation: total labor hours spent on individual preparation
Effort.Meeting: total labor hours spent in the inspection meeting
Effort.Rework: total labor hours spent correcting and verifying defects
Defects.Found.Major: number of major defects found
Defects.Found.Minor: number of minor defects found
Defects.Corrected.Major: number of major defects corrected during rework
Defects.Corrected.Minor: number of minor defects corrected during rework
Number.of.Inspectors: number of people who participated in the inspection meeting

You can calculate several useful metrics from these data items; some are listed below. To jump-start your inspection data storage and metric calculations, use the simple spreadsheet available from www.processimpact.com/pr_goodies.shtml.

Defect.Density: number of defects found per unit of material inspected
Effort.Inspection: total labor hours expended on the inspection
Effort.per.Defect: average total labor hours expended to find a defect
Effort.per.Unit.Size: average labor hours expended to inspect a document page or a thousand lines of code
Rate.Inspection: average quantity of material inspected per meeting hour
Rate.Preparation: average quantity of material covered per labor hour of preparation
Rework.per.Defect: average labor hours needed to correct and verify a defect
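To make the arithmetic concrete, here is a sketch that derives several of these metrics from the raw data items. The field names and numbers are illustrative, not a standard record layout:

```python
# Raw data items from one hypothetical code inspection.
inspection = {
    "size_actual_loc": 400,      # lines of code actually inspected
    "meeting_hours": 1.5,        # duration of the inspection meeting
    "effort_planning": 1.0,      # labor hours
    "effort_overview": 0.0,
    "effort_preparation": 6.0,
    "effort_meeting": 7.5,       # e.g., 5 inspectors x 1.5 hours
    "effort_rework": 4.0,
    "defects_major": 6,
    "defects_minor": 14,
}

defects = inspection["defects_major"] + inspection["defects_minor"]

# Effort.Inspection: detection effort (rework is usually tracked separately).
effort = sum(inspection[key] for key in
             ("effort_planning", "effort_overview",
              "effort_preparation", "effort_meeting"))

defect_density = defects / (inspection["size_actual_loc"] / 1000)   # per KLOC
effort_per_defect = effort / defects                                # hours
inspection_rate = inspection["size_actual_loc"] / inspection["meeting_hours"]
rework_per_defect = inspection["effort_rework"] / defects

print(defect_density, effort_per_defect, inspection_rate, rework_per_defect)
```

Whether rework effort belongs in the detection cost is a local convention; the important thing is to compute each metric the same way on every inspection so the trends are comparable.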

Data Analysis
As a general metrics guideline, if you don't plan to analyze the data, don't waste time collecting it. You can analyze data accumulated from multiple inspections in several ways:

  • Track averages, such as the number of lines of code inspected per hour (an inspection that goes much faster than average probably missed some defects).
  • Correlate pairs of metrics, such as defect density and preparation time (slower preparation generally finds more defects).
  • Use statistical process control to monitor key parameters, such as defect density (an unusually low defect density could indicate that inspectors missed some defects).
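The third bullet can be sketched with a simple control chart: compute 3-sigma limits from historical inspections and flag any new inspection that falls outside them. The numbers below are invented for illustration:

```python
from statistics import mean, stdev

# Defect densities (defects per KLOC) from past inspections, plus one new one.
history = [42.0, 55.0, 48.0, 51.0, 46.0, 58.0, 44.0]
new_density = 12.0

# 3-sigma control limits from the historical baseline.
m, s = mean(history), stdev(history)
lower, upper = m - 3 * s, m + 3 * s

if not (lower <= new_density <= upper):
    print(f"Out of control: {new_density:.1f} defects/KLOC "
          f"(limits {lower:.1f} to {upper:.1f})")
```

An unusually low density, as in this example, does not prove the work product was clean; it is a signal to ask whether the inspectors prepared adequately or rushed the meeting.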

Inspection data can also indicate whether you are catching defects in the same lifecycle phase in which they were created. Because the cost of dealing with defects in subsequent stages increases rapidly, your goal should be 100 percent defect containment, with none leaking into downstream work products.

Some Measurement Caveats
The prime directive of software measurement is that a manager must neither reward nor punish individuals for their metrics results. The first time practitioners are punished for data they reported is the last time the manager will get accurate data. Aggregate the data from multiple inspections to monitor averages and trends in your inspection process without compromising the privacy of individual authors.

Beware of measurement dysfunction, which arises when the measurement process leads to counterproductive behaviors. Forms of inspection measurement dysfunction include defect severity inflation or deflation, and distorted defect densities, preparation times, and defect discovery rates. Inspectors who are rated on how many defects they find will report many defects, even if it means debating whether every small issue truly is a defect. Authors whose performance evaluation depends on how many defects inspectors find in their products will avoid inspections, fudge the data, or spend excessive time perfecting a product before inspecting it.

Your Bottom Line
I don't know how many defects your inspectors will find. However, many organizations have saved considerable effort through software inspections, which provides an excellent counterargument to resisters who fear that inspections will slow the project down. The ROI need only be a little higher than 1.0 to make inspections worth doing.
