Seven Deadly Sins of Software Reviews

Summary:

This article describes seven common ways that technical reviews go wrong. Symptoms of each problem and suggestions for getting the review back on track are also presented.

A quarter-century ago, Michael Fagan of IBM developed the software inspection technique, a method for finding defects through manual examination of software work products by a group of the author's peers. Many organizations have achieved dramatic results from inspections, including IBM, Raytheon, Motorola, Hewlett-Packard, and Bull HN. However, other organizations have had difficulty getting any kind of software review process going. Considering that effective technical reviews are one of the most powerful software quality practices available, all software groups should become skilled in their application.

This article describes seven common problems that can undermine the effectiveness of software reviews of any type (inspections being a specific type of formal review). I describe several symptoms of each problem, and I suggest several possible solutions that can prevent, or correct, the problem. By laying the foundation for effective software technical reviews and avoiding these common pitfalls, you too can reap the benefits of this valuable quality practice.

Participants Don't Understand the Review Process
Symptoms: Software engineers don't instinctively know how to conduct and contribute to software reviews. Review participants may have different understandings of their roles and responsibilities, and of the activities conducted during a review. Team members may not know which of their software work products should be reviewed, when to review them, and what review approach is most appropriate in each situation.

Team members may not understand the various types of reviews that can be performed. The terms "review," "inspection," and "walkthrough" are often used interchangeably, although they are not the same beast. A lack of common understanding about review practices can lead to inconsistencies in review objectives, review team size and composition, forms used, recordkeeping, and meeting approaches. Too much material may be scheduled for a single review, because participants are not aware of realistic review rates. It may not be clear who is running a review meeting, and meetings may lose their focus, drifting from finding defects to solving problems or challenging the author's programming style. These points of confusion typically result in missed defects, frustration, and an unwillingness to participate in future reviews.
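To make the review-rate point concrete, here is a minimal planning sketch in Python. The rates are illustrative assumptions only, loosely in the range often cited in the inspection literature (a few pages, or one to two hundred source lines, per meeting hour); calibrate them against your own review data before relying on them.

    # Minimal sketch: estimate how many meetings a review will take, given an
    # ASSUMED review rate. The rates and meeting cap below are illustrative
    # assumptions, not prescriptions; substitute figures your own data supports.
    import math

    ASSUMED_RATE_PER_HOUR = {
        "code": 175,       # source lines examined per meeting hour (assumption)
        "document": 4,     # pages examined per meeting hour (assumption)
    }
    MAX_MEETING_HOURS = 2  # keep any single review meeting to about two hours

    def meetings_needed(size: int, kind: str) -> int:
        """How many meetings of at most MAX_MEETING_HOURS are needed to cover
        `size` units (lines or pages) of the given kind of work product."""
        capacity = ASSUMED_RATE_PER_HOUR[kind] * MAX_MEETING_HOURS
        return math.ceil(size / capacity)

    if __name__ == "__main__":
        print(meetings_needed(1200, "code"))    # a 1,200-line module -> 4 meetings
        print(meetings_needed(30, "document"))  # a 30-page spec -> 4 meetings

The point of such a calculation is simply to keep any one meeting from being overloaded; several shorter reviews beat one marathon session.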

Solutions: Training is the best way to ensure that your team members share a common understanding of the review process. For most teams, four to eight hours of training will be sufficient, though you may wish to obtain additional specialized training for those who will play the role of moderator in formal inspections. Training can be an excellent team-building activity, as all members of the group hear the same story on some technical topic and begin with a shared understanding and vocabulary.

Your group should also adopt some written procedures for how reviews are to be conducted. These procedures will help review participants understand their roles and activities, so they can consistently practice effective and efficient reviews. Your peer review process should include procedures for both formal and informal reviews. Not all work products require formal inspection (though inspection is indisputably the most effective review method), so a palette of procedural options will let team members choose the most appropriate tool for each situation. Adopt standard forms for recording issues found during review meetings, and for recording summaries of the formal reviews that were conducted. Good resources for guidance on review procedures and forms are Software Inspection Process by Robert Ebenau and Susan Strauss (McGraw-Hill, 1994) and Handbook of Walkthroughs, Inspections, and Technical Reviews, 3rd Edition, by Daniel Freedman and Gerald Weinberg (Dorset House, 1990).
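As a concrete illustration of the kind of information such forms typically capture, here is a minimal sketch in Python. The field names and category values are assumptions for illustration; they are not taken from the books cited above, so adapt them to whatever forms your review procedure adopts.

    # Hypothetical sketch of a review issue log and summary record. Field names
    # and category values are illustrative assumptions; adapt them to your own
    # review forms.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ReviewIssue:
        location: str         # e.g., "parser.c, line 112" or "SRS section 3.2"
        description: str      # what the reviewer observed, stated neutrally
        severity: str         # e.g., "major" or "minor"
        status: str = "open"  # disposition, tracked after the meeting

    @dataclass
    class ReviewRecord:
        work_product: str     # what was reviewed
        review_type: str      # e.g., "inspection" or "walkthrough"
        participants: List[str] = field(default_factory=list)
        issues: List[ReviewIssue] = field(default_factory=list)

        def summary(self) -> str:
            """One-line summary suitable for a formal review report."""
            majors = sum(1 for i in self.issues if i.severity == "major")
            return (f"{self.review_type} of {self.work_product}: "
                    f"{len(self.issues)} issues found ({majors} major)")

Even a record this simple enforces a useful discipline: every issue gets a location, a neutral description, and a tracked disposition, which supports the consistent recordkeeping described above.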

Reviewers Critique the Producer, Not the Product
Symptoms: Initial attempts to hold reviews sometimes lead to personal assaults on the skills and style of the author.


About the author

Karl E. Wiegers

Karl Wiegers, Ph.D., is the Principal Consultant at Process Impact in Portland, Oregon, and the author of Software Requirements (Microsoft Press, 1999) and Creating a Software Engineering Culture (Dorset House, 1996). You can reach him at www.processimpact.com. You can find more than 35 of Karl's articles at www.processimpact.com/pubs.shtml.
