If your reviewers can participate only at different times and in different places, or even at different times in the same location, use asynchronous review approaches. The simplest such method is a peer deskcheck, in which the author asks one colleague to look at a work product. A peer deskcheck depends entirely on the single reviewer’s knowledge, skill, and self-discipline, so expect wide variability in the results. A passaround is a multiple, concurrent peer deskcheck, with several reviewers invited to provide input. As an alternative to distributing physical copies of the document, you can place an electronic copy in a shared file. Reviewers can then provide their feedback in the form of document annotations, such as Microsoft Word comments or PDF notes.
Asynchronous reviews address some of the potential shortcomings of traditional peer reviews. These include insufficient preparation prior to the meeting, personality conflicts, and meetings that segue into problem solving or veer off on other tangents. The author should expect to spend some time following up on comments made by specific reviewers, either face-to-face if geography permits or by telephone if it does not.
Asynchronous reviews have their own shortcomings. Because participants contribute input over a period of time, asynchronous reviews can take several days to complete. Some volunteers won’t find the time or motivation to contribute to an asynchronous review at all. In addition, asynchronous reviews lack the physical meeting that focuses the participants’ attention and stimulates the synergy that enhances defect discovery. Some people don’t bother to contribute when they see that someone else has already responded, and the initial contributors can set the direction of the discussion if their comments are visible to all participants from the beginning.
Several collaborative tools can enhance asynchronous (or even traditional) reviews, although few are commercially available. ReviewPro from Software Development Technologies (http://www.sdtcorp.com/reviewpr.htm) provides many features to support both asynchronous and concurrent reviews, including a threaded discussion feature to let reviewers comment on issues that are raised.
Philip Johnson and his colleagues developed the Collaborative Software Review System (CSRS), available under the GNU General Public License (see Johnson’s "Design for Instrumentation: High Quality Measurement of Formal Technical Review," Software Quality Journal, March 1996). Used in conjunction with a review approach called FTArm (Formal, Technical, Asynchronous Review Method), CSRS first allows reviewers to raise private issues about the item being reviewed. Next, the tool permits them to view, respond to, and vote on issues and action proposals contributed by other reviewers. Tools such as CSRS capture more details of discussions and the thought process behind them than a recorder can note during a traditional fast-moving review meeting.
Engaging review participants in different locations or at different times is challenging. However, the benefits that distributed and asynchronous peer reviews provide to collaborative software projects make them worth trying when the reviewers cannot meet in person.