This past summer, Dot Graham and I, Rob Sabourin, were working together in the hallowed offices of Grove House in the picturesque hamlet of Macclesfield, England. I was sharing some recent research and task-analysis experiences in the area of agile project collaboration. In many of the successful agile teams I work with, testers collaborate frequently and directly with programmers, business analysts, customers, and other team members.
I collected many examples by interviewing team members and by observing collaboration in action. I documented each collaboration story. My goal was to build resources to help teach collaboration over and above the generic warm-and-fuzzy team-building approaches. I wanted to look deeply into how collaboration actually takes place.
As I started to share collaboration stories, Dot observed that several aspects of agile collaboration bore a striking similarity to a team-based collaboration technique known as "software inspection." In software inspection, a small group of individuals works together to identify defects, weaknesses, and potential improvements in any software development work product. Software inspections have been used since the late 1970s and can be implemented independently of the lifecycle model in use. I have been implementing software inspections since the 1980s, and Dot's book, Software Inspection, co-authored with Tom Gilb, has been an important inspiration and guide.
Dot has taught software inspection techniques to inspection moderators and inspectors for many years, more recently focusing on a lean version of the process called "agile inspection." However, interest in inspection seems to have declined in recent years, or at least people aren't admitting to doing it. It is no longer an attractive topic at conferences or discussed much in blogs, magazines, or forums.
We both feel that it is a real shame that these techniques, which are extremely effective, have been abandoned. What is the reason for this? Have these techniques just "gone out of fashion," or have they stopped working?
The agile story in this article is just one example. The story is real, with the company and context sanitized to protect the innocent. Note that the practices described are imperfect and include adaptations that may vary from recommended agile practices or strict adherence to the Agile Manifesto or agile guiding principles.
I present the story with our comments in italics. Dot comments on similarities to recommendations from the "ancient wisdom" but also highlights possible dangers—lessons learned from the past that can help you in the future.
A Software Review Story
Sunsoft is the world leader in its product market. It has built this leadership position through rich product innovations and many strategic corporate acquisitions. Its products run on high-end workstations and desktops, and typical development projects involve adding new capabilities to existing product families.
Sunsoft products have been on the market for more than ten years. They are built on a large body of legacy code that has an eclectic history, is poorly documented, and is very difficult to maintain. Small changes to this code frequently introduce regression bugs that are difficult to identify during development sprints. Implementing new features often requires modifying or completely refactoring existing code. Sunsoft does not have automated regression testing of legacy features. Automated unit and story tests are being created for new features but not for legacy enhancements.
Sunsoft implements a variation of Scrum. Each product has several feature teams. Each feature team includes a ScrumMaster, designer, test lead, development lead, and documentation lead. Each team also includes a mix of developers, testers, and writers. Team size does not exceed ten members.
Sprints are three weeks long. Products experience a beta release cycle of about two