Note: This material will appear in the forthcoming book, Agile Adoption Patterns by Amr Elssamadisy (ISBN 0321514521, Copyright: Pearson Education). The material is being provided by Pearson Education at this early stage to create awareness for this upcoming book (due to be published in July 2008). It has not been fully copyedited or proofread yet; we trust that you will judge the content on technical merit, not on grammatical and punctuation errors that will be fixed at a later stage.
The Done State practice is a definition that a team agrees upon to unambiguously describe what must take place for a requirement to be considered complete. The done state is the goal of every requirement in an iteration. It is as close to deployable software as the team can come.
Defining and adhering to a done state directly affects time to market and visibility. The closer you come to deployable software, the more confidence you have in your progress and the less work remains before you can release. Cost is reduced because you pay for defect fixes early. Conversely, the further your team's definition of done state is from deployable software, the riskier your estimates become: you are less confident of your progress, and you pay more in time and effort to correct defects.
Initially, the team agreed on a done state that included all automated developer tests passing and acceptance tests run manually by Aparna and Cathy and verified by the testing team. The first iteration had many uncompleted stories because development finished only a short time before the end of the time box, and Aparna and Cathy found several defects. The team wanted to count the stories as 80 percent done. Caleb strongly discouraged this and was able to convince the team not to do so, even though that left them at only 20 percent completion. The completion percentage was discouraging, and Caleb played cheerleader to keep spirits high. Over the next two iterations, developers completed the stories in sequence instead of trying to do all of them at once. This resulted in stories being completed earlier, which left enough time for the feedback cycle with testing. The team averaged about 85 percent completion over the next few iterations.
Then, when the team picked up functional testing and started writing executable acceptance tests at the beginning of each iteration (test-driven requirements), the completion rate shot up to very close to 100 percent because developers were able to fully test the requirements at their desks. The done state was changed from passing only the automated developer tests to passing both the automated developer tests and the functional tests.
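The idea of a done state as an all-or-nothing gate can be sketched in code. The following is a minimal, hypothetical illustration (the `Story` and `DoneState` names and the two criteria are assumptions for this sketch, not anything from the book): a story counts toward the completion rate only when every agreed criterion passes, so a story that is "80 percent done" contributes nothing.

```python
# Hypothetical sketch: a team's done state expressed as explicit,
# checkable criteria rather than a verbal agreement.
from dataclasses import dataclass, field


@dataclass
class Story:
    name: str
    developer_tests_pass: bool = False
    functional_tests_pass: bool = False


@dataclass
class DoneState:
    # Each criterion is a (label, predicate) pair the team agreed on.
    criteria: list = field(default_factory=list)

    def is_done(self, story):
        # A story is done only when ALL criteria hold.
        return all(check(story) for _, check in self.criteria)

    def completion_rate(self, stories):
        # No partial credit: a story counts only when fully done.
        done = sum(1 for s in stories if self.is_done(s))
        return done / len(stories)


# The team's revised definition: developer tests AND functional tests.
done_state = DoneState(criteria=[
    ("automated developer tests pass", lambda s: s.developer_tests_pass),
    ("executable acceptance tests pass", lambda s: s.functional_tests_pass),
])

stories = [
    Story("login", True, True),    # fully done
    Story("search", True, False),  # "80 percent done" counts as zero
]
print(done_state.completion_rate(stories))  # 0.5
```

Tightening the criteria list (as the team did when it added functional tests) makes the measured completion rate a more honest proxy for deployable software.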
You are on a development team performing iterations; this implies that you need specific, measurable goals for the requirements to be met by the end of each iteration.
- Reporting on partial work done is error prone; at worst, we are 90 percent done 90 percent of the time.
- The closer a requirement is delivered to deployable, the less uncertainty your team has about the true state of the system.
- Only functionality that is delivered to the customer has real value.
- The closer a requirement is delivered to a deployable state, the more defects you have found and eliminated.
- Depending on your environment, it may be difficult to get close to deploying.
- Integration in traditional software teams is error prone and difficult.
Your team should try to eliminate as much partially done work as possible in every iteration. A requirement should be built all the way through, including integration, acceptance testing, and as close