we are verifying that the use case description contains correct and proper information. We ask three kinds of questions: Is it complete? Is it correct? Is it consistent? In one project I worked on, more than half the use cases failed syntax testing. Should we proceed with further implementation and testing given that level of quality?
Oh, I almost forgot. And you must keep this a secret. You do not need to know the answers to any of these questions before asking them. It is the process of asking and answering that is most important. Listen to the answers you are given. Are people confident about their answers? Can they explain them rationally? Or do they hem and haw and fidget in their chairs or look out the window or become defensive when you ask? Now for the questions:
- Are all use case definition fields filled in? Do we really know what the words mean?
- Are all of the steps required to implement the use case included?
- Are all of the ways that things could go right identified and handled properly? Have all combinations been considered?
- Are all of the ways that things could go wrong identified and handled properly? Have all combinations been considered?
- Is the use case name the primary actor's goal expressed as an active verb phrase?
- Is the use case described at the appropriate black box/white box level?
- Are the preconditions mandatory? Can they be guaranteed by the system?
- Does the failed end condition protect the interests of all the stakeholders?
- Does the success end condition satisfy the interests of all the stakeholders?
- Does the main success scenario run from the trigger to the delivery of the success end condition?
- Is the sequence of action steps correct?
- Is each step stated in the present tense with an active verb as a goal that moves the process forward?
- Is it clear where and why alternate scenarios depart from the main scenario?
- Are design decisions (GUI, Database, …) omitted from the use case?
- Are the use case "generalization," "include," and "extend" relationships used to their fullest extent but used correctly?
- Can the system actually deliver the specified goals?
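Several of these syntax checks are mechanical enough to automate as a first pass before a human review. The sketch below is illustrative only: the required field names, the dictionary shape of a use case, and the passive-voice heuristic are assumptions, not a standard template.

```python
# Mechanical first pass over part of the syntax-test checklist.
# The field names and use-case dictionary shape are assumptions for this sketch.

REQUIRED_FIELDS = [
    "name", "primary_actor", "trigger", "preconditions",
    "main_success_scenario", "success_end_condition", "failed_end_condition",
]

def syntax_check(use_case: dict) -> list[str]:
    """Return a list of problems; an empty list means the mechanical checks pass."""
    problems = []
    # "Are all use case definition fields filled in?"
    for field in REQUIRED_FIELDS:
        if not use_case.get(field):
            problems.append(f"definition field '{field}' is missing or empty")
    # Crude heuristic for "present tense with an active verb": flag steps
    # that look passive ("... is sent", "... are displayed").
    for i, step in enumerate(use_case.get("main_success_scenario") or [], 1):
        words = step.lower().split()
        if "is" in words or "are" in words:
            problems.append(f"step {i} may be passive voice: {step!r}")
    return problems
```

A tool like this only catches the clerical failures; the judgment questions (are the preconditions really mandatory? does the success end condition satisfy every stakeholder?) still require a person asking them out loud.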
Domain Expert Testing
After checking the syntax of the use cases, we proceed to the second type of testing: domain expert testing. Here we have two options: find a domain expert or attempt to become one. (The second approach is always more difficult than the first, and the first can be very hard.) Again, we ask three kinds of questions: Is it complete? Is it correct? Is it consistent?
- Are all actors identified? Can you identify a specific person who will play the role of each actor?
- Is this everything that needs to be developed?
- Are all external system trigger conditions handled?
- Have all the words that suggest incompleteness ("some," "etc."…) been removed?
- Is this what you really want? Is this all you really want? Is this more than you really want?
- When we build this system according to these use cases, will you be able to determine that we have succeeded?
- Can the system described actually be built?
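One of these checks, removing words that suggest incompleteness, also lends itself to a mechanical first pass. A minimal sketch, where the vague-word list is an assumption to be extended per project:

```python
import re

# Words that often signal an incomplete specification; extend per project.
VAGUE_WORDS = ("etc", "some", "several", "various", "and so on")

def flag_incompleteness(text: str) -> list[str]:
    """Return the vague words found in the text, in VAGUE_WORDS order."""
    lowered = text.lower()
    return [w for w in VAGUE_WORDS if re.search(rf"\b{re.escape(w)}\b", lowered)]
```

Run it over every use case body; each hit is a question to put to the domain expert, not an automatic rejection.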
Finally, after having our domain expert scour the use cases, we proceed to the third type of testing: traceability testing. We want to make certain that we can trace from the functional requirements to the use cases and from the use cases back to the requirements. Again, we turn to our three kinds of questions: Is it complete? Is it correct? Is it consistent?
- Do the use cases form a story that unfolds