Use Cases and Testing

Testing UML Models, Part 1

Syntax Testing
In syntax testing, we verify that the use case description contains correct and proper information. We ask three kinds of questions: Is it complete? Is it correct? Is it consistent? In one project I worked on, more than half the use cases failed syntax testing. Should we proceed with further implementation and testing given that level of quality?

Oh, I almost forgot. And you must keep this a secret. You do not need to know the answers to any of these questions before asking them. It is the process of asking and answering that is most important. Listen to the answers you are given. Are people confident about their answers? Can they explain them rationally? Or do they hem and haw and fidget in their chairs or look out the window or become defensive when you ask? Now for the questions:


Completeness
  1. Are all use case definition fields filled in? Do we really know what the words mean?
  2. Are all of the steps required to implement the use case included?
  3. Are all of the ways that things could go right identified and handled properly? Have all combinations been considered?
  4. Are all of the ways that things could go wrong identified and handled properly? Have all combinations been considered?


Correctness
  1. Is the use case name the primary actor's goal expressed as an active verb phrase?
  2. Is the use case described at the appropriate black box/white box level?
  3. Are the preconditions mandatory? Can they be guaranteed by the system?
  4. Does the failed end condition protect the interests of all the stakeholders?
  5. Does the success end condition satisfy the interests of all the stakeholders?
  6. Does the main success scenario run from the trigger to the delivery of the success end condition?
  7. Is the sequence of action steps correct?
  8. Is each step stated in the present tense with an active verb as a goal that moves the process forward?
  9. Is it clear where and why alternate scenarios depart from the main scenario?
  10. Are design decisions (GUI, Database, …) omitted from the use case?
  11. Are the use case "generalization," "include," and "extend" relationships used to their fullest extent but used correctly?


Consistency
  1. Can the system actually deliver the specified goals?
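Some of these syntax checks are mechanical enough to automate. As a minimal sketch (the UseCase fields and the heuristics are illustrative assumptions, not part of any standard template), a reviewer might flag unfilled fields and an obviously non-verb-phrase name before the human review even begins:

```python
from dataclasses import dataclass, field

# Hypothetical use case record; the field names here are assumptions
# chosen for illustration, not a standard.
@dataclass
class UseCase:
    name: str = ""
    primary_actor: str = ""
    preconditions: list = field(default_factory=list)
    trigger: str = ""
    main_scenario: list = field(default_factory=list)
    success_end_condition: str = ""
    failed_end_condition: str = ""

def syntax_check(uc: UseCase) -> list:
    """Flag completeness problems a syntax review would catch."""
    problems = []
    for field_name, value in vars(uc).items():
        if not value:
            problems.append(f"field '{field_name}' is not filled in")
    # The name should be the primary actor's goal as an active verb
    # phrase; here we apply only a crude heuristic: more than one word.
    if uc.name and len(uc.name.split()) < 2:
        problems.append("name is not a verb phrase")
    return problems

uc = UseCase(name="Withdraw Cash", primary_actor="Customer")
print(syntax_check(uc))
```

A tool like this only catches empty fields; whether the words in a filled field actually mean anything is still a question for the people in the room.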

Domain Expert Testing
After checking the syntax of the use cases, we proceed to the second type of testing: domain expert testing. Here we have two options: find a domain expert or attempt to become one. (The second approach is always more difficult than the first, and the first can be very hard.) Again, we ask three kinds of questions: Is it complete? Is it correct? Is it consistent?


Completeness
  1. Are all actors identified? Can you identify a specific person who will play the role of each actor?
  2. Is this everything that needs to be developed?
  3. Are all external system trigger conditions handled?
  4. Have all the words that suggest incompleteness ("some," "etc."…) been removed?
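The last of these checks can be partly mechanized by scanning step text for hedge words before the domain expert ever sees it. A sketch, assuming the word list below (which you would tailor to your own reviews):

```python
import re

# Words that suggest the author has not finished thinking things through.
# This list is an illustrative assumption; extend it for your own reviews.
INCOMPLETENESS_WORDS = {"some", "etc", "several", "various", "and so on", "tbd"}

def flag_incomplete_steps(steps):
    """Return (step number, word) pairs for steps containing vague words."""
    flagged = []
    for i, step in enumerate(steps, start=1):
        text = step.lower()
        for word in INCOMPLETENESS_WORDS:
            if re.search(r"\b" + re.escape(word) + r"\b", text):
                flagged.append((i, word))
    return flagged

steps = [
    "Customer inserts card",
    "System validates the PIN, account status, etc.",
    "System dispenses some amount of cash",
]
print(flag_incomplete_steps(steps))
```

Here steps 2 and 3 would be flagged for "etc" and "some". The tool only points at suspicious wording; deciding what the step should actually say still requires the domain expert.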


Correctness
  1. Is this what you really want? Is this all you really want? Is this more than you really want?


Consistency
  1. When we build this system according to these use cases, will you be able to determine that we have succeeded?
  2. Can the system described actually be built?

Traceability Testing
Finally, after having our domain expert scour the use cases, we proceed to the third type of testing: traceability testing. We want to make certain that we can trace from the functional requirements to the use cases and from the use cases back to the requirements. Again, we turn to our three kinds of questions: Is it complete? Is it correct? Is it consistent?


Completeness
  1. Do the use cases form a story that unfolds?
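Tracing in both directions can be sketched as a simple cross-reference. In this sketch the requirement IDs, use case names, and mapping structure are all illustrative assumptions; the point is that untraced requirements and unknown references fall out of two set differences:

```python
# Illustrative traceability matrix: requirement IDs and use case
# names are assumptions made up for this example.
requirements = {"R1", "R2", "R3", "R4"}
use_case_traces = {
    "Withdraw Cash": {"R1", "R2"},
    "Check Balance": {"R2"},
}

traced = set().union(*use_case_traces.values())
# Requirements no use case implements: incomplete coverage.
untraced_requirements = requirements - traced
# Use cases citing requirement IDs that don't exist: gold-plating or a typo.
unknown_refs = traced - requirements

print(sorted(untraced_requirements))
print(sorted(unknown_refs))
```

Here R3 and R4 would surface as requirements that no use case delivers, exactly the gap traceability testing is meant to expose.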

User Comments

Fiona Benichou

Hi Lee,

In addition to this article, I recently purchased and enjoyed your book, A Practitioner's Guide to Software Test Design. The techniques are really helpful! Is there a tool, software download, or test management system available that encompasses ALL of the test design techniques discussed in the book?

April 9, 2013 - 1:16pm

About the author

Lee Copeland

Lee Copeland has more than thirty years of experience in the field of software development and testing. He has worked as a programmer, development director, process improvement leader, and consultant. Based on his experience, Lee has developed and taught a number of training courses focusing on software testing and development issues. Lee is the managing technical editor for Better Software magazine, a regular columnist, and the author of A Practitioner's Guide to Software Test Design.
