is done by hierarchical grouping (as in file/directory structures) or by tagging test cases with one or more group tags isn't as important as ensuring that your groupings can change over time. The group definitions will require revision control.
Test Case Relationships
So how does testing relate to the other parts of your CM environment? These relationships need to be tracked by your CM tool suite. Testing generates problem reports. Here's how I would break it down.
- Beta testing relates to customers: problems appear as customer requests, which may be filtered into development problems.
- Sanity testing relates to builds: problems appear as new development problem reports.
- Bug testing relates to problem reports: the problems are already present, and the appropriate test data needs to be tracked against them.
- Black box testing relates to product requirements: problems appear as verification problem reports tied to a specific requirement.
- Stress testing relates to product marketing: results feed into marketing and product management, and may spawn some verification problem reports.
- White box testing relates to product design: problems appear as development problem reports.
- Change testing relates to software updates: problems are resolved before the change is finally checked in and promoted to "ready" status.
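One way to make these relationships concrete in a CM repository is to record them as data. Here's a minimal sketch; the table structure and names are illustrative, not taken from any particular CM tool:

```python
# Illustrative mapping of each test type to the CM artifact it runs
# against and the kind of record its failures should generate.
TEST_RELATIONSHIPS = {
    "beta":      {"relates_to": "customer",            "spawns": "customer request"},
    "sanity":    {"relates_to": "build",               "spawns": "development problem report"},
    "bug":       {"relates_to": "problem report",      "spawns": "tracked test data"},
    "black box": {"relates_to": "product requirement", "spawns": "verification problem report"},
    "stress":    {"relates_to": "product marketing",   "spawns": "verification problem report"},
    "white box": {"relates_to": "product design",      "spawns": "development problem report"},
    "change":    {"relates_to": "software update",     "spawns": "development problem report"},
}

def report_type(test_type: str) -> str:
    """Return the kind of record a failure in this test type should create."""
    return TEST_RELATIONSHIPS[test_type]["spawns"]
```

With even this much structure in place, the tool can route an incoming failure to the right kind of record automatically instead of relying on the tester to pick one.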
As you can see, problem reports will come in many flavours. The CM tool suite must be able to sort out customer data from development data. It must be able to identify the development phase in which the problems originated.
But the CM tool must also be able to relate both the tests and the spawned problem reports to the product development side of the fence. It must be able to address critical questions. Which requirement is the test testing? How far through the integration testing are we? What sort of coverage do our test cases provide? Which test cases are new to this development stream?
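Answering questions like these requires traceability links between test cases and requirements. Here's a minimal sketch of a requirement-coverage query, assuming a simple in-memory model; the identifiers are hypothetical, and a real CM tool would hold these links in its own repository:

```python
# Hypothetical traceability data: which requirement(s) each test case verifies.
test_to_requirements = {
    "TC-001": ["REQ-10"],
    "TC-002": ["REQ-10", "REQ-11"],
    "TC-003": [],            # a test case not yet linked to any requirement
}

all_requirements = ["REQ-10", "REQ-11", "REQ-12"]

def coverage_gaps(test_links, requirements):
    """Return the requirements that no test case is linked to."""
    covered = set()
    for reqs in test_links.values():
        covered.update(reqs)
    return sorted(set(requirements) - covered)
```

Running `coverage_gaps` over this data flags REQ-12 as untested, which is exactly the sort of answer the CM environment should be able to produce on demand.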
The V-Model View
Another way I like to look at these relationships is through a view similar to the one that has come to be known as the V Model.
In the V Model diagram, the Verification activities on the right side of the V address the Development activities on the left side of the V, at the same level. Typically, at least within a build cycle, the activities follow sequentially down the left side of the V and up the right. As such, the horizontal axis is really one of time.
The V Model shows a clear relationship between testing and development artifacts. This same relationship should be clearly visible from your CM environment.
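As a sketch, the conventional left/right pairings of the V can be written down explicitly, and the CM environment should be able to trace each verification activity back to its development-side counterpart. The phase names below are the commonly used ones and vary between organizations:

```python
# Conventional V Model pairings: a development activity on the left side
# of the V maps to the verification activity at the same level on the
# right side. Phase names are illustrative.
V_MODEL_PAIRS = {
    "requirements":        "acceptance testing",
    "system design":       "system testing",
    "architectural design": "integration testing",
    "detailed design":     "unit testing",
}

def verification_for(phase: str) -> str:
    """Return the verification activity paired with a development phase."""
    return V_MODEL_PAIRS[phase]
```

A CM tool that stores this pairing can check, for any test artifact, that it is linked to a development artifact at the matching level.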
From Build to Beta
To generate a build, the system integration and build team creates a build record identifying exactly what is going into the build. The build record is used to extract the required files and generate the deliverables, which are loaded onto one or more testing platforms to verify the basic sanity of the build. Sanity is assessed by running a sanity test suite, a critical first step of which is to "initialize" or "boot" the system.
Sanity testing should be tracked against the build record. The record should include the baseline of sanity test cases used to perform the tests, typically a brief regression test suite, as well as the list of problem reports generated as a result of the sanity testing. As an extended bit of sanity testing, some of the higher-priority "fixed" problems are verified as being fixed. Those not fully fixed should be marked as re-opened, and your CM system should let you identify how often a problem report has been re-opened.
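The build record described above might be sketched as a small data structure. The field names here are illustrative assumptions; a real CM tool would keep these as linked records in its repository:

```python
from dataclasses import dataclass, field

@dataclass
class ProblemReport:
    pr_id: str
    status: str = "open"       # e.g. open, fixed, verified, re-opened
    reopen_count: int = 0      # how often this report has been re-opened

    def reopen(self):
        """A 'fixed' problem that fails verification is re-opened, not re-filed."""
        self.status = "re-opened"
        self.reopen_count += 1

@dataclass
class BuildRecord:
    build_id: str
    file_revisions: list = field(default_factory=list)  # exactly what went into the build
    sanity_baseline: str = ""                           # baseline of sanity test cases used
    spawned_prs: list = field(default_factory=list)     # problem reports raised by sanity testing

# Usage: a "fixed" problem fails re-verification against this build.
pr = ProblemReport("PR-482", status="fixed")
pr.reopen()
build = BuildRecord("B-17", sanity_baseline="sanity-suite-r3", spawned_prs=[pr])
```

Because the re-open count lives on the problem report itself, the frequency-of-re-opening question becomes a simple field lookup rather than a log-mining exercise.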