I've written in a previous article about massive test strategy documents devoid of strategic thinking and chock-full of copied text, either from other projects or from the master test plan (MTP), which was copied from ... well, you get the idea.
Then we have supposedly more-detailed test plans, one for each type of test, often containing detail copied from the test strategy along with—we hope—some useful information about what you actually plan to do and what you need to accomplish it, now that you've had time to learn something about it.
While the test manager is going prematurely gray over all these documents and schedules, the testers are off developing "test cases." Ideally, they're exercising creativity and superior analysis skills to generate test ideas, which they then capture as high-level test cases and approaches. But too often, they're locked into looking only at documented requirements and spending vast amounts of time diving into minute detail, scripting and typing expected results down to the point-and-click level. Have you noticed how much redundant detail test scripts contain? "Test steps," input data, and expected results get copied and pasted from test to test and script to script.
Why is this useful? It might be necessary if you're developing performance test scripts. But for manual testing, no skilled tester needs anything like this level of detail. And if you aren't working with skilled testers, why not?
Like plans, test cases and scripts are frequently outdated by the time you start testing, because the requirements or the GUI have changed.
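To make the copy-paste problem above concrete, here is a minimal sketch of the alternative: factor the shared steps into one function and keep only the varying data as a table. Everything here is invented for illustration (the `transfer` function stands in for whatever system you're testing), not a specific tool or project.

```python
# Hypothetical sketch: instead of pasting the same "test steps" into
# dozens of scripted cases, write the steps once and feed them data.
# Each row below replaces an entire copy-pasted script.
CASES = [
    # (amount, expected_status)
    (10.00, "accepted"),
    (0.00, "rejected"),
    (-5.00, "rejected"),
]

def transfer(amount):
    """Stand-in for the system under test; behavior assumed for the sketch."""
    return "accepted" if amount > 0 else "rejected"

def run_cases(cases):
    """Run every data row through the same shared steps, collecting results."""
    return [(amount, transfer(amount) == expected) for amount, expected in cases]
```

When a step or an expected result changes, you edit it in one place instead of hunting it down across every script that pasted it.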
All these standard, old-fashioned documents are heavyweight, inflexible, repetitive, and difficult to read. They are very expensive to produce, and their value is often questionable. Should you really be spending more of your stakeholders’ money preparing for testing than you spend working with the software and developing your product—the information they need you to produce?
I'm not saying that all test documentation is useless. Rather, I'm saying that the thinking tester or test manager should decide what is essential to the building of her real product in each given circumstance. Rather than manufacturing acres of continuous prose in boilerplate documents that almost no one reads anyway, think about the minimum set of artifacts you need to:
- Capture your significant thoughts and ideas
- Guide (yourself or others) in moving critical work forward
- Inform your stakeholders so they have opportunities to provide feedback
- Demonstrate due diligence for the record (where this is required)
Perhaps you could do all the plans in one thin document—say, a test strategy and high-level plan. Or, why not try using alternative, lighter-weight media like mindmaps and diagrams?
Instead of complaining that the written requirements aren't testable or sufficiently detailed or that they're changing too much, try spending your time thinking about and listing test ideas that will spark other ideas when you eventually get your hands on the software. Stakeholders may believe they need detailed scripts to review before you test, but it's actually easier for them to understand and review test ideas and test conditions. Many testers are now using mindmaps for test ideas and cases.
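A list of test ideas can be as lightweight as a one-page outline that stakeholders can actually read and react to. The sketch below shows the shape of the idea; the feature areas and the ideas themselves are invented examples, not a prescribed format.

```python
# Hypothetical sketch: test ideas grouped by feature area, brief enough
# to review in minutes, detailed enough to spark more ideas at the keyboard.
TEST_IDEAS = {
    "login": [
        "wrong password, locked account, expired password",
        "unicode and very long user names",
    ],
    "transfer": [
        "zero, negative, and boundary amounts",
        "interrupt mid-transfer; retry and check for duplicates",
    ],
}

def as_outline(ideas):
    """Render the ideas as an indented outline a stakeholder can review."""
    lines = []
    for area, items in ideas.items():
        lines.append(area)
        lines.extend(f"  - {item}" for item in items)
    return "\n".join(lines)
```

The same structure exports naturally to a mindmap: areas become branches, ideas become leaves.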
Throwing out the standard big documents in favor of optimal test artifacts won't be easy. Project managers, quality assurance people, and others who don't understand testing are going to resist. So, why not start small? Talk to your stakeholders about what your product really is. Try a reduced documentation approach on one small project first, and then gradually extend it. Your stakeholders might really like the savings and the sharpened focus.
Let's take testing back! Knowing your product will show you the way.