Automated Software Testing

How I fought the Software Testing Automation War and have Determined I'd Rather be a Stone Mason
Member Submitted

This article discusses the roadblocks I have encountered while attempting to acquire consulting engagements for myself and others. They are the result of the management misconception that all you have to do to automate software testing is write test scripts. The immediate effect is that QA and testing functions are being staffed with test script coders at the expense of test designers. It all reminds me of a cartoon from the golden age of computer programming in which the manager says to the programmer, "You start coding and I'll go find out what the system is supposed to do." The software testing corollary would be, "You start testing and I'll go find out what we are supposed to test."

Two thousand and one has been a very depressed year for quality assurance and software testing employment in the St. Louis area, and it has brought an issue to the forefront that I feel needs to be addressed. Under better economic circumstances, major St. Louis corporations such as Anheuser-Busch, Maritz, Monsanto, and Ralston employ a fair number of QA and testing consultants as well as permanent employees. Recent circumstances have affected employment patterns and have caused a problem that was once buried to surface. By trying to get the most "bang for their buck," so to speak, employers have exacerbated the problem.

The problem has two aspects. The first is a lack of understanding of the software testing process and how automation fits into it. The second is a lack of understanding of the test engineer roles as they pertain to the testing process. We have reached the age of specialization, and software testing is no exception. Test engineers can specialize in test planning, test design, and/or test implementation. Each role requires a different set of manual and automated testing skills. Test planners and test designers may work in parallel or may be the same person, and they should possess similar abilities to define and document testing artifacts. Test implementers, on the other hand, are programmers when they are doing automated testing. They write and execute the test scripts and capture the test results.

As for the automation aspects of testing, I have worked for the past ten years implementing test automation for these and other local companies, and I have emphasized that an automated test script is not "intelligent" in and of itself. The intelligence is in the test data. The test data must contain smart tests that are based on the software's requirements definitions and design specifications. "Intelligent" refers to the data's ability to stress known or expected application weaknesses. The intelligence must be built into the test data by the test designers prior to its use with the automated test scripts.

The test data must also include values that the script uses to navigate the application under test. Otherwise, the only way to insert smarts into a test script is to hard-code the data values in the test script proper, or in a function or subroutine that the script calls. This is what happens when the traditional capture/playback automation approach is used. The result is a test script maintenance nightmare.
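The contrast between hard-coded capture/playback scripts and data-driven ones can be sketched as follows. This is a minimal illustration, not any vendor's tool API: the test IDs, screens, and values are all hypothetical, and a simple function stands in for the application under test. The point is that the script itself stays generic while the "smarts" live entirely in the data rows.

```python
import csv
import io

# Hypothetical test data: each row is a "smart" test derived from the
# requirements, including the navigation values the script needs.
# In practice this would come from a file maintained by the test designer.
TEST_DATA = io.StringIO("""test_id,screen,input_value,expected_result
T01,login,valid_user,main_menu
T02,login,empty_user,error_required
T03,login,257_char_user,error_too_long
""")

def run_test(row, navigate):
    """Generic script: navigation and verification are driven by the data."""
    actual = navigate(row["screen"], row["input_value"])
    return row["test_id"], actual == row["expected_result"]

def fake_navigate(screen, value):
    """Stand-in for driving the real application under test."""
    responses = {"valid_user": "main_menu",
                 "empty_user": "error_required",
                 "257_char_user": "error_too_long"}
    return responses.get(value, "unknown")

results = [run_test(row, fake_navigate) for row in csv.DictReader(TEST_DATA)]
print(results)  # [('T01', True), ('T02', True), ('T03', True)]
```

Adding a new test case here means adding a data row, not editing and re-debugging the script, which is exactly the maintenance burden capture/playback creates.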

The test designer must specify the test preconditions, the navigation of the application under test, the aspects to test, and any necessary post-test cleanup. This is the information that the test implementer (script writer) must have to construct an effective automated test script. The test designer uses the requirements definition documents and the design documents to identify potential test conditions (areas of the application that may have a predisposition toward defects) and their expected results. The test designer also uses any other available information that describes the application.
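The handoff from designer to implementer described above can be represented as a structured record. The field names and example values below are illustrative assumptions, not a standard format; the sketch simply shows that preconditions, navigation, the condition under test, the expected result, and cleanup each get an explicit slot.

```python
from dataclasses import dataclass, field

# Hypothetical structure for what the test designer hands to the
# test implementer; the field names are illustrative, not a standard.
@dataclass
class TestDesign:
    test_id: str
    preconditions: list        # state required before the script runs
    navigation: list           # path through the application under test
    condition: str             # the aspect being tested, from the specs
    expected_result: str       # derived from requirements/design documents
    cleanup: list = field(default_factory=list)  # post-test restoration

design = TestDesign(
    test_id="ORD-017",
    preconditions=["customer account exists", "inventory has item SKU-9"],
    navigation=["main menu", "orders", "new order"],
    condition="order quantity exceeds on-hand inventory",
    expected_result="back-order warning displayed; order held",
    cleanup=["delete test order", "restore inventory count"],
)
print(design.test_id, "->", design.expected_result)
```

With a record like this, the implementer can write the script without guessing at setup or teardown, and the designer's intent survives in a reviewable artifact.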

Writing automated test scripts is not necessarily any better unless they execute intelligent test data. I have emphasized that test data design for automated tests must be coupled with manual testing so that the manual tests are also smart. I usually design test data using the requirements and design documents while I have the application running on a test machine and MS Excel open on my desktop machine. When I identify a test requirement, I document the test conditions and environmental prerequisites for that requirement in an Excel workbook sheet. Then I manually execute the test data on the test computer. If it gives the
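The worksheet workflow above can be mimicked in plain text for illustration. This is a stand-in sketch, not the author's actual workbook: the column names, requirement IDs, and conditions are all invented, and CSV substitutes for Excel. It records one row per identified test requirement along with the result of the manual execution.

```python
import csv
import io

# Illustrative columns for the test-condition worksheet; one row per
# test requirement identified from the requirements/design documents.
COLUMNS = ["req_id", "test_condition", "prerequisites",
           "expected_result", "manual_result"]

rows = [
    {"req_id": "R-4.2",
     "test_condition": "discount applied to orders over $100",
     "prerequisites": "price catalog loaded; test customer logged in",
     "expected_result": "10% discount shown on order total",
     "manual_result": "pass"},
    {"req_id": "R-4.3",
     "test_condition": "discount not applied at exactly $100",
     "prerequisites": "price catalog loaded; test customer logged in",
     "expected_result": "no discount line on order total",
     "manual_result": "fail"},
]

# Persist the worksheet (CSV standing in for the Excel sheet).
sheet = io.StringIO()
writer = csv.DictWriter(sheet, fieldnames=COLUMNS)
writer.writeheader()
writer.writerows(rows)

# Conditions that failed manual execution are natural first candidates
# for automated regression scripts.
failures = [r["req_id"] for r in rows if r["manual_result"] == "fail"]
print(failures)  # ['R-4.3']
```

Because each condition carries its prerequisites and expected result, the same rows can later feed a data-driven automated script without redesigning the tests.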

