Automated Software Testing

How I fought the Software Testing Automation War and have Determined I'd Rather be a Stone Mason
Member Submitted

expected results, I document it in my spreadsheet and move on to the next test condition/requirement. In this manner, I identify the test data, and document the pre-test environmental variables and the post-test cleanup needed to ensure that each test is independent. The test data are then exported to a format that an automated test script can use.
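The export-and-execute idea can be sketched in a few lines. The following is a minimal illustration, not the author's actual tooling: it assumes the spreadsheet has been exported to CSV with one independent test condition per row, and uses a stand-in `function_under_test` in place of a real application.

```python
import csv
import io

# Hypothetical exported test data: each row is one independent test
# condition, carrying its input value and expected result.
EXPORTED_CSV = """test_id,input_value,expected_result
TC-001,5,25
TC-002,-3,9
TC-003,0,0
"""

def function_under_test(x: int) -> int:
    # Stand-in for the application behavior being verified.
    return x * x

def run_data_driven_tests(csv_text: str):
    """Execute one test per row of the exported spreadsheet data."""
    results = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        actual = function_under_test(int(row["input_value"]))
        passed = actual == int(row["expected_result"])
        results.append((row["test_id"], passed))
    return results

print(run_data_driven_tests(EXPORTED_CSV))
```

The point of the pattern is that adding a test condition means adding a spreadsheet row, not writing a new script.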

Those of you who frequent user groups such as the SQA users group and the DDE users group know that I am an advocate of the “data-driven” framework. My associate, Bruce Posey, has developed his own flavor of this paradigm, which he terms “Control Synchronized Data Driven Testing.” Carle Nagle of the SAS Institute has a different perspective: he has developed a “Data Driven Engine” using Rational Software’s SQABasic language and MS Excel.

The primary difference between the two approaches is that control synchronized data driven testing stores navigation rules and special test procedures as subroutines that the main test script uses directly, whereas in the DDE model these tasks are executed by a special set of test script libraries that constitute the engine. In a sense, the DDE is test automation middleware. The second difference is that the DDE is driven by a set of test tables developed as Excel spreadsheets, which contain special commands telling the engine where to go and what to test. In the control synchronized approach, this information is attached to the actual test data, which can be stored as CSV files or data pools.
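The table-driven engine idea can be sketched as a tiny dispatcher. This is an illustrative assumption of the pattern only; the command names and table layout here are invented for the example and are not the actual DDE or SQABasic syntax.

```python
# Hypothetical test table in the spirit of a data-driven engine: the
# first column of each row is a command telling the engine what to do,
# and the remaining columns are that command's arguments.
TEST_TABLE = [
    ["Navigate", "LoginPage"],
    ["Enter", "username", "jdoe"],
    ["Verify", "title", "Welcome"],
]

class TinyEngine:
    """Middleware-style engine: dispatches each table row to a handler."""

    def __init__(self):
        self.log = []

    def Navigate(self, page):
        self.log.append(f"goto {page}")

    def Enter(self, field, value):
        self.log.append(f"set {field}={value}")

    def Verify(self, field, expected):
        self.log.append(f"check {field}=={expected}")

    def run(self, table):
        # Look up each command name as a method and invoke it.
        for command, *args in table:
            getattr(self, command)(*args)
        return self.log

print(TinyEngine().run(TEST_TABLE))
```

The contrast with the control synchronized approach is where this logic lives: here it sits in a reusable engine consumed by every test table, rather than in subroutines called directly by each main test script.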

The problem with either approach is that both depend on the test data, and on extraneous information about those data, to execute the actual test(s). If the data are not focused, the tests conducted are ineffective. The only way to focus automated testing is to develop test data from whatever software development process artifacts are available to the Test Planner/Designer, or to pick the brains of the system analysts and designers.

Well, I have deviated quite a bit from the point of this article, and for that I apologize, but a little background was necessary. The only current jobs in St. Louis IT organizations, other than one or two calls for manual testers, are those that involve writing automated test scripts. Among the problems this situation causes is that it forces test engineers to play all of the roles in the testing process: Test Planning Engineer, Test Design Engineer, and Test Implementation Engineer. This was true to some extent even before the current economic downturn; however, the employment opportunities for test planners and test designers have all but evaporated. Even the positions that advertise for a broad background in test automation really only want people to code test scripts. Now, coding test scripts is an honorable profession, but most individuals I have encountered who enjoy implementing tests by writing and executing test scripts do not enjoy planning and designing the tests. Consequently, they do not put forth a good effort when asked to do so.

My personal work preference is test design, not test implementation. When I consult within local IT organizations, the service I provide is test requirements identification and documentation, and test data design and construction. Since the beginning of the summer, test designer consulting positions have virtually dried up. All of the pre-employment interviews I have attended lately have focused only on test script writing, with an emphasis on how quickly you (the potential consultant/employee) can develop a suite of automated test scripts that will provide automated regression testing.

