problem or is malfunctioning. To produce fully maintainable, properly recorded automated test scripts, the technical tester or core test automator should have access to a well-constructed, fully documented test script that serves as the basis for the recording.
5. Automated testing speeds up the testing effort
Many test managers and project directors mistakenly believe that creating automated scripts speeds up the corresponding testing activities. In reality, creating automated scripts is a daunting, time-consuming task, particularly when it is done for the first time on an entirely new release of a software system. It requires valid, documented test cycles, test scenarios and test scripts that can be translated into automated scripts. Planning the test case and test script with valid data is itself an input to the automated script (the output), since the automated script is a by-product of the testing cycle. Identifying the requirements and scenarios to be tested, and creating the testing cycles and automated scripts, are resource-intensive and time-consuming activities. They pay off in subsequent releases of the software, where the previously created automated scripts can be re-executed with minor modifications or "as is".
The initial hurdle is to create the first batch of automated scripts and then leverage them for later software versions or releases with minor modifications or none at all. Realistically, a testing team that successfully automates 40-50% of the test cases suitable for automation on its first attempt has indeed met a major testing challenge.
6. Maintaining a previously recorded test script is always easy
This assumption is wrong! Maintaining some automated test scripts can prove to be a logistical nightmare. A core automator may have to modify an automated test script created by another core automator who has since left the company. If information about what the script is supposed to do is missing, or if the previous core automator did not follow best practices in developing it, the new core automator may struggle to modify the previously recorded script or, worse yet, may have to create a new test script from scratch. Another obstacle that makes test script maintenance difficult is that some parameterized scripts require multiple data sets, which can create multiple permutations for executing the automated test script, since each data set in the parameterized test script might require its own error handling or if/else programming logic.
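To make the permutation problem concrete, here is a minimal sketch of a data-driven (parameterized) test script in Python. All names and data sets are hypothetical illustrations, not taken from the article: one recorded flow is driven by several data sets, and each data set exercises a different branch, which is exactly what multiplies the maintenance effort.

```python
# Hypothetical data sets for a login flow: (username, password, expected_result).
LOGIN_DATA_SETS = [
    ("valid_user", "correct_pass", "success"),
    ("valid_user", "wrong_pass",   "auth_error"),
    ("",           "any_pass",     "validation_error"),
]

def login(username, password):
    """Stand-in for the application under test (illustration only)."""
    if not username:
        return "validation_error"   # missing-input branch
    if password != "correct_pass":
        return "auth_error"         # failed-authentication branch
    return "success"

def run_login_tests(data_sets):
    """Drive the same recorded steps once per data set."""
    results = []
    for username, password, expected in data_sets:
        actual = login(username, password)
        # Each data set needs its own expected result (and, in a real
        # script, potentially its own error handling), so every new
        # data set adds a permutation the maintainer must keep correct.
        results.append(actual == expected)
    return results
```

A maintainer who inherits such a script must understand every branch before changing any one data set, which is why undocumented parameterized scripts are so costly to hand over.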
The test manager should ascertain that best practices are followed in creating the automated test script along with appropriate documentation for executing the script and comments embedded within the test script. Furthermore, to the extent possible automated test scripts should be modularized.
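As a sketch of what modularization means in practice, the hypothetical Python example below (all function names are illustrative assumptions) factors shared steps such as opening the application and logging in into reusable functions, so a change to the login flow is made in one place rather than in every recorded script.

```python
def open_application():
    """Shared setup module (placeholder for tool-specific actions)."""
    return {"state": "open"}

def log_in(session, user):
    """Shared, reusable login module."""
    session["user"] = user
    return session

def create_order(session, item):
    """Script-specific step that builds on the shared modules."""
    session.setdefault("orders", []).append(item)
    return session

def test_create_order():
    # Each test script composes the shared modules instead of
    # re-recording the setup and login steps from scratch.
    session = log_in(open_application(), "tester01")
    session = create_order(session, "widget")
    return session["orders"]
```

With this structure, embedded comments and a short document per module are usually enough for a new core automator to take over maintenance.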
7. Functional testers record scripts and the technical testers revise the poorly recorded scripts
A final misconception I would like to bring to everyone’s attention is that some test managers have functional testers record test scripts even though they know the functional testers have little or no knowledge of the automated testing tool. The rationale is that skilled technical testers (core automators) can then work off the previously recorded scripts, revising the poorly recorded scripts as a means of speeding up the test automation activities. This practice is highly inefficient, and I would advise against it.