created would be bound by these standards.
Change control for all developed code was implemented; all code was stored in Microsoft Visual Source Safe. All pertinent documentation was also maintained under change control.
A library of stand-alone functions began to grow. The ability to use and re-use these methods or functions was supported by the fact that every member of the team adhered to the standards: the scripts and master test plans might differ by application and functionality, but the manner in which they were created was identical.
The sizing standards and tool were refined as experience showed that the use of the stand-alone functions, combined with a common development method, reduced the time required to create the automated product.
Tools to produce plan files and element methods were created and put into use, further increasing the team members' efficiency.
Work in progress and completed work were tracked using an Excel spreadsheet. By the close of 2001 there were approximately 8,300 automated test cases in the inventory.
Although significant progress had been made and measured, there was a strong desire to improve what was being done. None of what the team accomplished came cheaply; automation is expensive. Several analyses did, however, quantify the payoff for one of the Flagship products. With 0 percent of its 2,610 test cases automated, a certification effort took 9 days with 11 resources. With 12 percent of 3,274 test cases automated, certification required 6 days with 6 resources. By the point at which 55 percent of 3,216 test cases were automated, the certification effort required only 4 days with 6 resources.
The cost of testing during 2001 was significantly reduced. But automation was still an expensive line in the budget.
Investigation into several articles about data-driven automation revealed that, where applicable, this method could not only significantly reduce the size of the scripts created during automation but also yield more robust scripts requiring less maintenance, thereby reducing the cost of rework.
Data Driver.inc was created. This was the tool used to process .CSV files in a generic manner, dispatching the instructions to function/method calls. Analysis determined that approximately 79 percent of the test plans presented for automation would lend themselves to this method of automation. Templates were created for use in writing the test plans and in the automation process itself, and a document defining how to create a test plan for data-driven automation was provided to the test staff.
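The source does not show the internals of Data Driver.inc, but the pattern it describes — reading .CSV rows generically and dispatching each row to a function or method call — can be sketched as follows. This is a minimal illustration in Python, not the original tool; the keyword names (`login`, `enter_order`) and the dispatch-table layout are assumptions for the example.

```python
import csv
import io

# Hypothetical keyword functions standing in for the library of
# stand-alone functions; a real suite would drive the application
# under test here.
def login(user, password):
    return f"logged in as {user}"

def enter_order(item, qty):
    return f"ordered {qty} x {item}"

# Keyword-to-function dispatch table: the "generic manner" in which
# each CSV row is routed to a function/method call.
KEYWORDS = {
    "login": login,
    "enter_order": enter_order,
}

def run_data_driven(csv_text):
    """Treat column 1 of each row as the keyword and the remaining
    columns as arguments, and dispatch accordingly."""
    results = []
    for row in csv.reader(io.StringIO(csv_text)):
        if not row or row[0].startswith("#"):
            continue  # skip blank rows and comment lines
        keyword, *args = row
        results.append(KEYWORDS[keyword](*args))
    return results

# A test analyst edits only the .CSV data; the script never changes.
plan = """login,tester1,secret
enter_order,widget,10
"""
print(run_data_driven(plan))
```

Because the test steps live entirely in the data file, adding, deleting, or reordering test data requires no change to the script — the property the text credits with lowering maintenance costs.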
A sample data driven test plan was created and demonstrated to the management team for their consideration. In addition to the demonstration of this method, a cost analysis comparing the existing cost of producing automation using functional decomposition to the projected cost of producing data driven automation was provided to the management team. The cost projections indicated at least a 40 percent reduction in the cost of automation and a 50 percent reduction in maintenance costs.
The next marketing effort was to present this method to the test staff for consideration. They would be impacted: their test designs would change and their planning would be different. There would also be benefits specific to the test analyst. Because the test data resided in the .CSV files, analysts controlled how much of a test would be run and which data would be used, and they could now add or delete data as desired without the assistance of the automation team.
Several projects were selected for betas, one of which ran the same test with ten thousand lines of data. The results were convincing and