Lessons Learned in Performance, Stress, and Volume Testing

[article]
Member Submitted

the application without telling the test engineer what the application's breaking points are. The test manager should understand the definitions and consequences of each testing effort.

Not disabling processes not affecting the outcome of the test
I was on a project where a business process was recorded and played back, and every playback kicked off a print job that wasted hundreds of pages. The printing itself did not generate any traffic within the application under test. It is important to identify processes that do not contribute to the system's load and to disable them if they are launched from the performance test. If a playback wastes printed paper, triggers emails, or fires off similar side effects that do not affect the outcome of the performance test, those processes should be disabled, as in the sketch below.
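One simple way to do this in a scripted load test is to gate the side-effect steps behind a single configuration flag. The sketch below is a minimal Python illustration; the function names (submit_order, trigger_print_job) are hypothetical placeholders and are not tied to any particular testing tool's API.

```python
# Minimal sketch: gate side effects behind a flag so playback does not
# trigger real print jobs, emails, etc. All names here are illustrative.

DISABLE_SIDE_EFFECTS = True  # set once for the whole performance run


def submit_order(order_id: int) -> None:
    """Business step that generates real load against the application under test."""
    print(f"submitting order {order_id} to the application under test")


def trigger_print_job(order_id: int) -> None:
    """Side effect that adds no load to the application under test."""
    print(f"sending order {order_id} to the office printer")


def play_back_business_process(order_id: int) -> None:
    submit_order(order_id)              # always part of the measured load
    if not DISABLE_SIDE_EFFECTS:
        trigger_print_job(order_id)     # skipped during performance runs


if __name__ == "__main__":
    for order_id in range(1, 4):
        play_back_business_process(order_id)
```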

Not extrapolating results
If stress, performance, or volume tests are conducted in an environment that does not come close to emulating the actual production environment, then the results need to be extrapolated from one environment to the other. Many projects fail to do so and assume that if the application under test responds adequately in one environment, the same holds true for the other without extrapolating the results. Multiple vendors offer tools to help companies extrapolate results, and if the test environment is a smaller-scale version of the production environment, these third-party tools should be employed to extrapolate the test results.
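To make the idea of extrapolation concrete, the sketch below shows a deliberately naive linear scaling of measured throughput by a hardware ratio. This is purely illustrative and is not how commercial capacity-planning tools work; real extrapolation models account for queueing effects, non-linear scaling, and other bottlenecks, and the core counts and efficiency factor here are assumptions.

```python
# Minimal sketch: naive linear extrapolation from a scaled-down test
# environment to production. Real tools use far more sophisticated models;
# the numbers and the linear assumption are illustrative only.

def extrapolate_throughput(measured_tps: float,
                           test_cpu_cores: int,
                           prod_cpu_cores: int,
                           scaling_efficiency: float = 0.8) -> float:
    """Estimate production throughput from test-environment throughput.

    scaling_efficiency discounts the ideal linear speed-up, since doubling
    hardware rarely doubles real-world throughput.
    """
    hardware_ratio = prod_cpu_cores / test_cpu_cores
    return measured_tps * hardware_ratio * scaling_efficiency


if __name__ == "__main__":
    # Hypothetical figures: the 4-core test environment sustained 120
    # transactions per second; production has 16 cores.
    estimate = extrapolate_throughput(measured_tps=120,
                                      test_cpu_cores=4,
                                      prod_cpu_cores=16)
    print(f"Estimated production throughput: {estimate:.0f} TPS")
```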

Lessons Learned
The lessons above are some of those I have learned over many years of hands-on experience with automated testing tools, in particular tools that emulate traffic within an application. Whether you test your application's performance and response times by hand or with an automated testing tool, it is important to foster a culture within your organization that follows best practices and is consistent and repeatable. Testers initially tend to resist rigorous, proven testing processes that deviate from their own experience, but it is the role of the test manager to promote proven practices and not repeat costly lessons.

Many companies depend on their software as their primary source of revenue and cannot afford an application so slow or inflexible that customers will not use it again. For example, a web retailer selling books or CDs cannot afford to have customers experience excessive delays when buying a book, or be unable to log on to buy a CD because the application fails under its maximum load of concurrent end users. Test managers have the onus of understanding the application's traffic and thoroughly testing its performance before deploying the application to a production environment.

The test manager should document lessons learned from his or her own projects and from other projects and store them in a commercial tool or an in-house repository. The test manager should also ensure that testers, whether company employees or contractors, understand the lessons learned and take corrective action to follow best testing practices at all times.

About the author

Jose Fajardo

Jose Fajardo (PMP, M.S., and SAP certified) has worked as a test manager for various companies utilizing automated testing tools. He has written and published numerous articles on testing SAP and authored the book Testing SAP R/3: A Manager's Step-by-Step Guide. Throughout his career Jose has helped to create testing standards and test plans, mentor junior programmers, audit testing results, implement automated testing strategies, and manage test teams. Jose can be contacted at josefajardo@hotmail.com.
