Lessons Learned in Performance, Stress, and Volume Testing

[article]
Member Submitted

functional knowledge of the application under test should work with the subject matter experts and testers with functional knowledge of the application to identify enough sets of unique data records to repeat the performance/stress/volume/load test with distinct values. For smaller applications, the DBA may restore and refresh the database after each test run to cleanse it of values entered during a previous run; this way, when the test is repeated with the same set of records, the records will not already be cached or buffered.
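As a rough illustration only (the file name, field layout, and record counts below are hypothetical), a short Python script can pre-generate a fresh set of unique data records before each run, so that repeated test runs never reuse values that may already be cached or buffered:

import csv
import uuid
from datetime import datetime

# Hypothetical sketch: pre-generate one unique customer record per emulated
# user per iteration, written to a new file for every test run.
NUM_EMULATED_USERS = 1200
ITERATIONS_PER_USER = 10

run_stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
file_name = f"test_data_{run_stamp}.csv"

with open(file_name, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["user_id", "iteration", "customer_number"])
    for user in range(1, NUM_EMULATED_USERS + 1):
        for iteration in range(1, ITERATIONS_PER_USER + 1):
            # uuid4 yields a value that was not entered during any previous run
            writer.writerow([f"user{user:03d}", iteration, uuid.uuid4().hex])

print(f"Wrote {NUM_EMULATED_USERS * ITERATIONS_PER_USER} unique records to {file_name}")

Most automated testing tools can then parameterize the recorded business process with such a file so each emulated end user consumes its own unique records.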

Failure to notify system users and disable non-test users
I witnessed a project where, in the middle of a performance/load/volume/stress test execution, users not associated with the test logged on to the environment where the application was under test, thereby skewing the test results. Furthermore, these unexpected users called the helpdesk and the system administrator several times to complain about the performance of the system, wasting other groups' time as well.

This goes without saying, but notify all users in advance, via email or system messages, of the date, time, and environment in which a performance/load/stress/volume test will take place. As a second precaution, disable all log-on ids in the test environment that are not associated with the test, and only permit emulated end users to log on with the previously created dummy user ids (e.g., user001, user002, etc.), along with the users who support the test for tasks such as performance monitoring.
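As a sketch only (the account list, naming pattern, and disable_account call below are hypothetical placeholders for whatever user-administration interface the system actually provides), the lock-down step amounts to disabling every log-on id that is neither a dummy emulated-user id nor on the approved support list:

import re

# Hypothetical sketch: lock out every account that is not part of the test.
DUMMY_USER_PATTERN = re.compile(r"^user\d{3}$")   # user001, user002, ...
SUPPORT_ACCOUNTS = {"perfmonitor", "dba_admin"}   # staff monitoring the test

def disable_account(account: str) -> None:
    # Placeholder: call the real system's user-administration API here.
    print(f"Disabling log-on id: {account}")

def lock_down_environment(all_accounts: list[str]) -> None:
    for account in all_accounts:
        if DUMMY_USER_PATTERN.match(account):
            continue  # emulated end users stay enabled
        if account in SUPPORT_ACCOUNTS:
            continue  # testers monitoring the run stay enabled
        disable_account(account)

lock_down_environment(["user001", "user002", "jsmith", "perfmonitor", "mlopez"])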

Throughput emulation: same number of emulated end users as actual end users
I was present on a project where the application's middleware engineer wanted the system to be tested with a load of emulated users that was unrepresentative of the total number of expected concurrent end users in the production environment. The actual expected number of concurrent end users in production was 1,200, whereas the middleware engineer wanted the automation tool to log on only 120 emulated end users, based on a calculated ratio of 10 to 1, where one emulated end user from the automated testing tool works at the rate of 10 actual human end users.

Emulated end users from an automated testing tool work at a constant rate, can perform multiple iterations without stopping (unlike a human being), and can generate as much traffic and execute as many business processes per hour as the expected number of concurrent end users in production. However, by using a ratio of emulated end users to actual end users, the test engineer may not be able to determine whether the application under test can withstand the actual log-on of the maximum number of expected production end users. In the example above, the test engineer may demonstrate that the application under test can support 120 emulated end users logged on simultaneously, but how can the test engineer say with any confidence that it can withstand 1,200 end users logged on concurrently? Another reason to emulate the actual expected number of production end users is that the test engineer and DBA can verify that the application's database has the number of database connections correctly set up.
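As a minimal sketch (the log_on function and user-id format below are hypothetical stand-ins for the log-on transaction recorded in the testing tool), the point is to open the full 1,200 concurrent sessions rather than 120 faster ones, so that log-on capacity and the database connection pool are exercised at true production concurrency:

from concurrent.futures import ThreadPoolExecutor

EXPECTED_CONCURRENT_USERS = 1200

def log_on(user_number: int) -> bool:
    # Placeholder for the real log-on transaction recorded in the test tool:
    # send credentials, wait for the session to be established, report success.
    user_id = f"user{user_number:04d}"
    return True

# Open every session concurrently instead of relying on a 10-to-1 ratio.
with ThreadPoolExecutor(max_workers=EXPECTED_CONCURRENT_USERS) as pool:
    results = list(pool.map(log_on, range(1, EXPECTED_CONCURRENT_USERS + 1)))

print(f"Successful log-ons: {sum(results)} of {EXPECTED_CONCURRENT_USERS}")

A load-testing tool typically performs the equivalent step when it ramps up the full complement of virtual users before measuring steady-state response times.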

To avoid having an emulated end user that works at a rate much faster than that of an actual human end user, the test engineer can insert think times (delays) between the steps of the recorded business process, so that each emulated end user approximates the pace of a real user while the full number of concurrent users remains logged on.
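The pacing idea can be sketched as follows (the step names and timing values are illustrative only; real testing tools provide their own think-time settings):

import random
import time

MIN_THINK_SECONDS = 5
MAX_THINK_SECONDS = 15

def think() -> None:
    # Randomized pause between steps, emulating a human reading the screen.
    time.sleep(random.uniform(MIN_THINK_SECONDS, MAX_THINK_SECONDS))

def run_business_process(user_id: str) -> None:
    # Placeholder steps for a recorded business process.
    print(f"{user_id}: entering order header")
    think()
    print(f"{user_id}: entering line items")
    think()
    print(f"{user_id}: saving order")

run_business_process("user001")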

About the author

Jose Fajardo

Jose Fajardo (PMP, M.S., and SAP certified) has worked as a test manager for various companies utilizing automated testing tools. He has written and published numerous articles on testing SAP and authored the book Testing SAP R/3: A Manager's Step by Step Guide. Throughout his career, Jose has helped to create testing standards and test plans, mentor junior programmers, audit testing results, implement automated testing strategies, and manage test teams. Jose can be contacted at josefajardo@hotmail.com.
