Understanding Software Performance Testing, Part 4

Each script should be run a minimum of three times to ensure consistency in the results.

Once the bugs are out of the scripts, we need to run the remaining tests. What we run depends on the types of tests we selected earlier and on the overall goals and objectives of the performance test. The number of test runs and the combinations of test types will depend on the tool in use and its capabilities.

Prior to running large, complex tests, it is important to check whether the application is truly ready for extensive performance testing. If you have a simple application with only a few users and processes, a performance test may be a waste of effort. For a typical application involving dozens, hundreds, or even thousands of users, processes, and activities, the testing becomes more complicated and time consuming, and plenty of things can easily go wrong. Therefore, before investing a large amount of time and effort in testing the performance of an application, it often pays to run a smoke test to determine whether more extensive testing is warranted and whether the application is ready for it.

Smoke tests can range from a few manual (single-shot) tests timed with a stopwatch to a fully automated suite that executes a performance (and functional) regression test of the entire application. These smoke tests can then serve as an entry criterion for performance testing by the testing team.
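At the stopwatch end of that range, a smoke test might look like the following rough sketch, which times each transaction once and flags anything slow. The URLs, transaction names, and two-second threshold are placeholder assumptions, not recommendations; substitute your own transactions and entry criteria.

    # Minimal single-shot smoke test: time each transaction once and flag
    # anything that exceeds a rough threshold. The URLs and the two-second
    # limit are placeholder assumptions.
    import time
    import urllib.request

    TRANSACTIONS = {
        "login page": "https://app.example.com/login",          # hypothetical URLs
        "search": "https://app.example.com/search?q=test",
        "daily report": "https://app.example.com/reports/daily",
    }
    THRESHOLD_SECONDS = 2.0  # assumed entry criterion

    for name, url in TRANSACTIONS.items():
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            response.read()
        elapsed = time.perf_counter() - start
        status = "OK" if elapsed <= THRESHOLD_SECONDS else "SLOW"
        print(f"{name}: {elapsed:.2f}s [{status}]")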

Use caution with a single-user test, as the initial results may not be optimal. I use a simple process for smoke testing (sketched in code after this list):

  • Identify all transactions to be part of the performance test.
    • Create three scripts for each transaction
      • One using low-end field values
      • One using high-end field values
      • One using middle field values
  • Run a test using one user with all transactions.
  • Run a test with two to three users with all transactions.
    • This will help verify system stability as well as transaction readiness.
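One way that process might look in code, assuming each transaction can be driven by a simple HTTP request with one key field value, is sketched below. The transactions, field ranges, and URLs are hypothetical, and in practice the scripts generated by your load-generation tool would take the place of run_transaction.

    # Sketch of the smoke-test process above: three value variants per
    # transaction (low-end, middle, high-end), run first with one user and
    # then with a few concurrent users. All names and URLs are placeholders.
    import concurrent.futures
    import time
    import urllib.request

    # Hypothetical transactions with the valid range of a key input field.
    TRANSACTIONS = {
        "create_order": ("https://app.example.com/orders?qty={}", 1, 999),
        "search_catalog": ("https://app.example.com/search?limit={}", 1, 100),
    }

    def build_variants(url_template, low, high):
        """Return the low-end, middle, and high-end variants of one transaction."""
        middle = (low + high) // 2
        return [url_template.format(v) for v in (low, middle, high)]

    def run_transaction(url):
        """Execute one request and return its response time in seconds."""
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=30) as response:
            response.read()
        return time.perf_counter() - start

    def run_pass(user_count):
        """Run every variant of every transaction with user_count concurrent users."""
        urls = [u for tmpl, lo, hi in TRANSACTIONS.values()
                for u in build_variants(tmpl, lo, hi)]
        with concurrent.futures.ThreadPoolExecutor(max_workers=user_count) as pool:
            times = list(pool.map(run_transaction, urls))
        print(f"{user_count} user(s): worst response {max(times):.2f}s")

    run_pass(1)   # single-user pass: transaction readiness
    run_pass(3)   # two-to-three users: basic stability check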

This is just one of many possible entry criteria for performance testing. The key is to be sure the system is ready for extensive, heavy testing before you expend the resources. If the system is not stable and useful, then the results from the test may be worthless or, worse, misleading.

The high-end and low-end value tests are based on the concepts of equivalence partitioning and boundary value analysis. By exercising transactions at their extremes, we can be sure that the general functionality of the application is intact and will not affect the performance characteristics.
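For example, if an order-quantity field is documented as valid from 1 to 999 (a hypothetical range), the three smoke-test values fall out directly:

    # Hypothetical example: deriving the three smoke-test values for a
    # quantity field documented as valid from 1 to 999.
    LOW, HIGH = 1, 999

    smoke_test_values = {
        "low_end": LOW,               # boundary of the valid partition
        "middle": (LOW + HIGH) // 2,  # representative interior value
        "high_end": HIGH,             # opposite boundary
    }
    print(smoke_test_values)  # {'low_end': 1, 'middle': 500, 'high_end': 999}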

If the development team is using an incremental or iterative approach to application development, a smoke test can be incrementally built and applied to each increment.

Test execution will be controlled by the capabilities of the load-generation tool selected for the test. Each type of test selected earlier can be run as an individual test:

  • An average load test
  • A ramp-up test
  • A stress test
  • Etc.

For some tests, we may want to run a series of tests to see how the system reacts. For a usage test at average load, a slow, incremental approach is often best (a sketch follows this list):

  • Start with one-third average expected load.
  • Ramp the tool up to the next third.
  • Finally, reach average.
  • Allow the test to run for the specified measurement period.
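A rough sketch of that incremental ramp follows, using plain Python threads as stand-in virtual users. The target URL, the average user count, the hold times, and the one-second think time are all assumptions; a real load-generation tool would manage the ramp, think times, and measurements for you.

    # Sketch of the incremental ramp: one-third, two-thirds, then full
    # average load, holding each step before moving on. The URL, the
    # average user count, and the hold times are placeholder assumptions.
    import threading
    import time
    import urllib.request

    TARGET_URL = "https://app.example.com/home"  # hypothetical endpoint
    AVERAGE_USERS = 30         # assumed average expected load
    STEP_HOLD_SECONDS = 300    # hold each step for five minutes (assumption)
    MEASUREMENT_SECONDS = 900  # measurement period at full load (assumption)

    stop = threading.Event()

    def virtual_user():
        """Loop requests against the target until the test is stopped."""
        while not stop.is_set():
            try:
                with urllib.request.urlopen(TARGET_URL, timeout=30) as response:
                    response.read()
            except OSError:
                pass  # a real tool would record errors and response times
            time.sleep(1)  # simple think time between requests

    def add_users(count):
        for _ in range(count):
            threading.Thread(target=virtual_user, daemon=True).start()

    third = AVERAGE_USERS // 3
    add_users(third)                      # start with one-third of average load
    time.sleep(STEP_HOLD_SECONDS)
    add_users(third)                      # ramp up to the next third
    time.sleep(STEP_HOLD_SECONDS)
    add_users(AVERAGE_USERS - 2 * third)  # finally, reach average load
    time.sleep(MEASUREMENT_SECONDS)       # run for the measurement period
    stop.set()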

Some bottlenecks occur at low volume, and too fast an increase can hide the cause. If you can cause performance problems with a limited number of features and a small number of users (virtual users), you don't need the entire system or application in place. This is another reason to use an overlapping

About the author

Dale Perry

<span class="Text">With more than thirty years of experience in information technology, <strong>Dale Perry</strong> has been a programmer/analyst, database administrator, project manager, development manager, tester, and test manager. Dale's project experience includes large-systems development and conversions, distributed systems, and online applications, both client/server and Web based. He has been a professional instructor for more than fifteen years and has presented at numerous industry conferences on development and testing. With Software Quality Engineering for eleven years, Dale has specialized in training and consulting on testing, inspections and reviews, and other testing and quality-related topics.</span>
