Conference Presentations

Ten Indispensable Tips for Performance Testing

Whether you are inexperienced with performance testing or an experienced performance tester who is continuously researching ways to optimize your process and deliverables, this session is for you. Based on his experience with dozens of performance testing projects, Gary Coil discusses the ten indispensable tips that he believes will help ensure the success of any performance test. Find out ways to elicit and uncover the underlying performance requirements for the software under test. Learn the importance of a production-like test environment, and methods to create suitable environments without spending a fortune. Take back valuable tips on how to create representative workload-mix profiles that accurately simulate the expected production load. And more! Gary has developed and honed these practical and indispensable tips through many years of leading performance testing engagements.
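
By way of illustration (a sketch of the general technique, not material from Gary's session), a workload-mix profile is often expressed as a weighted table of transaction types that each virtual user samples from; the transaction names and weights below are invented for the example:

    import random

    # Hypothetical workload mix: each transaction type and its share of
    # total production traffic (weights should sum to 1.0).
    WORKLOAD_MIX = {
        "browse_catalog": 0.55,
        "search":         0.25,
        "add_to_cart":    0.15,
        "checkout":       0.05,
    }

    def next_transaction(rng=random):
        """Pick the next transaction for a virtual user, weighted by the mix."""
        return rng.choices(
            population=list(WORKLOAD_MIX),
            weights=list(WORKLOAD_MIX.values()),
        )[0]

    # Over many iterations the simulated load converges on the target mix.
    sample = [next_transaction() for _ in range(10_000)]
    for name in WORKLOAD_MIX:
        print(name, round(sample.count(name) / len(sample), 3))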

Gary Coil, IBM Global Services
Preparing for the Madness: Load Testing the College Bracket Challenge

For the past two seasons, the Windows Live development team has run the Live.com College Bracket Challenge, which hosts brackets for scores of customers during the "March Madness" NCAA basketball tournament. March Madness is the busiest time of the year for most sports Web sites. So, how do you build your Web application and test it for scalability to potentially millions of customers? Ed Glas guides you through the process their team uses to model users, establish performance goals for their application, define test data, and construct realistic operational scenarios. Learn how the tests were conducted, the specific database performance and locking problems encountered, and how these problems were isolated and fixed. Finally, Ed demonstrates the custom reporting solution the team developed to report results to stakeholders.

  • How to establish performance goals and requirements
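
One common way to turn a user model into a throughput goal, offered here as a generic illustration rather than the team's actual method, is the interactive form of Little's Law; all figures below are invented:

    # Little's Law for a closed system of interactive users:
    #   throughput X = N / (R + think_time)
    # where N is concurrent users and R is the response-time goal.
    # All numbers below are invented for illustration.

    peak_concurrent_users = 200_000   # modeled peak during the tournament
    avg_response_goal_s   = 2.0       # target page response time
    avg_think_time_s      = 30.0      # time a user spends reading a bracket

    required_throughput = peak_concurrent_users / (avg_response_goal_s + avg_think_time_s)
    print(f"Target throughput: {required_throughput:,.0f} requests/second")
    # -> Target throughput: 6,250 requests/second
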
Ed Glas, Microsoft
Challenges in Performance Testing of AJAX Applications

The AJAX model for Web applications has been rapidly gaining in popularity because of its ability to bring the richness and responsiveness of desktop applications to the Web. Because one of the key drivers for the rapid adoption of AJAX is its promise of superior performance, it is surprising that there has been very little discussion of AJAX-specific performance testing. In fact, AJAX has a significant impact on aspects of the performance testing lifecycle including definition of goals, user modeling, and test scripting. Rajendra Gokhale discusses issues to consider: AJAX engine simulation and optimization, cross-client performance of AJAX applications, and design choices related to test scripting. Using Google's "Google Suggest" service as a case study, Rajendra examines the unique challenges of carrying out performance testing of AJAX-based applications and offers suggestions for overcoming them.
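
To make the scripting challenge concrete, here is a minimal sketch (an illustration, not the speaker's code) of what a protocol-level virtual user must reproduce for a suggest-style AJAX feature: one background request per keystroke, fired against a hypothetical /suggest endpoint on a placeholder host.

    import time
    import urllib.parse
    import urllib.request

    # Placeholder endpoint; a browser's AJAX engine would call something
    # like this from an onkeyup handler, once per keystroke.
    SUGGEST_URL = "http://example.com/suggest?q={query}"

    def type_query(phrase, keystroke_delay_s=0.2):
        """Replay the request stream a user typing `phrase` would generate."""
        for i in range(1, len(phrase) + 1):
            prefix = phrase[:i]
            url = SUGGEST_URL.format(query=urllib.parse.quote(prefix))
            started = time.perf_counter()
            with urllib.request.urlopen(url, timeout=5) as resp:
                resp.read()
            elapsed = time.perf_counter() - started
            print(f"{prefix!r}: {elapsed * 1000:.0f} ms")
            time.sleep(keystroke_delay_s)  # models typing speed, not server time

    type_query("performance testing")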

Rajendra Gokhale, Aztecsoft
Performance Testing Web Applications with OpenSTA

OpenSTA is a solid open-source testing tool that, when used effectively, fulfills the basic needs of performance testing of Web applications. Dan Downing introduces you to the basics of OpenSTA including downloading and installing the tool, using the Script Modeler to record and customize performance test scripts, defining load scenarios, running tests using Commander, capturing the results using Collector, interpreting the results, and exporting captured performance data into Excel for analysis and reporting. As with many open source tools, self-training is the rule. Support is provided not by a big vendor staff but by fellow practitioners via email. Learn how to find critical documentation that is often hidden in FAQs and discussion forum threads. If you are up to the support challenge, OpenSTA is an excellent alternative to other tools.

  • The capabilities and limitations of OpenSTA
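
Because OpenSTA leaves analysis largely to you, a typical follow-on step is post-processing the exported timer data before it goes to Excel. The sketch below assumes a hypothetical CSV export with "timer,elapsed_ms" columns; check your Collector output for the actual layout.

    import csv
    import statistics

    # Assumed export layout: one row per timer sample, e.g.
    #   timer,elapsed_ms
    #   T_LOGIN,482
    #   T_SEARCH,1290
    samples = {}
    with open("opensta_results.csv", newline="") as f:
        for row in csv.DictReader(f):
            samples.setdefault(row["timer"], []).append(float(row["elapsed_ms"]))

    for timer, values in sorted(samples.items()):
        values.sort()
        p90 = values[int(0.9 * (len(values) - 1))]
        print(f"{timer}: n={len(values)} "
              f"median={statistics.median(values):.0f}ms p90={p90:.0f}ms")
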
Dan Downing, Mentora Inc
Risk-Based Testing in Practice

The testing community has been talking about risk-based testing for quite a while, and most projects now apply some sort of implicit risk-based testing approach. However, risk-based testing should be more than just brainstorming within the test team; it should be based on business drivers and business value. The test team is not the risk owner; the product's stakeholders are. It is our job to inform the stakeholders about risk-based decisions and provide visibility into product risk status. Erik van Veenendaal discusses a real-world method for applying structured risk-based testing that is applicable in most software projects. He describes how risk identification and analysis can be carried out in close cooperation with stakeholders. Join Erik to learn how the outcome of the risk analysis can, and should, be used in test projects in terms of differentiated test approaches.
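
A common way to structure this kind of analysis (a generic sketch, not Erik's specific method) is to score each product risk on stakeholder-assessed likelihood and business impact, then let the resulting exposure drive how deeply each area is tested; the risk items and scores below are invented:

    # Generic risk-exposure scoring: likelihood and impact on a 1-5 scale,
    # supplied by the risk owners (the stakeholders), not the test team.
    # Items and scores are invented for illustration.
    risks = [
        ("Payment processing fails", 4, 5),
        ("Report layout wrong",      3, 2),
        ("Login unavailable",        2, 5),
        ("Slow search results",      4, 3),
    ]

    def test_depth(exposure):
        if exposure >= 15:
            return "thorough (full technique mix, early start)"
        if exposure >= 8:
            return "focused (main scenarios plus negative tests)"
        return "light (smoke-level checks)"

    for name, likelihood, impact in sorted(risks, key=lambda r: -(r[1] * r[2])):
        exposure = likelihood * impact
        print(f"{exposure:2d}  {name}: {test_depth(exposure)}")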

Erik van Veenendaal, Improve Quality Services BV
A Balanced Scorecard Approach for Assessing Test Value and Success

Internal test metrics (test progress, defect density, and TPI/TMM process-improvement measures) do not reveal the complete picture of test value and success. By comparing common test metrics with those found in the Balanced Business Scorecard (financial, customer, internal, and learning/innovation metrics), we see the need to also report financial and customer measures. Some of these measures are quantitative (such as profits), and others are more qualitative (for example, customer satisfaction). Learn to measure the financial impact of testing through productivity metrics and measures of how testing affects the total cost of quality. Include in your reporting qualitative assessments such as the customers' perception of the usefulness of testing, the visibility of testing on projects, acceptability measures, and estimation accuracy.

  • Set measures for all viewpoints of testing's value and success
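
For the financial viewpoint, one widely used framing is cost of quality. The sketch below, with entirely invented figures, shows how testing's financial impact might be expressed as a shift from failure costs to prevention and appraisal costs:

    # Cost-of-quality breakdown, in thousands (all figures invented).
    # Prevention + appraisal are costs of achieving quality;
    # internal/external failure are costs of poor quality.
    before = {"prevention": 50, "appraisal": 120,
              "internal_failure": 300, "external_failure": 500}
    after  = {"prevention": 90, "appraisal": 200,
              "internal_failure": 180, "external_failure": 120}

    def total(coq):
        return sum(coq.values())

    print(f"Total cost of quality before: {total(before)}k, after: {total(after)}k")
    print(f"Net saving attributable to the testing investment: {total(before) - total(after)}k")
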
Isabel Evans, Testing Solutions Group Ltd
Progressive Performance Testing: Adapting to Changing Conditions

An inflexible approach to performance testing is a prelude to disaster. "What you see at the start isn't always what you get in the end," says Jeff Jewell. Based on his experience performance testing applications on numerous consulting projects, Jeff demonstrates the challenges you may face testing your applications and how to overcome these obstacles. Examples from these projects show how changing project conditions, together with what the team discovered in early tests, caused the testing approach to change dramatically. Find out how hardware configuration, hardware performance, script variations, bandwidth, monitoring, and randomness can all affect the measurement of performance.
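
One of the factors Jeff names, randomness, is easy to underestimate. The simulation below (an illustration with invented numbers, not from the session) shows how fixed versus randomized think times change the peak arrival rate the server actually sees:

    import random

    # Simulate arrival times for 50 virtual users iterating through a script.
    # With a fixed think time, users fall into lockstep and hit the server in
    # synchronized waves; randomized think time spreads the arrivals out.
    def arrivals(think_time_fn, users=50, iterations=20):
        times = []
        for _ in range(users):
            t = 0.0
            for _ in range(iterations):
                t += think_time_fn()
                times.append(t)
        return sorted(times)

    fixed  = arrivals(lambda: 5.0)
    varied = arrivals(lambda: random.uniform(2.5, 7.5))

    # Busiest one-second window under each policy.
    def peak_per_second(times):
        return max(sum(1 for x in times if s <= x < s + 1)
                   for s in range(int(times[-1])))

    print("Peak arrivals/sec, fixed think time:     ", peak_per_second(fixed))
    print("Peak arrivals/sec, randomized think time:", peak_per_second(varied))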

Jeff Jewell, ProtoTest LLC
Test Metrics in a CMMI Level 5 Organization

As a CMMI® Level 5 company, Motorola Global Software Group is heavily involved in software verification and validation activities. Shalini Aiyaroo, senior software engineer at Motorola, shows how tracking specific testing metrics can serve as key indicators of the health of testing and how these metrics can be used to improve your testing practices. Find out how to track and measure phase screening effectiveness, fault density, and test execution productivity. Shalini describes the use of Software Reliability Engineering (SRE) and fault prediction models to measure test effectiveness and take corrective actions. By performing orthogonal defect classification (ODC) and escaped defect analysis, the group has found ways to improve test coverage.

CMMI® is a registered trademark of Carnegie Mellon University.

  • A structured approach to outsourced testing
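
To make two of the metrics concrete, here is an illustration with invented numbers, using common textbook definitions that may differ from Motorola's internal ones, of phase screening effectiveness and defect density:

    # Phase screening effectiveness: of all defects present in a phase,
    # what fraction did that phase's verification activities catch?
    # Numbers and phase names are invented for illustration.
    phases = {
        #              (defects found in phase, defects escaping to later phases)
        "design":      (40, 10),
        "coding":      (120, 30),
        "system_test": (25, 5),
    }

    ksloc = 85  # release size in thousands of lines of source code

    total_found = sum(found for found, _ in phases.values())
    print(f"Overall defect density: {total_found / ksloc:.2f} defects/KSLOC")

    for phase, (found, escaped) in phases.items():
        effectiveness = found / (found + escaped)
        print(f"{phase}: screening effectiveness = {effectiveness:.0%}")
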
Shalini Aiyaroo, Motorola Malaysia Sdn. Bhd
Apprenticeships: A Forgotten Concept in Testing

The system of apprenticeship was first developed in the late Middle Ages. The uneducated and inexperienced were employed by a master craftsman in exchange for formal training in a particular craft. So why does apprenticeship seldom happen within software testing? Do we subconsciously believe that just about anyone can test software? Join Lloyd Roden and discover what apprenticeship training is and, even more importantly, what it is not. Learn how this practice can be easily adapted to suit software testing. Find out about the advantages and disadvantages of several apprenticeship models: Chief Tester, Hierarchical, Buddy, and Coterie. With personal experiences to share, Lloyd shows how projects will benefit immediately from the rebirth of the apprenticeship system in your test team.

  • Four apprenticeship models that can apply to software testers
  • Measures of the benefits and return on investment of apprenticeships
Lloyd Roden, Grove Consultants
Using Production Failures to Jump-Start Performance Test Plans

Learning from a production system failure is not a model MassMutual Financial Group would have chosen. However, when one of their key applications failed under load in production, they turned on a dime and changed their performance testing approach, focus, and capabilities. Let's set the scene: they ran large numbers of transactions through a performance test tool and then went live with a new application that was to be used by all their key users. Within hours, the application had ground to a virtual halt under normal production load. What went wrong? Join Sandra Bourgeois to find out not only what went wrong but also what they learned from the failure and how they set about improving their knowledge, skills, and tools. This is your chance to learn from their mistakes and avoid repeating them in your organization.

  • Lessons learned from the performance failure of a mission-critical application
Sandra Bourgeois, Massachusetts Mutual Life Insurance Company
