Conference Presentations

Testing in Internet Time: A Case Study

Testing before eBusiness was tough, and now it is even more difficult! This presentation gives an overview of three typical eBusiness development lifecycles that exist today (measured in hours, weeks, and months) and offers a testing lifecycle for each. Learn about one software company's successful implementation of its Internet testing lifecycle and the numerically quantified benefits derived from it.

Eamonn McGuinness, aimware
Software Test Automation: Planning and Infrastructure for Success

Automation tools are often viewed as a cure-all that will instantly reduce test cost and effort. However, without up-front planning and infrastructure design, automated tests can quickly become difficult to create and maintain, and the tools nothing more than expensive shelfware. This paper describes how to initiate a successful automation effort by developing standards and processes for automation and an infrastructure designed for success.

Bill Boehmer and Bea Patterson, Siemens Building Technologies, Inc.
STAREAST 2001: Measuring the Value of Testing

How can we make testing more visible and appreciated? Without measurement, we only have opinions. This presentation outlines simple and practical ways to measure the effectiveness and efficiency of testing, particularly the Defect Detection Percentage (DDP) metric. Learn how this measure can be implemented in your organization to keep track of defects found in testing (and afterwards). Explore the choices, problems, and benefits of using this measure, as well as other useful measures.
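
For readers new to the metric: Defect Detection Percentage is conventionally the share of all known defects that testing found, counting both defects caught in testing and those reported afterwards. A minimal sketch of that calculation, with purely illustrative numbers that are not taken from the presentation:

    def defect_detection_percentage(found_in_testing, found_after_release):
        """Share of all known defects that testing caught, as a percentage."""
        total_known = found_in_testing + found_after_release
        if total_known == 0:
            return None  # no defects known at all, so DDP is undefined
        return 100.0 * found_in_testing / total_known

    # Illustrative numbers only: 90 defects found in testing, 10 found afterwards.
    print(defect_detection_percentage(90, 10))  # -> 90.0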

Dorothy Graham, Grove Consultants
Performance Testing 101

Organizations are often so eager to "jump in" and use load testing tools that the critical steps necessary to ensure successful performance testing are sometimes overlooked, leading to testing delays and wasted effort. Learn best practices and tips for successful automated performance testing in areas such as assembling a proper test team, planning, simulating a production environment, creating scripts, and executing load tests.
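
As a rough illustration of the "creating scripts" and "executing load tests" steps (not the tooling covered in the presentation), the sketch below uses only the Python standard library to simulate a handful of concurrent users against a hypothetical test URL and report response times:

    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    TEST_URL = "http://localhost:8000/"   # hypothetical system under test
    VIRTUAL_USERS = 10                    # concurrent simulated users
    REQUESTS_PER_USER = 5

    def timed_request(url):
        """Issue one HTTP GET and return its response time in seconds."""
        start = time.time()
        with urlopen(url) as response:
            response.read()
        return time.time() - start

    def user_session(url):
        """One virtual user issuing a fixed number of sequential requests."""
        return [timed_request(url) for _ in range(REQUESTS_PER_USER)]

    if __name__ == "__main__":
        with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
            sessions = list(pool.map(user_session, [TEST_URL] * VIRTUAL_USERS))
        timings = [t for session in sessions for t in session]
        print(f"{len(timings)} requests, "
              f"avg {sum(timings) / len(timings):.3f}s, "
              f"max {max(timings):.3f}s")

A real performance test would be driven by a dedicated load testing tool, but the structure (virtual users, scripted requests, timing measurements) is the same.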

David Torrisi, Mercury Interactive
Configuration Management for Testers Working in Web Development Environments

Configuration management has long been a staple activity for large, traditional software engineering projects but has been markedly absent from most Web development projects. This presentation gives a brief overview of configuration management from a tester's perspective. Learn about the costs, drawbacks, and benefits of configuration management. Discover quick and simple ways your testing staff can add configuration management to your Web development environment.

Andrea MacIntosh, QA Labs Inc.
Automated Test Results Processing

This paper introduces techniques used to automate the results analysis process. It examines the analysis of crash dump files and log files to extract consistent failure summaries and details, showing how these are used in problem reporting. It then studies the practical application of Automated Test Results Processing at MangoSoft Incorporated and presents data showing the impact this has had on product testing.
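
As a loose sketch of the idea only (the paper's actual tooling is not reproduced here), a results processor might scan test log files for known failure signatures and roll them up into a summary suitable for a problem report. The file layout and failure patterns below are hypothetical:

    import re
    from collections import Counter
    from pathlib import Path

    # Hypothetical failure signatures; a real processor would key off the
    # product's own crash dump and log formats.
    FAILURE_PATTERNS = {
        "assertion": re.compile(r"ASSERTION FAILED: (?P<detail>.+)"),
        "exception": re.compile(r"Unhandled exception: (?P<detail>.+)"),
        "timeout":   re.compile(r"timed out after (?P<detail>\d+)s"),
    }

    def summarize_logs(log_dir):
        """Count failures by category and keep one example detail for each."""
        counts, examples = Counter(), {}
        for log_file in Path(log_dir).glob("*.log"):
            for line in log_file.read_text(errors="ignore").splitlines():
                for category, pattern in FAILURE_PATTERNS.items():
                    match = pattern.search(line)
                    if match:
                        counts[category] += 1
                        examples.setdefault(category, match.group("detail"))
        return counts, examples

    if __name__ == "__main__":
        counts, examples = summarize_logs("test_results")
        for category, count in counts.most_common():
            print(f"{category}: {count} occurrence(s), e.g. {examples[category]}")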

Edward Smith, MangoSoft Corporation
Using Commonly Captured Data to Improve Testing Processes

For a variety of reasons, many test organizations routinely collect data on defects found during testing, on tests that were run, on estimated and actual time spent testing, on code coverage, and on customer-reported problems, among other things. Some of these data only become collectable after formal processes are put in place, while others can be obtained with minimal effort. This paper describes a case study of collecting and using the latter type of data. Four databases are used to track defect data, log test cases, and log customer calls. These data are used to guide efforts at improving the testing process, the test materials, and the databases themselves. For many testing organizations, these data are already available; if not, they are easy to collect.
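
A toy illustration of the kind of cross-referencing the paper describes, assuming the defect and customer-call records have been exported to simple CSV files with a shared component field (the actual databases at Minitab are not shown):

    import csv
    from collections import Counter

    def count_by_component(csv_path, component_field="component"):
        """Tally records (defects, customer calls, ...) by product component."""
        with open(csv_path, newline="") as f:
            return Counter(row[component_field] for row in csv.DictReader(f))

    # Hypothetical exports from the defect-tracking and customer-call databases.
    defects_in_test = count_by_component("defects.csv")
    customer_calls = count_by_component("customer_calls.csv")

    # Components that generate many customer calls but few defects in testing
    # are candidates for added test coverage.
    for component, calls in customer_calls.most_common():
        print(f"{component}: {defects_in_test.get(component, 0)} defects in test, "
              f"{calls} customer calls")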

Dean Lapp, Minitab Inc.
Targeted Software Fault Insertion

Since the completely random software fault insertion techniques suggested in much of the research literature are not practical for most software products, this paper suggests that a modest, targeted software fault insertion effort for a few common error conditions can have a dramatic impact on defect detection rates and quality. The paper uses the example of a software fault insertion subsystem, codenamed Faulty Towers, which was added to MangoSoft Incorporated's test automation in order to target common failures and errors.
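
As an illustrative sketch only (the Faulty Towers subsystem itself is not shown), targeted fault insertion often amounts to instrumenting a small set of operations so that a chosen error condition, such as a full disk, can be triggered deliberately and deterministically during a test run. All names below are hypothetical:

    import os

    ACTIVE_FAULTS = set()  # fault switches a test harness can flip

    def insert_fault(name):
        ACTIVE_FAULTS.add(name)

    def clear_faults():
        ACTIVE_FAULTS.clear()

    def write_record(path, data):
        """Product code path instrumented with a targeted fault point."""
        if "disk_full" in ACTIVE_FAULTS:
            # Simulate the common error condition instead of waiting for it
            # to occur by chance on real hardware.
            raise OSError(28, os.strerror(28), path)  # errno 28: no space left
        with open(path, "a") as f:
            f.write(data)

    # A test can now drive the error-handling path deterministically.
    insert_fault("disk_full")
    try:
        write_record("journal.log", "entry\n")
    except OSError as error:
        print("caught expected fault:", error)
    finally:
        clear_faults()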

Paul Houlihan, MangoSoft Corporation
A Senior Manager's Perspective on Software QA and Testing

Quality assurance (QA) and testing are critical to the success of any software company. However, the senior management team doesn't always understand this and needs to be educated about the world of software QA and testing. Learn how to raise the profile of QA within your organization and communicate effectively with senior management by understanding their perspective. Explore various strategies for educating and communicating with the management team.

Paul Lupinacci, Changepoint Corporation
Critical Skills and Effective Attitudes for Testers

What distinguishes good testers? This presentation explains some of their key characteristics:

  • the right attitudes
  • the appropriate skills
  • continuous skills growth

Rex Black, Rex Black Consulting Services, Inc.
