STARWEST 2000 - Software Testing Conference


Dan Downing, Mentora, Inc.

Using examples and results taken from several successful stress testing projects, Dan Downing illustrates an eight-stage methodology for planning, designing, executing, and evaluating the results of an automated load test. Learn the key activities, tools, resources, costs, deliverables, techniques, and challenges for each stage of this approach. Explore load testing concepts and terminology.

Rick Smith, IBM

Software testing is becoming more demanding all the time: products involve more components; automation tools are used more often; and testing is required on multiple operating systems, versions, and languages. In his presentation, Rick Smith addresses this problem and presents a solution that is automated, flexible, efficient, and repeatable. Learn how to improve--and simplify--software testing in your organization.

Mary Sweeney, Data Dimensions, Inc.

Companies setting up test automation projects quickly discover that the major automation tools on the market today cannot always accomplish everything a complete test automation project requires. While some companies can afford to purchase multiple tools, others rely on popular programming languages. Explore how testers can use these common programming languages and techniques to build test scripts and utilities to enhance and support their automation projects.
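As a minimal sketch of the kind of home-grown utility this session describes, the snippet below builds a field-level record comparator in plain Python. The function name and sample records are hypothetical, not taken from the presentation; the point is that a few lines of a general-purpose language can cover a gap a commercial capture/replay tool leaves open.

```python
def diff_records(expected, actual):
    """Return field-level differences between two result dictionaries.

    A small hand-rolled test utility of the sort an off-the-shelf
    automation tool rarely provides out of the box.
    """
    diffs = {}
    # Walk the union of keys so missing fields on either side are caught.
    for key in expected.keys() | actual.keys():
        if expected.get(key) != actual.get(key):
            diffs[key] = (expected.get(key), actual.get(key))
    return diffs

# Hypothetical example: compare a saved baseline against a test run.
baseline = {"status": "OK", "count": 42, "user": "alice"}
observed = {"status": "OK", "count": 41, "user": "alice"}
print(diff_records(baseline, observed))  # {'count': (42, 41)}
```

Utilities like this are typically called from the automation tool's scripting layer or from a standalone results-verification step.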

Hung Nguyen, LogiGear Corporation

This presentation focuses on the characteristics of Web application errors to derive key issues to consider in analyzing and reproducing errors. Learn how to isolate application errors from configuration and technical support issues. Explore effective techniques to make errors reproducible. Examples of common and uncommon Web application error types are provided.

Efi Goldfarb, TopTier

How do you implement effective testing in the rapidly changing world of a start-up company developing a Web application? This presentation explores the issues and dynamics of testing a moving target within impossible deadlines, including methods and practices for building quality and structure despite the constraints. Learn how to evolve the development process and establish effective communication between the development and testing groups.

Cem Kaner, Florida Institute of Technology

Regression test automation is just one example of automated testing, and it is probably not the best one. This double-track presentation considers the problems inherent in regression automation and outlines alternatives that involve automated generation, execution, and evaluation of large numbers of tests. Explore oracle-based, high-volume comparison tests, stochastic tests, and configuration tests.
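One way to picture the oracle-based, high-volume testing the abstract mentions is a harness that generates thousands of random inputs and checks each result against a trusted reference. The sketch below is an illustrative assumption, not Kaner's own example: the system under test is a toy insertion sort, and Python's built-in `sorted` serves as the oracle.

```python
import random

def sut_sort(xs):
    """Hypothetical system under test: a simple insertion sort."""
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] < x:
            i += 1
        out.insert(i, x)
    return out

def run_stochastic_suite(trials=1000, seed=2000):
    """Generate many random inputs and verify each against an oracle
    (here, Python's built-in sorted); return the failing inputs."""
    rng = random.Random(seed)  # fixed seed keeps failures reproducible
    failures = []
    for _ in range(trials):
        data = [rng.randint(-100, 100) for _ in range(rng.randint(0, 50))]
        if sut_sort(data) != sorted(data):
            failures.append(data)
    return failures

print(len(run_stochastic_suite()))  # prints 0: this SUT passes the oracle
```

Seeding the generator matters: any input that exposes a bug can be replayed exactly, which is what makes high-volume tests debuggable rather than merely noisy.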

Bill Pearce, Corbel

Based on a case study, Bill Pearce presents a data-driven approach to developing automated tests for a transaction-based application. Explore the significant advantages to this approach, including reduced start-up costs of automating tests. Learn how this model will shield business-oriented testers from tool technicalities while allowing testers to create and run their own automated test scenarios and view their own test results.
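A minimal sketch of a data-driven harness along these lines appears below. The scenario table, the 1% fee rule, and the function names are all hypothetical stand-ins for the case study's transaction application; the shape is what matters: testers edit rows of data, and a small driver turns each row into an executed check.

```python
import csv
import io

# Hypothetical scenario table a business-oriented tester could maintain
# (in practice, a spreadsheet or CSV file): one transaction per row.
SCENARIOS = """amount,fee_expected
100.00,1.00
2500.00,25.00
0.00,0.00
"""

def compute_fee(amount):
    """Hypothetical transaction logic under test: a flat 1% fee."""
    return round(amount * 0.01, 2)

def run_data_driven(table):
    """Drive the test from the data: one pass/fail result per row."""
    results = []
    for row in csv.DictReader(io.StringIO(table)):
        amount = float(row["amount"])
        expected = float(row["fee_expected"])
        results.append((row["amount"], compute_fee(amount) == expected))
    return results

print(run_data_driven(SCENARIOS))
```

Because new cases are new rows rather than new scripts, the tool-specific code is written once, which is the source of the reduced start-up cost the abstract claims.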

Clive Bates, OCS Consulting

This presentation gives you a logical process for selecting and implementing a software test automation tool. Gain a better understanding of the benefits and limitations of test tools while learning about the misconceptions and how to avoid the pitfalls. Using real-life experiences with a London bank, a cable TV company, and a bookmaking operation, Clive Bates clarifies the myths about tool usage and implementation. Learn how to select the best test automation tool for your organization.

Kanwarpreet Singh Grewal, Cadence Design Systems

Software today is getting more and more complex. This complexity brings fresh challenges in testing the software user interface and its underlying functionality. This presentation takes a look at where "testability" fits into the overall software development lifecycle. Learn how adding testability features can improve the test coverage and automation level--resulting in a better quality product release. Explore the relationship between testability and usability in software development.

Mary Decker, Aldebaron Financial Solutions

Reporting a problem isn't enough. The more information you can provide the developer, the sooner the problem can be identified and fixed. Learn what developers need in a bug report and what distinguishes a good report from a bad one. Explore the classification and severity of bugs and the importance of retesting before reporting.
