STARWEST 1999 - Software Testing Conference


David Petrella, Keane, Inc.

David Petrella describes a real-life situation where Test was part of the Development team and included in all phases of the development lifecycle, resulting in a highly successful software project. Here, he discusses the strategy, issues, difficulties, and successes of working as an integrated team to develop software. Learn how challenging the established process can pay off in your organization.

Linda Malchow, Shell Services International

Many companies assume that Enterprise Resource Planning (ERP) systems need less testing because they are, after all, purchased software. Unfortunately, these systems are highly configurable, and changes in one module can have drastic effects in another. Based on her experiences, Linda Malchow describes the test strategy and results of a $115 million SAP project and explores the lessons learned.

John Romanak, Telcordia Technologies, Inc.

Few organizations have achieved a Level 5 rating on the SEI's Capability Maturity Model (CMM). John Romanak describes Telcordia's experience in implementing a CMM Level 5 process with proven benefits and its impact on testing. Learn about the major testing challenges and enormous paradigm shifts that took place at Telcordia (formerly Bellcore) in their journey to CMM Level 5.

Kathy Iberle, Hewlett-Packard

Too often, a neophyte test planner is overwhelmed by the complexity and scope of test planning. Kathy Iberle presents several practical methods to divide the testing into manageable pieces. Based on her experiences as a test planner for a wide range of software applications, she offers guidance on which methods work best in what circumstances.

Rex Black, Rex Black Consulting Services, Inc.

Testing finds the most critical, customer-impacting bugs when performed under conditions that simulate the deployed environment. Based on his experiences as a test manager, Rex Black discusses planning and managing the logistics of complex, multi-site test environments. Learn how to use a simple database to plan, track, and communicate the hardware, software, human, and configuration implications of executing test suites.

Dan Guss, Ameritech

Drawing on his extensive testing experience at Ameritech, Dan Guss discusses considerations for building, managing, and coordinating the development of new or existing test environments and labs for large systems with an emphasis on the entire business solution. Examine the vision, components, approach and methodology, and business processes that are critical to a successful test environment.

Jeff Gainer, Merant, Inc.

In addition to classifying and quantifying defects, a key element of test metrics reporting is a program to monitor and track the effort to develop automated tests. Learn how to define and capture metrics that will provide an objective report on the quality of software under test, as well as the status of the automation project itself. Learn to avoid the estimating games such as "The Price is Right" and "Double Dummy Spit."

Cem Kaner, J.D., Ph.D.

In this practitioner-oriented talk, Cem Kaner uses observations and anecdotes from his experiences to broaden and diversify your approach to the design and development of the class of tests called "black box." Listen as he explores the conceptual differences among significantly different approaches to testing at the functional and system level without using knowledge of the program code. Examine examples that will guide your thinking about testing within a particular paradigm.

Mike Lee, CPT Consulting

Based on his testing experiences, Mike Lee illustrates through practical examples the steps taken to dramatically raise the probability of a successful test improvement implementation. Learn how major benefits were achieved in a complex environment within a very limited timeframe through changes in test strategy, administrative support, agreement on ownership, and the establishment of basic metrics. Then, you too can answer senior management's questions.

Bret Pettichord, Tivoli Systems

Test automation raises our hopes, yet it often frustrates and disappoints us. Although automation promises to deliver us from a tough situation, automating implemented tests can create as many problems as it solves. The key is to follow the rules of software development when automating testing. Bret Pettichord presents seven key steps: improve the testing process, define requirements, prove the concept, champion product testability, design for sustainability, plan for deployment, and face the challenges of success.
