Too often, a neophyte test planner is overwhelmed by the complexity and scope of test planning. Kathy Iberle presents several practical methods to divide the testing into manageable pieces. Based on her experiences as a test planner for a wide range of software applications, she offers guidance on which methods work best in what circumstances.
STARWEST 1999 - Software Testing Conference
Testing finds the most critical, customer-impacting bugs when performed under conditions that simulate the deployed environment. Based on his experiences as a test manager, Rex Black discusses planning and managing the logistics of complex, multi-site test environments. Learn how to use a simple database to plan, track, and communicate the hardware, software, human, and configuration implications of executing test suites.
Drawing on his extensive testing experience at Ameritech, Dan Guss discusses considerations for building, managing, and coordinating the development of new or existing test environments and labs for large systems with an emphasis on the entire business solution. Examine the vision, components, approach and methodology, and business processes that are critical to a successful test environment.
In addition to classifying and quantifying defects, a key element of test metrics reporting is a program to monitor and track the effort to develop automated tests. Learn how to define and capture metrics that provide an objective report on the quality of the software under test, as well as on the status of the automation project itself. Learn to avoid estimating games such as "The Price is Right" and "Double Dummy Spit."
In this practitioner-oriented talk, Cem Kaner uses observations and anecdotes from his experiences to broaden and diversify your approach to the design and development of the class of tests called "Black Box." Listen as he explores the conceptual differences among significantly different approaches to testing at the functional and system level without using knowledge of the program code. Examine examples that will guide your thinking about testing within a particular paradigm.
Based on his testing experiences, Mike Lee illustrates through practical examples the steps taken to dramatically raise the probability of a successful test improvement implementation. Learn how major benefits were achieved in a complex environment with a very limited timeframe through changes in test strategy, administration support, agreement on ownership, and the establishment of basic metrics. Then you, too, can answer senior management's question.
Test automation raises our hopes yet often frustrates and disappoints us. Although automation promises to deliver us from a tough situation, automating implemented tests can create as many problems as it solves. The key is to follow the rules of software development when automating testing. Bret Pettichord presents seven key steps: improve the testing process, define requirements, prove the concept, champion product testability, design for sustainability, plan for deployment, and face the challenges of success.
Software inspection is a well-known industry method for improving the quality of the software we produce. Examine the problems Intel Corporation faced in implementing this process and how they overcame those issues to achieve very good results, ultimately attaining closure on 96% of their inspections.
Whether you are testing an Internet, intranet, or extranet application, testing for the Web can be more challenging than testing non-Web applications. In addition to the areas normally covered in non-Web applications, you usually face new challenges in the areas of compatibility and security. Discover the common areas where Web errors occur and learn ways to test for those errors. Learn new ideas and techniques to apply in your own Web projects.