STARWEST 2008 - Software Testing Conference

PRESENTATIONS

STARWEST 2008: Performance Engineering: More Than Just Load Testing

Performance testing that is done once or a few times as part of the system test is not the right approach for many systems that must change and grow for years. Rex Black discusses a different approach--performance engineering--that is far more than performing load testing during the system test. Performance engineering takes a broad look at the environment, platforms, and development processes and how they affect a system's ability to perform at different load levels on different hardware and networks.

Rex Black, QA Software Consultant/Trainer

STARWEST 2008: Quality Metrics for Testers: Evaluating Our Products, Evaluating Ourselves

As testers, we focus our efforts on measuring the quality of our organization's products. We count defects and list them by severity; we compute defect density; we examine the changes in those metrics over time for trends; and we chart customer satisfaction. While these are important, Lee Copeland suggests that to reach a higher level of testing maturity, we must apply similar measurements to ourselves.

Lee Copeland, Software Quality Engineering

STARWEST 2008: Telling Your Exploratory Story

What do you say when your manager asks, "How did it go today?" As a test manager, you might say, "I'll check to see how many test cases the team executed today." As a tester with a pile of test cases on your desk, you could say, "I ran 40 percent of these tests today," or "At the rate I'm going, I'll be finished with these test cases in 40 days." However, if you're using exploration as part of your testing approach, it might be terrifying to try to give a status report--especially if some project stakeholders think exploratory testing…

Jon Bach, LexisNexis

STARWEST 2008: Test Estimation: Painful or Painless?

As an experienced test manager, Lloyd Roden believes that test estimation is one of the most difficult aspects of test management. You must deal with many unknowns, including dependencies on development activities and the variable quality of the software you test. Lloyd presents seven proven ways he has used to estimate test effort. Some are easy and quick but prone to abuse; others are more detailed and complex but may be more accurate.

Lloyd Roden, Grove Consultants

STARWEST 2008: The Case Against Test Cases

A test case is a kind of container. You already know that counting the containers in a supermarket would tell you little about the value of the food they contain. So, why do we count test cases executed as a measure of testing's value? The actual impact and value of a test case vary greatly from one to the next. In many cases, the percentage of test cases passing or failing reveals nothing about the reliability or quality of the software under test.

James Bach, Satisfice, Inc.

STARWEST 2008: The Marine Corps Principles of Leadership for Testers

You can have the best tools and processes in the world, but if your staff is not motivated and productive, your testing effort will be, at best, inefficient. Good test managers must also be good leaders. Retired Marine Colonel Rick Craig describes how using the Marine Corps Principles of Leadership can help you become a better leader and, as a result, a better test manager. Learn the difference between leadership and management and why they complement each other.

Rick Craig, Software Quality Engineering

STARWEST 2008: Understanding Test Coverage

Test coverage of application functionality is often poorly understood and always hard to measure. If they measure it at all, many testers express coverage in terms of numbers, as a percentage or proportion--but a percentage of what? When we test, we develop two parallel stories. The "product story" is what we know and can infer about the software product--important information about how it works and how it might fail.

Michael Bolton, DevelopSense

STARWEST 2008: What Price Truth? When a Tester Is Asked to Lie

As testers and test managers, our job is to tell the truth about the current state of the software on our projects. Unfortunately, in the high-stakes business of software development, there is often pressure--subtle or overt--to distort our messages. When projects are late or product reliability is poor, managers' and developers' reputations--and perhaps even their jobs--may be on the line.

Fiona Charles, Quality Intelligence Inc.

Test Automation Techniques for Dynamic and Data Intensive Systems

If you think you're doing everything right with test automation but it just won't scale, join the crowd. If the amount of data you're managing and the dynamic changes in applications and workflows keep you in constant maintenance mode, this is the session for you. Encountering these problems, Chris Condron's group reviewed their existing automation successes and pain points. Based on this analysis, they created a tool-agnostic architecture and automation process that allowed them to scale up their automation to include many more tests.

Chris Condron, The Hanover Insurance Group

Test Management for Very Large Programs: A Survival Kit

In large organizations with multiple, simultaneous, and related projects, how do you coordinate testing efforts for better utilization and higher quality? Some organizations have opened Program Test Management offices to oversee the multiple streams of testing projects and activities, each with its own test manager. Should the Program Test Manager be an über-manager in control of everything, or is this office more of an aggregation and reporting function?

Graham Thomas, Independent Consultant

