STARWEST 2004 - Software Testing Conference


Conveying Performance Test Results with Data Visualization

Performance and load testing generate barrels of data about transaction times, hardware resource utilization, and system behavior. Your job as a performance engineer is not only to use automated tools to generate this data, but also to summarize and interpret it, draw conclusions, and effectively communicate the test results. Do your job well, and management will be compelled to act on your recommendations.

Dan Downing, Mentora Inc.
Evaluating Requirements for Testability

For a test engineer, perhaps the most important measure of requirements quality is testability. By improving testability during requirements development, you not only will make test design easier, but you also will have gone a long way toward building better software for less cost.

Rodger Drabick, Lockheed Martin Transportation & Security Solutions
Free Test Tools are Like a Box of Chocolates

You never know what you are going to get! Until you explore, it can be hard to tell whether a free, shareware, or open source tool is an abandoned and poorly documented research project or a robust powerhouse of a tool. In this information-filled presentation, Danny Faught shows you where open source and freeware tools fit within the overall test tool landscape. During this double session, Danny installs and tries out several tools right on the spot and shares tips on how to evaluate tools you find on the Web.

Danny Faught, Tejas Software Consulting
Get Your Testing Message Across

We all know how important test progress (or the lack of it) is to the success of a project. But why does it sometimes go unnoticed? Valuable test reports provide the information that is needed, not just the information that is easy to gather. Test progress reports aid management in decision-making and risk assessment and help testing teams set priorities. In this presentation, Isabel Evans asks, "Do our reports add value for their audience, or are we just supplying 'chart junk' that will not be read?"

Isabel Evans, Testing Solutions Group Ltd.

Gotcha!...Security Testing for Mission Critical Applications

A local television station provides a Web service that allows schools and businesses in the area to easily enter information on closures due to bad weather. The information then is displayed as a crawl along the bottom of the television screen. Some kids hack into the site and declare their school closed for the day, and it's immediately shown on everyone's television! It's a cute story. Now let's imagine that these same kids hack the prices on your eCommerce site or obtain access to sensitive customer records on your company Web site.

Michael Andrews, Florida Institute of Technology
How GM Tests Web Services

General Motors is on the road to a Service Oriented Architecture (SOA) as its computing standard, choosing SOA to build its next-generation information systems using Web services and ebXML technology. To proceed, GM needs to understand the scalability and throughput limits of the message-centric Web services approach that is the essence of SOA.

Frank Cohen, PushToTest
I've Looked at Bugs from Both Sides Now

Many QA and test professionals are working more closely than ever with their development counterparts, especially those using agile approaches. In doing so, we are learning that some of the attitudes and habits we developed as independent QA groups are no longer effective, especially those that perpetuate an "us vs. them" mentality.

Elisabeth Hendrickson, Quality Tree Software, Inc.

Improving Testing with Process Assessments

Fast development cycles, distributed architectures, code reuse, and developer productivity suites make it imperative that we improve our software test efficiency. A process assessment is one approach to begin an improvement program.

  • What process assessments are available?
  • How do you conduct an assessment?
  • How do you guard against incorrect information?
  • How do you know what to improve first?
  • How can you make successful improvements without negatively impacting your current work?

Robert Topolski, Intel Corporation
Introduction to Test-Driven Development

Write the test. Make it green. Make it clean. This is the mantra of test-driven development (TDD). Though TDD is often viewed as a developer-only practice, software project managers, test managers, and testers need to understand it if they are going to operate successfully in a TDD environment. Because developers maintain a continuously updated automated test suite with TDD, testers are liberated to focus on higher-level testing activities.
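
As a rough illustration of that mantra, here is one red-green-clean cycle sketched in Python with the standard unittest module. The is_leap_year helper and its tests are hypothetical examples chosen for this sketch, not material from the talk.

```python
import unittest

# Step 1 -- "write the test" (red): these tests are written first and
# fail until is_leap_year exists and behaves correctly.
class LeapYearTest(unittest.TestCase):
    def test_ordinary_years_divisible_by_4_are_leap(self):
        self.assertTrue(is_leap_year(2004))
        self.assertFalse(is_leap_year(2003))

    def test_century_years_need_divisibility_by_400(self):
        self.assertTrue(is_leap_year(2000))
        self.assertFalse(is_leap_year(1900))

# Step 2 -- "make it green": the simplest implementation that passes.
# Step 3 -- "make it clean": refactor freely; the suite stays green
# and catches any regression introduced by the refactoring.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Run the suite in-process (without exiting the interpreter).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LeapYearTest)
result = unittest.TextTestRunner().run(suite)
```

The point of the ordering is that the test defines the behavior before any implementation exists; the continuously re-run suite is what frees testers from re-verifying low-level behavior by hand.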

Christian Sepulveda, Covexus, Inc.
Lightweight .NET User Interface Testing

The .NET environment provides a surprising but little known way to create user interface (UI) test automation scripts. By employing objects in the System.Threading and System.Reflection namespaces, test engineers can write ad hoc automated UI test scenarios in minutes. James McCaffrey presents an example of a Windows-based application and creates a test program written in C# that verifies UI functionality by simulating user typing and clicking.

James McCaffrey, Volt Information Sciences, Inc.
