STAREAST 2000 - Software Testing Conference

PRESENTATIONS

By
Michael Rubin, Fidelity Investments

The rapidly changing world of e-commerce applications poses a variety of challenges to software testing organizations. In this presentation, Michael Rubin uses examples from the world of online brokerage to highlight a number of issues facing testers of Web applications, including environment complexity, capacity and performance automation, time-to-market pressures, and staff retention. Over the past three years, the rapid growth of e-commerce has put these issues into the forefront for many testing organizations.

By
Terrye Ashby, Pointe Technology Group, Inc.

Although testers and Quality Assurance (QA) managers have historically found themselves to be the last hired and the first fired, the Y2K problem has brought testing and testers to the forefront of application development. Now that you have the team in place, how can you enhance your processes and demonstrate the value of your team to the rest of the world? Learn fun and practical methods to help testers create an awareness of QA's value within your organization.

By
Shel Seigel, Seigel Associates Corporation

From this presentation's summary:

  • Re-Design Testing to Provide Information to Manage Risks
  • Use Risk Analysis to Determine What & When to Test (Knowledge)
  • Design Test Activities to Provide Feedback about Risks (Information)
  • Use a Hierarchical Test Approach with Feedback Loops (Information + Knowledge), as sketched below
  • Remember The Business Goal (Wisdom)
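
The hierarchical, feedback-driven approach summarized above can be pictured as a short loop in which each test level's results adjust the risk estimates that drive the next level. The following Python sketch is purely illustrative and not taken from the presentation; the feature names, risk numbers, and defect counts are invented.

```python
# Minimal sketch (not from the presentation) of a risk-driven, hierarchical
# test loop: each level's results feed back into the risk estimates that
# decide where the next level spends its effort. All values are hypothetical.

# Initial risk exposure per feature: probability of failure x business impact.
risk = {
    "order entry":   0.6 * 9,
    "quote display": 0.3 * 5,
    "account login": 0.2 * 8,
}

levels = ["unit", "integration", "system"]

for level in levels:
    # Spend effort on the riskiest features first (knowledge -> information).
    plan = sorted(risk, key=risk.get, reverse=True)
    print(f"{level} tests, in priority order: {plan}")

    # Pretend results: defects found per feature at this level.
    defects_found = {"order entry": 3, "quote display": 0, "account login": 1}

    # Feedback loop: many defects raise the estimated residual risk,
    # a clean run lowers it before the next level is planned.
    for feature, found in defects_found.items():
        risk[feature] *= 1.5 if found > 2 else (1.0 if found else 0.7)
```
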
By
Alberto Savoia, Velogic Inc.

The magnitude and unpredictability of Internet loads pose unique testing challenges for Web sites. The only sure way to know how a Web site will behave under a heavy user load is to perform a load test. In this presentation, learn the core principles and techniques necessary for highly realistic and revealing Web site load tests. Listen to several case studies that illustrate why Web sites cannot afford to ignore load testing.
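
As a rough illustration of the core idea, the sketch below fires concurrent HTTP requests at a target and reports response-time percentiles. It is a minimal Python example, not the presenter's methodology; the URL and user counts are placeholders, and a realistic load test would also model user sessions, think times, and traffic mixes.

```python
# Minimal load-test sketch, assuming a test URL you control: fire concurrent
# HTTP GETs and report response times.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/"   # hypothetical target; only load-test sites you own
USERS = 20                       # concurrent virtual users
REQUESTS_PER_USER = 10

def one_user(_):
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
        timings.append(time.perf_counter() - start)
    return timings

with ThreadPoolExecutor(max_workers=USERS) as pool:
    all_timings = [t for user in pool.map(one_user, range(USERS)) for t in user]

all_timings.sort()
print(f"requests: {len(all_timings)}")
print(f"median:   {all_timings[len(all_timings) // 2]:.3f}s")
print(f"95th pct: {all_timings[int(len(all_timings) * 0.95)]:.3f}s")
```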

By
Brian Marick, Testing Foundations

Good testing does not come naturally to everyone. For these individuals, the best option is to look closely at really good testers and observe what often seems to come easily and unconsciously to them. Brian Marick explores how good testers make effective and efficient use of three sources of information: past bugs and their fixes; informal descriptions of the product architecture; and characterizations of the end user. Learn how good testers gather this information, and what they do with it once they have it.

By
Dominique Guilbaud, ATTOL Testware

The component testing phase remains largely a manual activity. In particular, the design of test cases, which can be very tedious, is usually done entirely by hand. Although existing tools ease the tester's work and improve productivity, very few actually help generate input test data. Discover a new generation of tools that can automatically generate test cases parameterized with various test objectives (functional, structural, or both).
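
To illustrate the underlying idea (not ATTOL's actual tool), the sketch below generates input test data from a simple parameter specification, parameterized by a test objective such as boundary coverage or representative sampling. The specification format, parameter names, and objectives are invented for the example.

```python
# Minimal sketch of objective-driven test data generation; everything here
# (spec format, parameter names, objectives) is hypothetical.
import itertools
import random

spec = {                       # parameters of a function under test
    "quantity": (1, 1000),     # integer range
    "price":    (0.01, 500.0), # float range
}

def boundary_values(lo, hi):
    """Structural-style objective: probe the edges of each range."""
    return [lo, lo + 1, hi - 1, hi]

def random_values(lo, hi, n=3, seed=42):
    """Functional-style objective: sample representative mid-range values."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for _ in range(n)]

def generate(spec, objective):
    per_param = {name: objective(lo, hi) for name, (lo, hi) in spec.items()}
    # Cross-product of the chosen values gives concrete test cases.
    return [dict(zip(per_param, combo))
            for combo in itertools.product(*per_param.values())]

for case in generate(spec, boundary_values)[:5]:
    print(case)
```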

By
Jim Boone, SAS Institute Inc.

Coverage Analysis System (CAS) data is often useful for determining whether enough tests have been written and for identifying C-code lines that have no test coverage. In this presentation, Jim Boone explores various methods that use CAS data to determine the best set of automated tests to execute for a corrected defect. Learn the strengths, weaknesses, and best stage for using each method.
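
One such method can be sketched in a few lines, assuming the coverage data has already been exported as a mapping from each automated test to the C source lines it executes. The data below is invented, and the selection rule is deliberately simple: re-run every test that touches a line changed by the defect fix.

```python
# Minimal coverage-based test selection sketch; a real CAS export would be
# parsed from its own format, and these tests and line references are invented.

coverage = {
    "test_login":   {"auth.c:42", "auth.c:57", "session.c:10"},
    "test_order":   {"order.c:12", "order.c:88", "auth.c:42"},
    "test_reports": {"report.c:5", "report.c:19"},
}

# Lines touched by the fix for the corrected defect (hypothetical).
changed_lines = {"auth.c:42", "auth.c:60"}

# Select every automated test that exercises at least one changed line.
selected = sorted(name for name, lines in coverage.items()
                  if lines & changed_lines)
print("tests to re-run:", selected)   # ['test_login', 'test_order']
```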

By
Sam Guckenheimer, Rational Software

The Unified Modeling Language (UML) has become the industry's standard for capturing software architectures and elaborating system design. This presentation provides an overview of the UML from a tester's perspective. Learn how UML represents software design, including key diagrams. Discover when these diagrams are appropriate, what information can be derived from them, and what types of software can be represented. Explore ways to use UML to facilitate communication among testers, developers, and analysts in your organization.
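
As one hedged illustration of deriving test information from a UML diagram, the sketch below reduces a state diagram to a transition table and enumerates a test step for every transition. The states and events describe a hypothetical order-processing object, not an example from the presentation.

```python
# Minimal sketch: a UML state diagram reduced to a transition table, from
# which all-transitions test steps can be enumerated. States and events
# are hypothetical.

transitions = {
    ("New",     "submit"):  "Pending",
    ("Pending", "approve"): "Filled",
    ("Pending", "cancel"):  "Cancelled",
    ("Filled",  "settle"):  "Closed",
}

# One test step per transition: put the object in the source state,
# fire the event, and check the expected target state.
for (source, event), target in sorted(transitions.items()):
    print(f"GIVEN state {source!r} WHEN event {event!r} THEN expect {target!r}")
```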

By
Anne Campbell, Channel Health

Anne Campbell provides insight into how a risk analysis grid was effectively implemented at IDX, both as a device to gauge QA team performance and as a communication tool for the development team. This valuable tool listed each piece of functionality and the risk associated with it, helping IDX focus testing on higher-risk areas. Discover how a risk analysis matrix can be used to plan your testing efforts and increase the quality of your software project.
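
A minimal sketch of such a grid, assuming the common likelihood-times-impact scoring, is shown below; the functionality list and ratings are invented rather than taken from IDX's actual matrix.

```python
# Minimal risk-analysis grid sketch: score each piece of functionality by
# likelihood x impact and sort so the riskiest areas get tested first.
# All entries and thresholds are hypothetical.

grid = [
    # (functionality, likelihood of failure 1-5, business impact 1-5)
    ("claim submission",   4, 5),
    ("eligibility lookup", 3, 4),
    ("report printing",    2, 2),
]

for name, likelihood, impact in sorted(grid, key=lambda r: r[1] * r[2], reverse=True):
    exposure = likelihood * impact
    focus = "test first / deepest" if exposure >= 15 else "standard coverage"
    print(f"{name:<20} risk={exposure:>2}  -> {focus}")
```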
