Conference Presentations

Software Test Automation Spring 2003: Mission Made Possible: A Lightweight Test Automation Experience

Using a challenging client engagement as a case study, Rex Black shows you how he and a team of test engineers created an integrated, automated unit, component, and integration testing harness, and a lightweight process for using it. The test harness supported both static and dynamic testing of a product that ran on multiple platforms. The test process allowed system development teams spread across three continents to test their own units before checking them into the code repository, while the capture of the tests provided automated integration testing and component regression going forward. He'll also explain the tools available to build such a testing harness and why his team chose the ones they did.

  • Examine the benefits and challenges of implementing an integrated, automated component and integration testing process in a Java/EJB development environment
Rex Black, Rex Black Consulting Services, Inc.
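
The harness described above was built for Java/EJB; as a rough, language-agnostic analogue only, the sketch below uses Python's unittest module to show the underlying pattern: each developer runs a small suite against their own component before check-in, and the accumulated tests double as the integration and regression harness. The order_total function and its tests are hypothetical stand-ins, not the team's actual code or tools.

    import unittest

    def order_total(subtotal, tax_rate):
        """Hypothetical stand-in for the component under test."""
        if subtotal < 0:
            raise ValueError("subtotal must be non-negative")
        return round(subtotal * (1 + tax_rate), 2)

    class OrderComponentTest(unittest.TestCase):
        def test_total_includes_tax(self):
            self.assertEqual(order_total(100.00, 0.08), 108.00)

        def test_rejects_negative_subtotal(self):
            with self.assertRaises(ValueError):
                order_total(-1.00, 0.08)

    if __name__ == "__main__":
        # Run before check-in; the growing suite becomes the regression harness.
        unittest.main(verbosity=2)
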
Reduce Risk Using Security QA Automation Techniques

Security QA testing is still in its infancy, yet the number of vulnerabilities found in applications is increasing: up 75 percent in 2001, according to Gartner Group. Although software teams are learning about the types of coding and configuration errors that expose vulnerabilities in an application, a comprehensive QA methodology must be applied to reduce security risk. This means testers need a security policy that can serve as the basis for automated tests. Security experts can define these policies, but testers need to know how to effectively run the security tests in an automated environment to locate vulnerabilities, evaluate the results, and enter bugs for failed tests in a defect tracking system. By automating security tests, organizations can significantly reduce risk and maximize the productivity of existing resources.

  • Reduce the cost of development by finding security holes early in the cycle, before release
Alexander Mouldovan, Cenzic Inc
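
By way of illustration only (this is not Cenzic's product or the presenter's methodology), the Python sketch below turns two hypothetical policy rules into automated probes against a hypothetical local endpoint; in a real pipeline, each failure would be entered into the defect tracking system.

    import urllib.parse
    import urllib.request

    # Hypothetical policy: the application must never echo raw SQL errors or
    # reflect unsanitized script tags back to the client.
    PROBES = [
        ("sql_injection", "' OR '1'='1' --"),
        ("xss_reflection", "<script>alert(1)</script>"),
    ]

    TARGET_URL = "http://localhost:8080/search"  # hypothetical endpoint under test
    FAILURE_MARKERS = ["SQL syntax", "ODBC", "<script>alert(1)</script>"]

    def run_security_checks():
        failures = []
        for name, payload in PROBES:
            query = urllib.parse.urlencode({"q": payload})
            try:
                with urllib.request.urlopen(f"{TARGET_URL}?{query}", timeout=10) as resp:
                    body = resp.read().decode("utf-8", errors="replace")
            except OSError as exc:
                failures.append((name, f"request failed: {exc}"))
                continue
            for marker in FAILURE_MARKERS:
                if marker in body:
                    # A real pipeline would file a defect record here.
                    failures.append((name, f"response contains '{marker}'"))
        return failures

    if __name__ == "__main__":
        for name, reason in run_security_checks():
            print(f"FAIL {name}: {reason}")
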
In the Beginning ... Testing Web Services (.NET and Otherwise)

A Web service provides an interface for sending and receiving information, but it doesn't have a user interface. Instead, everything is done via requests and methods. So how does one go about testing such interfaces? Programmatically, that's how. In this presentation you'll be introduced to the concept of Web services and how they work. Tom Arnold even walks you through how to create tests using Perl, Python, and VB-like languages. Anyone new to Web services testing is certain to find this presentation a crucial first step to getting started down the right path.

  • Learn how to work with a Web service interface
  • Obtain approaches to writing scripts to exercise a Web service's API
  • Look at a completed harness for testing Web services
Thomas Arnold, Xtend Development, Inc.
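
Because the interface is programmatic, a test simply builds a request, sends it, and checks the response. The sketch below shows one way to do this in Python using only the standard library; the TemperatureService endpoint, the ConvertTemp operation, and the element names are hypothetical, not an example taken from the presentation.

    import urllib.request
    import xml.etree.ElementTree as ET

    # Hypothetical SOAP 1.1 endpoint and operation; a real test would take
    # these from the service's WSDL.
    ENDPOINT = "http://localhost:8080/TemperatureService"
    SOAP_ACTION = "urn:ConvertTemp"

    REQUEST_BODY = """<?xml version="1.0" encoding="utf-8"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body>
        <ConvertTemp xmlns="urn:TemperatureService">
          <Celsius>100</Celsius>
        </ConvertTemp>
      </soap:Body>
    </soap:Envelope>"""

    def call_service():
        request = urllib.request.Request(
            ENDPOINT,
            data=REQUEST_BODY.encode("utf-8"),
            headers={"Content-Type": "text/xml; charset=utf-8",
                     "SOAPAction": SOAP_ACTION},
        )
        with urllib.request.urlopen(request, timeout=10) as response:
            return response.read()

    def test_convert_temp():
        xml = ET.fromstring(call_service())
        # Find the result element regardless of namespace prefix.
        result = next(e for e in xml.iter() if e.tag.endswith("Fahrenheit"))
        assert result.text == "212", f"expected 212, got {result.text!r}"

    if __name__ == "__main__":
        test_convert_temp()
        print("ConvertTemp returned the expected value")
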
Assessing Automated Testing Tools: A "How To" Evaluation Approach

You've been assigned the task of evaluating automated testing tools for your organization. Whether it's your first experience or you're looking to make a change, selecting the "best" automated testing tool can be a daunting task. With so many toolsets available, we sometimes choose one that doesn't provide the functionality we actually need. This presentation takes you through a number of steps that should be understood and addressed before acquiring any regression or performance-based toolset.

  • Learn to correlate your organization's requirements and existing framework with the toolsets available
  • Examine how integrated components help to identify potential problems
  • Determine what to ask/require from each vendor before committing to a purchase
Dave Kapelanski, Compuware Corporation
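
One simple way to correlate your requirements with candidate toolsets is a weighted scoring matrix. The sketch below is a generic illustration; the criteria, weights, and scores are invented and would come from your own requirements analysis and vendor evaluations.

    # Illustrative weighted-scoring sketch for comparing candidate tools
    # against your own requirements; all figures below are invented.
    CRITERIA_WEIGHTS = {
        "fits existing framework": 0.30,
        "scripting skills already on team": 0.25,
        "protocol/UI technology support": 0.25,
        "licensing and maintenance cost": 0.20,
    }

    # Scores on a 1-5 scale, gathered from trial evaluations and vendor answers.
    TOOL_SCORES = {
        "Tool A": {"fits existing framework": 4, "scripting skills already on team": 3,
                   "protocol/UI technology support": 5, "licensing and maintenance cost": 2},
        "Tool B": {"fits existing framework": 3, "scripting skills already on team": 5,
                   "protocol/UI technology support": 3, "licensing and maintenance cost": 4},
    }

    def weighted_score(scores):
        return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

    if __name__ == "__main__":
        for tool, scores in sorted(TOOL_SCORES.items(),
                                   key=lambda kv: weighted_score(kv[1]),
                                   reverse=True):
            print(f"{tool}: {weighted_score(scores):.2f}")
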
Using Test Oracles in Automation

Software test automation is often a difficult and complex process. The most familiar aspects of test automation are organizing and running test cases and capturing and verifying test results. A set of expected results is needed for each test case in order to check the test results. Verification of these expected results is often done using a mechanism called a test oracle. This talk describes the use of oracles in automated software verification and validation. Several relevant characteristics of oracles are discussed, along with their advantages, disadvantages, and implications for test automation.

  • Learn why evaluating automated test results is not easy
  • Use test oracles as critical factors in making useful automated tests
  • Learn useful models for automated tests and test oracles
  • Learn five strategies for automated test oracles
  • See examples where different oracles have been used
Douglas Hoffman, Software Quality Methods LLC
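
One oracle strategy commonly discussed is comparison against a trusted reference implementation. The Python sketch below shows that single strategy in minimal form; the fast_sqrt function is a hypothetical stand-in for the code under test.

    import math
    import random

    def fast_sqrt(x):
        """Hypothetical implementation under test."""
        return x ** 0.5

    def oracle_sqrt(x):
        """Trusted reference implementation used as the test oracle."""
        return math.sqrt(x)

    def test_against_oracle(trials=1000, tolerance=1e-9):
        random.seed(42)  # reproducible inputs
        failures = []
        for _ in range(trials):
            x = random.uniform(0, 1e6)
            expected = oracle_sqrt(x)
            actual = fast_sqrt(x)
            if abs(actual - expected) > tolerance * max(1.0, expected):
                failures.append((x, expected, actual))
        return failures

    if __name__ == "__main__":
        bad = test_against_oracle()
        print(f"{len(bad)} mismatches against the oracle")
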
Automated Database Testing: Testing and Using Stored Procedures for Testing

Today's heterogeneous data environments place a heavy burden on test engineers. Applications must be tested for seamless interaction with the back-end databases, but this often goes beyond what popular test automation tools can provide. Testers must know how to create and use SQL, stored procedures, and other database objects to effectively test today's data-driven environments. This presentation delivers techniques for creating efficient automated tests of the critical database back end using simple scripting languages and relational database objects. It includes specific procedures, queries, views, and other relational database objects that are valuable for typical testing situations, and demonstrates how these automated tests can be used in conjunction with popular testing tools.

  • Learn about the testing of database objects and stored procedures
Mary Sweeney, Sammamish Software Services
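
As a self-contained illustration of SQL-level test automation, the sketch below uses SQLite from the Python standard library; SQLite has no stored procedures, so a view stands in for the kind of server-side object the presentation covers, and the schema and data are invented.

    import sqlite3

    def build_fixture(conn):
        # Invented schema and data for illustration only.
        conn.executescript("""
            CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL);
            INSERT INTO orders VALUES (1, 'acme', 120.00), (2, 'acme', 80.00), (3, 'zenith', 50.00);
            CREATE VIEW customer_totals AS
                SELECT customer, SUM(amount) AS total FROM orders GROUP BY customer;
        """)

    def test_customer_totals(conn):
        rows = dict(conn.execute("SELECT customer, total FROM customer_totals"))
        assert rows == {"acme": 200.00, "zenith": 50.00}, rows

    if __name__ == "__main__":
        with sqlite3.connect(":memory:") as conn:
            build_fixture(conn)
            test_customer_totals(conn)
            print("customer_totals view returns expected aggregates")
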
Avoiding Test Automation Pitfalls and Guaranteeing Return on Investment

Companies that have attempted to implement test automation for functional testing have discovered, usually the hard way, that it isn't easy. This presentation takes an in-depth look at the specific pitfalls companies encounter when implementing automated functional testing, and offers real-world, proven best practices to avoid problems and guarantee long-term success. You'll learn about an ROI model successfully used by companies for automated testing efforts, and learn to identify the key areas upon which to focus your test automation.

  • Utilize an ROI model for test automation planning and results measurement
  • Get tips on how to avoid test automation failure
  • Find out when not to automate
Jeff Tatelman, TurnKey Solutions Corp
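
A common, generic form of the ROI calculation compares the cumulative cost of building and maintaining the automated tests with the cost of running the same tests manually. The Python sketch below uses that textbook form with invented figures; it is not necessarily the model presented in this session.

    def automation_roi(build_cost, maintenance_per_run, runs, manual_cost_per_run):
        automated_cost = build_cost + maintenance_per_run * runs
        manual_cost = manual_cost_per_run * runs
        savings = manual_cost - automated_cost
        return savings / automated_cost

    if __name__ == "__main__":
        # Hypothetical numbers: 40 hours to build, 1 hour upkeep per run,
        # 8 hours to run manually, executed 30 times over the release cycle.
        roi = automation_roi(build_cost=40, maintenance_per_run=1,
                             runs=30, manual_cost_per_run=8)
        print(f"ROI: {roi:.0%}")  # positive means automation paid for itself
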
A Formula for Test Automation Success: Finding the Right Mix of Skill Sets and Tools

Not sure what elements to consider now that you're ready to embark on the mission of automating your testing? This session explores the key mix of skill sets, processes, and tools that can make or break any automation effort. The instructor shows you how to develop an informed set of priorities that can make all the difference in your effort's success and help you avoid project failure.

  • Create better, more reusable tests to improve efficiency and effectiveness
  • Increase the value and reputation of QA within your organization
  • Establish a closer relationship with developers based on mutual respect
Gerd Weishaar, IBM Rational software
Application Performance and Reliability Management - 24x7

Managing system performance and reliability has never been as significant, or as challenging, as it is now. These days, most organizations have multi-technology, multi-vendor, multi-tier environments. In other words, it's a world rife with 24-hour, always-on complexity. Add to this the need for continual changes to react to shifts in business conditions, technology advances, and mixes of demands, and you have a recipe that calls for the highest level of performance and reliability possible. But getting there is next to impossible. However, new concepts emerging from research labs, such as flexible computing, autonomous computing, and self-tuning systems, are finding their way into usable products. These possibilities have revolutionary potential for performance management.

  • Examine recommended suites of tools and their limitations
  • Look at the major innovations and trends, such as self-tuning systems
Ross Collard, Collard and Company
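
Self-tuning systems, one of the trends mentioned above, amount to a measure-and-adjust feedback loop. The toy Python sketch below simulates such a loop against an invented latency model; it is purely illustrative and not based on any product discussed in the session.

    TARGET_LATENCY_MS = 200.0

    def observed_latency_ms(workers, load=12):
        # Invented latency model: more workers lowers latency toward a floor.
        return 40.0 + 2000.0 * load / (workers * 25)

    def tune(workers=2, steps=10):
        for step in range(steps):
            latency = observed_latency_ms(workers)
            if latency > TARGET_LATENCY_MS * 1.1:
                workers += 1      # too slow: add capacity
            elif latency < TARGET_LATENCY_MS * 0.9 and workers > 1:
                workers -= 1      # comfortably fast: reclaim capacity
            print(f"step {step}: workers={workers}, latency={latency:.0f} ms")
        return workers

    if __name__ == "__main__":
        tune()
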
Just Enough Software Test Automation

To answer the question "How much test automation is enough?" we have to look at the areas of the software testing process that can be automated, the areas that should be automated, and the levels and types of tests that will be automated.

Daniel Mosley, Daniel Mosley & Associates
