Conference Presentations

The Software Vulnerability Guide: Uncut and Uncensored

Warning: This talk contains graphic examples of software failure... not suitable for the faint of heart. This "no holds barred" session arms testers with what they really need to know about finding serious security vulnerabilities. Herbert Thompson takes you on an illustrated tour of the top twelve security vulnerabilities in software and shows you how to find these flaws efficiently. Each vulnerability is brought to life through a live exploit followed by a look at the testing technique that would have exposed the bug. Testers and test managers will leave with a keen awareness of the major vulnerability types and the knowledge and insight to fundamentally improve the security of the applications they support and test.
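
To make the flavor of such a testing technique concrete, here is a minimal Python sketch, not taken from Thompson's talk: it feeds a handful of classically hostile inputs to the code under test and flags crashes or unsanitized echoes. The parse_query function is a hypothetical stand-in for real application code.

# Illustrative hostile-input harness; parse_query is a hypothetical stand-in.
HOSTILE_INPUTS = [
    "' OR '1'='1",         # classic SQL injection probe
    "../../etc/passwd",    # path traversal probe
    "%s%s%s%s%n",          # format-string probe
    "A" * 100_000,         # oversized input, buffer-handling probe
]

def parse_query(text: str) -> str:
    """Hypothetical code under test; swap in the real function."""
    return text.strip()

def probe():
    for payload in HOSTILE_INPUTS:
        try:
            result = parse_query(payload)
        except Exception as exc:
            # Any crash on attacker-controlled input is a finding worth triaging.
            print(f"SUSPECT: {payload[:20]!r} raised {exc!r}")
        else:
            # Payloads that survive must never reach a database, shell,
            # or rendered page without sanitization.
            print(f"ok: {payload[:20]!r} -> {result[:20]!r}")

if __name__ == "__main__":
    probe()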

Herbert Thompson, Security Innovation LLC
STAREAST 2006: Testing Dialogues - Management Issues

As a test manager, are you struggling at work with a BIG test management issue or a personnel issue? If so, this session is for you. "Testing Dialogues - Management Issues" is a unique platform for you to share with and learn from test managers who have come to STAREAST from around the world. Facilitated by Esther Derby and Johanna Rothman, this double-track session takes on management issues: career paths for test managers, hiring, firing, executive buy-in, organization structures, and process improvement. You name it! Share your expertise and experiences, learn from others’ challenges and successes, and generate new topics in real time. Discussions are structured in a framework so that participants will receive a summary of their work product after the conference.

Facilitated by Esther Derby and Johanna Rothman
Testing: The Big Picture

If all testers put their many skills in a pot, surely everyone would come away with something new to try out. Every tester can learn something from other testers. But can a tester learn something from a ski instructor? There is much to gain by examining and sharing industry best practices, but often much more can be gained by looking at problem-solving techniques from beyond the boundaries of the testing/QA department. Presented as a series of analogies, Brian Bryson covers the critical success factors for organizations challenged with the development and deployment of quality software applications. He takes strategies and lessons from within and beyond the QA industry to provide you with a new perspective on addressing the challenges of quality assurance.

Brian Bryson, IBM Rational Software
Build Rules: A Management System for Complex Test Environments

The interaction of many software components makes testing today's software solutions increasingly complex. The problem becomes especially difficult when the solution includes combinations of hardware, software, and multiple operating systems. To automate this process, Steve Hagerott's company developed "Build Rules," a Web-based application with inputs from their build management and test execution systems. Using logical rules about the builds, test engineers define the characteristics of the build solution points. To deliver the latest and greatest builds that meet the characteristics defined for each solution point, the system dynamically translates these rules into server-side nested SQL queries. Learn how their efficiency and accuracy have improved significantly, allowing test engineers to stay on track with many different build combinations and to communicate results to outside departments and customers.
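
The abstract doesn't publish the actual Build Rules schema or queries, so the Python sketch below only illustrates the general idea under assumed table and column names (builds, test_runs): a logical rule about acceptable builds becomes a parameterized SQL query with a nested subquery over test results.

import sqlite3

# Assumed, simplified schema; the real Build Rules tables are not published.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE builds (id INTEGER PRIMARY KEY, os TEXT, version TEXT);
    CREATE TABLE test_runs (build_id INTEGER, passed INTEGER);
    INSERT INTO builds VALUES (1, 'linux', '2.1.0'), (2, 'windows', '2.1.0');
    INSERT INTO test_runs VALUES (1, 1), (1, 1), (2, 0);
""")

def builds_matching(rule):
    """Translate a logical rule (attribute -> required value) into SQL:
    match the rule's attributes and exclude builds with failed test runs."""
    where = " AND ".join(f"{column} = ?" for column in rule)  # keys are trusted column names
    sql = (
        f"SELECT id, os, version FROM builds WHERE {where} "
        "AND id NOT IN (SELECT build_id FROM test_runs WHERE passed = 0)"
    )
    return conn.execute(sql, tuple(rule.values())).fetchall()

# One "solution point": Linux builds of version 2.1.0 with a clean test record.
print(builds_matching({"os": "linux", "version": "2.1.0"}))  # [(1, 'linux', '2.1.0')]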

Steve Hagerott, Engenio Storage Group, LSI Logic Corporation
Progressive Performance Testing: Adapting to Changing Conditions

An inflexible approach to performance testing is a prelude to disaster. "What you see at the start isn't always what you get in the end," says Jeff Jewell. Drawing on his experience performance testing applications on numerous consulting projects, Jeff demonstrates the challenges you may face testing your applications and how to overcome these obstacles. Examples from those projects show how changing project conditions, and the information discovered in early tests, caused the testing approach to change dramatically. Find out how hardware configuration, hardware performance, script variations, bandwidth, monitoring, and randomness can all affect the measurement of performance.
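
As one concrete illustration of the randomness point, the short Python sketch below uses invented numbers, not data from these projects, to show how a long-tailed response-time distribution makes the mean a misleading summary compared with a high percentile.

import random
import statistics

random.seed(42)

# Simulate 1,000 response times in milliseconds: mostly fast, with roughly
# one slow outlier in ten (e.g., a cache miss or garbage-collection pause).
samples = [
    random.gauss(200, 20) if random.random() > 0.10 else random.gauss(2000, 300)
    for _ in range(1000)
]

samples.sort()
mean = statistics.mean(samples)
p95 = samples[int(0.95 * len(samples)) - 1]

# The mean looks tolerable while the 95th percentile shows that a sizable
# minority of users wait several times longer.
print(f"mean = {mean:.0f} ms, p95 = {p95:.0f} ms")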

Jeff Jewell, ProtoTest LLC
Test Metrics in a CMMI Level 5 Organization

As a CMMI® Level 5 company, Motorola Global Software Group is heavily involved in software verification and validation activities. Shalini Aiyaroo, senior software engineer at Motorola, shows how tracking specific testing metrics can serve as key indicators of the health of testing and how these metrics can be used to improve your testing practices. Find out how to track and measure phase screening effectiveness, fault density, and test execution productivity; two of these metrics are sketched in code below. Shalini describes the use of Software Reliability Engineering (SRE) and fault prediction models to measure test effectiveness and take corrective action. By performing orthogonal defect classification (ODC) and escaped defect analysis, the group has found ways to improve test coverage.

CMMI® is a registered trademark of Carnegie Mellon University.

  • Structured approach to outsource testing
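
For readers new to these metrics, the short Python sketch below computes two of them using commonly cited general definitions and invented numbers; it does not reflect Motorola's internal formulas or data.

def fault_density(defects_found, size_kloc):
    """Fault density: defects per thousand lines of code (KLOC)."""
    return defects_found / size_kloc

def phase_screening_effectiveness(found_in_phase, escaped_from_phase):
    """Fraction of the defects detectable in a phase that the phase caught,
    judged after later phases reveal what escaped."""
    return found_in_phase / (found_in_phase + escaped_from_phase)

# Invented numbers for illustration only.
print(f"fault density: {fault_density(45, 30.0):.2f} defects/KLOC")             # 1.50
print(f"screening effectiveness: {phase_screening_effectiveness(40, 10):.0%}")  # 80%
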
Shalini Aiyaroo, Motorola Malaysia Sdn. Bhd.
STAREAST 2006: Apprenticeships: A Forgotten Concept in Testing

The system of apprenticeship was first developed in the late Middle Ages: the uneducated and inexperienced were employed by a master craftsman in exchange for formal training in a particular craft. So why does apprenticeship seldom happen within software testing? Do we subconsciously believe that just about anyone can test software? Join Lloyd Roden and discover what apprenticeship training is and, even more importantly, what it is not. Learn how this practice can be easily adapted to suit software testing. Find out about the advantages and disadvantages of several apprenticeship models: Chief Tester, Hierarchical, Buddy, and Coterie. With personal experiences to share, Lloyd shows how projects will benefit immediately from the rebirth of the apprenticeship system in your test team.

  • Four apprenticeship models that can apply to software testers
  • Measures of the benefits and return on investment of apprenticeships
Lloyd Roden, Grove Consultants
ISTQB Certification: Setting the Standard for Tester Professionalism

Rex Black, Rex Black Consulting
Using Production Failures to Jump-Start Performance Test Plans

Learning from a production system failure is not a model MassMutual Financial Group would have chosen. However, when one of their key applications failed under load in production, they turned on a dime and changed their performance testing approach, focus, and capabilities. Let’s set the scene: They ran large numbers of transactions through a performance test tool and then went live with a new application that was to be used by all their key users. Within hours, the application had ground to a virtual halt under normal production load. What went wrong? Join Sandra Bourgeois to find out not only what went wrong but also what they learned from the failure and how they set about improving their knowledge, skills, and tools. This is your chance to learn from their mistakes and avoid repeating them in your organization.

Sandra Bourgeois has 25 years' experience as an IT professional: project manager, test manager, developer, and QA lead. She is a Director and Project Manager at MassMutual Financial Services in Springfield, Mass. For the past three years she has served as the senior IT Test Manager at MassMutual, working with a variety of large projects to identify and resolve testing roadblocks to project implementation. She also serves as a project manager and teaches classes on testing topics, focusing on performance testing. Most recently she has been the Performance Test Manager for several critical projects, including Internet, intranet, and technical upgrades. She has presented at the QAI International Conference. Her background also includes working as a social studies teacher and museum tour guide since graduating from Vassar College and UCLA.

  • Lessons learned from the performance failure of a mission-critical application
Sandra Bourgeois, Massachusetts Mutual Life Insurance Company
S-Curves and the Zero Bug Bounce: Plotting the Way to Better Testing

The use of objective test metrics is an important step toward improving your ability to manage any test effort effectively. With two test metrics, the S-Curve and the Zero Bug Bounce (both sketched in code below), you can easily track the progress of the test effort. Learn to graph the S-Curve, showing cumulative test cases planned, attempted, and completed over time. Keep track of the Bug Bounce, the number of open bugs at the end of a period (usually one to several days), and especially the Zero Bug Bounce, the first time development has resolved all the bugs raised by the testers and no active issues remain outstanding. Improve your ability to communicate test results and test needs to the project team, and make better decisions about when your application is ready to ship.

  • Derive a theoretical and actual S-Curve for test cases using historic and current data
  • Use the Zero Bug Bounce for tracking defect correction activities
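
The Python sketch below is a minimal, invented-numbers illustration of both metrics (it assumes matplotlib is installed): it plots the S-Curve of cumulative test cases and reports the first Zero Bug Bounce day.

import matplotlib.pyplot as plt

days = list(range(1, 11))
planned   = [5, 12, 22, 35, 50, 62, 72, 78, 80, 80]  # cumulative test cases planned
attempted = [3, 10, 20, 33, 47, 60, 70, 77, 80, 80]  # cumulative attempted
completed = [2,  8, 17, 30, 44, 57, 68, 75, 79, 80]  # cumulative completed
open_bugs = [4,  9, 14, 12,  8,  5,  2,  0,  1,  0]  # open bugs at end of each day

# S-Curve: cumulative planned vs. attempted vs. completed test cases over time.
for series, label in [(planned, "planned"), (attempted, "attempted"), (completed, "completed")]:
    plt.plot(days, series, marker="o", label=label)
plt.xlabel("day")
plt.ylabel("cumulative test cases")
plt.legend()
plt.savefig("s_curve.png")

# Zero Bug Bounce: the first day on which no reported bugs remain open.
zbb = next((day for day, n in zip(days, open_bugs) if n == 0), None)
print(f"first zero bug bounce: day {zbb}")  # day 8; the count bounces back to 1 on day 9
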
Shaun Bradshaw, Questcon Technologies, A Division of Howard Systems Intl.
