Conference Presentations

Design Testability and Service Level Measurements into Software

Design and architecture decisions made early in a project have a profound influence on the testability of an application. Although testing is a necessary and integral part of application development, architecture and design considerations rarely include the impact of design decisions on testability. In addition, build vs. buy, third party controls, open source vs. proprietary, and other similar questions can greatly affect an organization's ability to carry out automated functional and performance testing, both positively and negatively. If the software or service is delivered to a separate set of end users who then need to perform testing activities, the problems compound. Join Jay Weiser to find out about the important design and architecture decisions that will ensure more efficient and effective testability of your applications.
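For illustration only, here is a minimal Python sketch of one such design decision: passing a collaborator into a class instead of constructing it internally, so the unit can be tested without its real dependency. The class and service names are hypothetical and are not taken from the presentation.

```python
# Hypothetical sketch (not from the presentation): injecting a dependency
# so a unit can be exercised without its real collaborator.
from dataclasses import dataclass


class RealRateService:
    def current_rate(self, currency):
        # In production this might call an external service.
        raise NotImplementedError("network call omitted in this sketch")


class FixedRateService:
    """Test double returning a known rate, so tests stay fast and deterministic."""

    def __init__(self, rate):
        self.rate = rate

    def current_rate(self, currency):
        return self.rate


@dataclass
class InvoiceCalculator:
    # The rate source is passed in rather than constructed internally,
    # which is what makes the class testable in isolation.
    rates: object

    def total_in(self, amount, currency):
        return round(amount * self.rates.current_rate(currency), 2)


if __name__ == "__main__":
    calc = InvoiceCalculator(rates=FixedRateService(rate=1.1))
    assert calc.total_in(100.0, "EUR") == 110.0
    print("testable without any external dependency")
```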

Jay Weiser, WorkSoft
Customer Focused Business Metrics throughout the SDLC

Focusing on the customer throughout the software development lifecycle (SDLC) is difficult to do. Teams can easily become mired in technical problems, internal resource limitations, or other issues. Following the customer mantra of "Faster! Better! Cheaper!" Steve Wrenn offers measurement and process techniques that he has used to deliver projects on time, on budget, and, most importantly, meeting customers' needs. By focusing on the development cycle from the outside in, his organization provides business-based metrics dashboards to monitor and adjust the project plan throughout the development project. Find out how their performance dashboard helps the team and the customer stay on course and drive directly to the targeted results. Discover an approach to determine what customers really want and match product development to customer expectations.

Steve Wrenn, Liberty Mutual Insurance Information Systems
Software Test Automation Spring 2003: Mission Made Possible: A Lightweight Test Automation Experience

Using a challenging client engagement as a case study, Rex Black shows you how he and a team of test engineers created an integrated, automated unit, component, and integration testing harness, and a lightweight process for using it. The test harness supported both static and dynamic testing of a product that ran on multiple platforms. The test process allowed system development teams spread across three continents to test their own units before checking them into the code repository, while the captured tests provided automated integration and component regression testing going forward. He'll also explain the tools available to build such a testing harness and why his team chose the ones they did.

  • Examine the benefits and challenges of implementing an integrated, automated component and integration testing process in a Java/EJB development environment
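As a loose illustration of the pre-checkin idea described above (not the harness from the talk, which targeted a Java/EJB environment), a developer-run test gate might look like this minimal Python sketch; the `tests` directory name is an assumption.

```python
# Hypothetical sketch: a lightweight runner developers could invoke before
# checking code in, using only the Python standard library.
import sys
import unittest


def run_precheckin_tests(start_dir="tests"):
    """Discover and run unit tests; return True only if everything passed."""
    suite = unittest.defaultTestLoader.discover(start_dir)
    result = unittest.TextTestRunner(verbosity=1).run(suite)
    return result.wasSuccessful()


if __name__ == "__main__":
    # A check-in script could call this and refuse the commit on failure.
    sys.exit(0 if run_precheckin_tests() else 1)
```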
Rex Black, Rex Black Consulting Services, Inc.
Testing Toolkit for J2EE Systems: A Case Study

Taking a test team from a client/server environment to J2EE-based Web technologies and implementing test automation at the same time is a challenge. Introducing an agile test methodology into a traditionally waterfall-oriented organization at the same time is an even bigger one. In this case study, Clay Coleman shares the successes and challenges he encountered as he mentored and supported a test group throughout this project. Walk with Clay from the days of early analysis and design; through test strategy development and planning; on to test case design and automation efforts; during all stages of test execution; past system rollout; and, finally, to completion of an initial regression test suite. If you think you may go through such an experience yourself, you'll learn some lessons Clay will never forget.

  • Integrate test automation into the construction phase of a development project
Clay Coleman, CapTech Ventures
Testing "Best Practices": From Microsoft's Context to Yours

Testing is a never-ending series of trade-off decisions: what to test and what not to test; when to stop testing and release the product; how to budget your testing resources for automated vs. manual testing; how much code coverage is good enough; and much more. To make these difficult judgment calls, we often turn to the "best practices" recommended by testing experts and others who have encountered similar problems. The key to successful implementation is matching their "best practices" to your own context (team make-up, company culture, market environment, etc.). Barry Preppernau shares his insights gathered from over 20 years of testing experience at Microsoft. You'll learn about the tools and processes that have been successful within Microsoft and ways for you to identify, adapt, and implement successful test improvement initiatives within your organization.

Barry Preppernau, Microsoft Corporation
Getting Things Done: Practical Web Application/e-Commerce Stress Testing

Web and e-commerce applications are still the rising, often unreachable, stars of the testing world. Your team's ability to stress test Web applications effectively, before your customers do, is critical. This double-track session shows you the tools that support stress testing, including several that cost absolutely nothing. It also walks you through a variety of approaches to stress testing that are available during all phases of development. This journey allows you to develop a plan to automate your stress testing, as well as to know how and when to implement it as part of the software development process.
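As a rough illustration of a zero-cost approach (not a tool named in the session), a stress probe can be assembled from the Python standard library alone; the target URL, worker count, and request count below are made-up assumptions.

```python
# Hypothetical sketch: a no-cost stress probe built from the standard library.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8000/"   # assumed test target; replace with your own
WORKERS = 50                     # concurrent simulated users
REQUESTS_PER_WORKER = 20


def hammer(worker_id):
    """Issue a burst of requests and record each response time in seconds."""
    timings = []
    for _ in range(REQUESTS_PER_WORKER):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(URL, timeout=10) as resp:
                resp.read()
        except OSError:
            continue  # count only successful responses in this simple sketch
        timings.append(time.perf_counter() - start)
    return timings


if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        all_timings = [t for batch in pool.map(hammer, range(WORKERS)) for t in batch]
    if all_timings:
        print(f"{len(all_timings)} responses, "
              f"avg {sum(all_timings) / len(all_timings):.3f}s, "
              f"max {max(all_timings):.3f}s")
```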

Robert Sabourin, AmiBug.com Inc
Testing Component-Based Software

Today component engineering is gaining substantial interest in the software engineering community. Jerry Gao provides insight and observations on component testability and proposes a new model to represent and measure the maturity levels of a component testing process. In this presentation, you will identify, classify, and discuss new issues in testing component-based software.

Jerry Gao, San Jose State University
Modeling the Real World for Load Testing

Requesting your Web site's home page one hundred times per minute will not give you an accurate picture of how your Web site will actually perform in the real world. Explore the variables that you need to consider when designing a Web load or stress test, including user activities, graphics, security, user access speeds, and geographic locations.
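To make the contrast concrete, here is a small, hypothetical Python sketch of a weighted workload model that mixes user activities, access speeds, and think times instead of hitting one page in a flat loop; the paths, weights, and speeds are invented examples, not figures from the presentation.

```python
# Hypothetical sketch: a weighted workload model, as opposed to requesting
# the home page in a flat loop. Paths, weights, and speeds are made-up examples.
import random

# Relative frequency of user activities observed (or estimated) for the site.
ACTIVITY_MIX = [
    ("/",              0.40),  # browse home page
    ("/search?q=test", 0.30),  # search
    ("/product/42",    0.20),  # view a product page with images
    ("/checkout",      0.10),  # secured checkout over HTTPS in a real test
]

# Simplified access-speed profiles in bytes per second (dial-up vs. broadband).
ACCESS_SPEEDS = [(56_000 // 8, 0.3), (1_500_000 // 8, 0.7)]


def next_request():
    """Pick an activity, a connection speed, and a think time for one virtual user."""
    path = random.choices([p for p, _ in ACTIVITY_MIX],
                          weights=[w for _, w in ACTIVITY_MIX])[0]
    speed = random.choices([s for s, _ in ACCESS_SPEEDS],
                           weights=[w for _, w in ACCESS_SPEEDS])[0]
    think_time = random.uniform(2.0, 15.0)  # seconds a real user pauses between pages
    return path, speed, think_time


if __name__ == "__main__":
    for _ in range(5):
        print(next_request())
```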

Steve Splaine, Splaine & Associates
Managing User Acceptance Testing in Large Projects

Managing user acceptance testing poses many challenges, especially in large-scale projects. Julie Tarwater explores the issues of planning, coordinating, and executing effective user testing with a large number of end users. Learn strategies for ensuring user acceptance while exploring the pros and cons of each. Discover ways to prioritize issues that arise from user testing.

Julie Tarwater, T. Rowe Price Associates
Stress Testing Load on a Server

Everyone is talking about the need to "load test" their servers, whether Web servers or proprietary server software, but how do the people who must load test these servers actually perform the testing? Learn how different companies load test their servers by simulating thousands of connections using a combination of publicly available tools and proprietary utilities.
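As a simple illustration of the idea (not any specific company's utility), many simultaneous connections can be simulated with asyncio from the Python standard library; the host, port, and connection count below are assumptions.

```python
# Hypothetical sketch: holding many simultaneous connections open against a
# server using asyncio. Host, port, and connection count are assumptions.
import asyncio

HOST, PORT = "localhost", 8000
CONNECTIONS = 1000          # scale up toward thousands as the server allows


async def one_connection(idx):
    """Open a raw TCP connection, send a minimal HTTP request, read the reply."""
    try:
        reader, writer = await asyncio.open_connection(HOST, PORT)
        writer.write(b"GET / HTTP/1.0\r\nHost: %b\r\n\r\n" % (HOST.encode(),))
        await writer.drain()
        await reader.read(1024)          # just the start of the response
        writer.close()
        await writer.wait_closed()
        return True
    except OSError:
        return False


async def main():
    results = await asyncio.gather(*(one_connection(i) for i in range(CONNECTIONS)))
    print(f"{sum(results)} of {CONNECTIONS} connections succeeded")


if __name__ == "__main__":
    asyncio.run(main())
```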

Elisabeth Hendrickson, Aveo Inc.
