Dion Johnson uses a martial arts metaphor to describe four common issues with automated tests and how test automation specialists can "train" their scripts to identify, capture, and handle these problems. In this week's column, Dion talks about how to develop test automation scripts that handle dynamic paths within an application, which he calls "strikes."
This article covers a code review process for a typical environment where nobody has time and everybody wants to protect their turf.
As business activities become more mission-critical, it is increasingly important to adopt testing methods that support business objectives. In response, testers are developing new testing methodologies and road maps to meet and exceed these demands. Automation is one dimension that helps us move up the value-add chain with our customers. The authors of this paper identify potential areas of automation and define a framework for efficient test automation.
Performance tuning is often a frustrating process, especially when you remove one bottleneck after another with little performance improvement. Danny Faught and Rex Black describe the reasons why this happens and how to avoid getting into that situation. They also discuss why you can't work on performance without also dealing with reliability and robustness.
The Done State practice is a definition that a team agrees upon to unambiguously describe what must take place for a requirement to be considered complete. The done state is the goal of every requirement in an iteration. It is as close to deploying software as a team can come.
Rapid prototyping and development techniques combined with Agile development methodologies are pushing the envelope on the best practice of testing early and testing often. Keeping pace with quick development turnaround and shorter time to market, while remaining adaptive to late changes in requirements, requires effective management of the quality process. The use of traceability of test artifacts (test cases, test defects, test fixtures) mapped to requirements (needs, features, use cases, and supplementary requirements) as a QA scheduling and planning tool, though mentioned in passing and claimed to have been practiced, has been largely overlooked by the industry. This paper examines that possibility through a case study of software built with iterative application development practices and tries to bring this aspect of the technique into focus as a QA management tool.
Easy Assessment Technique (EAT) provides insight for rating an application during its development as well as its testing phase. It is a simple technique that allows developers and testers to measure the quality of the product at various levels of development and testing, on a simple scale of 1 to 10. It highlights the areas in which the product or application needs attention, with a complete report. As a result, developers can rectify the faults and testers can focus with a better vision on their next phase of testing.
Goals and requirements drive the work schedules of all projects. Some of these are necessary to the success of the current project; others are not so critical. Yet sometimes we lose sight of this and spend many work hours trying to complete more than can be done within the timeframe of a project. In this week's column, Johanna Rothman reminds us to look critically at what we're working on to make sure we're still satisfying the goals once the requirements have been satisfied.
Technological debt is mistakenly thought of as a technical problem, but when system design cannot change according to the needs of the business, it becomes a business problem. Big Design Up-Front leads to technological debt. Architecture must be allowed to emerge according to the needs of the product and the business. We know iterative, emergent development works; iterative, emergent design is no different. Agile teams should use Retrospectives as a tool to determine current needs and enable emergent design.
Within the Agile community, retrospectives are widely seen as the mechanism for promoting learning and change. But many teams fail to hold retrospectives, or fail to act on the findings, and thus fail to learn and improve. If we are going to fix this, we need to change our approach to retrospectives and find new ways to learn and create change.