Non-functional requirements present unique challenges for authors, reviewers, and testers. They often begin as vague concepts such as "The software must be easy to install" or "The software must be intuitive and respond quickly." As written, these requirements are not testable because they are subjective: definitions of "easy," "intuitive," and "quickly" are open to interpretation and depend on the experiences of the reader. To be testable, non-functional requirements must be quantifiable and measurable. John Terzakis discusses the subjectivity problems with non-functional requirements: weak words, ambiguity, and unbounded lists. To facilitate the development of quantifiable and testable non-functional requirements, he introduces a solution, Planguage, and its associated keywords.
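As an illustration, a vague requirement such as "The software must be easy to install" can be restated using Planguage keywords (Scale, Meter, Must, Plan, Wish). The specific figures below are invented for the example, not taken from the talk:

```
Tag:   Installability
Scale: Minutes from launching the installer to first successful use.
Meter: Timed installations by 10 first-time users on reference hardware.
Must:  15 minutes or less.
Plan:  5 minutes.
Wish:  2 minutes.
```

Because the Scale names a unit and the Meter names a measurement procedure, a tester can now decide objectively whether the requirement has been met.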
Would you tell your publisher to stop editing in the middle of your manuscript and publish your novel now? Of course not! Then, why would you tell your QA/test team to stop identifying problems with requirements documentation? All deliverables—and especially requirements—should undergo a rigorous assessment based on their quality attributes and measurable evaluation criteria. Mark Haynes describes quality models and attributes he has used to evaluate requirements documents. He shows how you can detect imprecision that will haunt you later and remove it with a set of formal criteria and informal heuristics. Discover how you can use quality attributes, even subjective ones, to conduct a quality dialogue within the development team. Mark shares examples of poorly written requirements for you to evaluate and try your hand at improving.
Ever been in a situation where everybody on the design and development team thinks they agree on the requirements, until you try to design that first screen or page? Wonder how exactly the same product requirements documentation can mean completely different things to different people? Ever heard users say of a new feature, “This is not what I expected or need”? To minimize the risks of these things happening again, join Scott Plewes and learn to create concept designs early in the project, during requirements elicitation. As a bonus, concept designs help to involve sales, marketing, technology, and other parts of the product team when you are eliciting requirements. Scott explains the risks as well as the benefits of concept designs and describes how to gather and share feedback on them to gain the most value from this approach.
Have you ever heard something like “This feature (system) doesn't look anything like what I expected and doesn't do what I want”, even though you were confident that you knew what your customers wanted? Jennifer Bonine presents a strategy for involving customers up front and throughout the requirements definition process. She discusses how you can engage customers and set their expectations for involvement in the project before it begins. Jennifer outlines how to identify customer groups to engage in a joint IT–Business software requirements process in which the customers and the technology team work as peers. She describes the skill sets required for developers participating in this process and explores the reality that not every developer will be able to or want to transition to this model.
Nonfunctional requirements are an essential part of a holistic understanding of system requirements, yet many teams struggle with them. Some neglect nonfunctional requirements during requirements analysis, considering them less important or merely supplemental. Others specify them incompletely, with untestable attributes. Analysts, developers, and business customers often struggle with when and how to define and document nonfunctional requirements. However, if these requirements are not defined with precision, you will probably not build the right product. Ellen Gottesdiener guides you through the different types of nonfunctional requirements: quality attributes, external interfaces, and design and implementation constraints. She surveys practical techniques to represent quality attributes and how to define their acceptance criteria.
Business rules and data needs are essential ingredients for a balanced set of requirements and are vital for successful product delivery. When analyzing requirements, some teams focus on business rules at the expense of data. Others dive deeply into modeling data, skimming over or skipping business rules. When it comes to delivery, if either data or rules are inadequately elaborated, you risk not delivering the right requirements and wasting precious time and money. Mary Gorman demonstrates how to leverage behavioral analysis models to discover and detail business rules and data in tandem. After identifying four types of business rules, she describes when to start modeling rules and data together, whom to involve, the skills you need, and the level of detail to explore.
While agile and traditional software development methodologies differ in many key practices, they share the mutual goal of accurately representing customer needs through requirements, whatever their form. If requirements are poorly understood, lack key details, or are interpreted differently by stakeholders, unnecessary defects will be introduced into the product. Defects lead to rework, which ultimately delays product delivery and decreases customer satisfaction. John Terzakis outlines proven practices, templates, and techniques you can use for writing requirements that more accurately capture customer needs. He discusses issues with “natural language,” shares ten attributes of well-written requirements, and explores ambiguity checklists. John presents practical templates and syntaxes for stories (aka scenarios), use cases, and traditional requirements documentation.
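For example, templates of this kind constrain natural language into a form that is easier to check for completeness and ambiguity. The two sketches below are common, widely used syntaxes, shown for illustration; they are not necessarily the exact templates John presents:

```
Story template:
  As a <role>, I want <capability> so that <benefit>.
  As a returning customer, I want to save my shipping address
  so that I can check out without retyping it.

Requirement template:
  The <system> shall <response> within <measurable bound> when <trigger>.
  The order service shall confirm payment within 2 seconds
  when a customer submits a valid credit card.
```

A reviewer can scan such requirements slot by slot: a missing role, benefit, or measurable bound is immediately visible, whereas in free-form prose it is easy to overlook.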
What do users really need? Do they really know what they need? Although developers and testers are expected to implement stories and requirements that add real value, users often describe wants rather than needs and ask for features rather than solutions. Rob Sabourin shares his experiences applying task analysis using the “critical incident method” to better understand user processes and determine needs and desired solutions. Rob does not ask “what the system should do for the user” but rather learns “what the user does with the system.” The critical incident task analysis method is a fast and systematic way to study user experiences and analyze business needs. From brilliant successes and dismal failures, we learn to identify and understand the user process.
Organizations that want to automate their testing generally go through a number of stages before they reach maturity. Whether you are about to begin your journey or are well under way, it is important to know where you are going and where you could go. In automating test execution, many organizations stop short of achieving the maximum benefits. This presentation looks at six levels of maturity in test automation and includes a self-assessment test to see where you are. It is important to have good objectives and realistic plans to achieve them; in test automation, however, objectives and plans often seem plausible at first yet turn out to be poorly expressed or unrealistic. This presentation covers typical problems and examples of unrealistic automation plans and objectives. Leave with advice to help you have a successful journey to test automation.
Use cases are hard to test because they do not have a standard format or style, and they lack coherent structure. This is due in part to the lack of a standard definition in UML, which defines the graphical part of use cases but not the textual part. In this session, Jim Heumann pinpoints the issues related to the testability of use cases and introduces a testable style for writing them, a process used extensively and successfully by IBM Rational Software. Based on this testable use case writing technique, you will learn how to create test cases from your use cases.
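As an illustration of the general idea (not IBM Rational's exact template), a use case written in a consistent, numbered style makes deriving test cases almost mechanical, because each flow maps directly to a test scenario:

```
Use case: Withdraw Cash
Precondition: Cardholder is authenticated.
Main flow:
  1. User selects "Withdraw".
  2. User enters an amount.
  3. System verifies the amount against the account balance.
  4. System dispenses cash and prints a receipt.
Alternate flow 3a: Amount exceeds balance.
  3a1. System displays an error and returns to step 2.

Derived test cases:
  TC1 (main flow):  amount within balance -> cash dispensed, receipt printed.
  TC2 (flow 3a):    amount exceeds balance -> error shown, amount re-prompted.
```

Each numbered step states one observable system action, so every main and alternate flow yields at least one test case with a clear expected result.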
How to make a use case more understandable and testable