A Game Plan for Rapid Test Planning

Summary:
Rapid test planning is a process of cultivating test ideas within minutes, rather than days. It contrasts with rigorous test planning, which may strive to document every possible test. The mission is to get testing on the road now, finding the most critical things to test and getting important information to stakeholders quickly. In this article, Jon Bach explains how easy it can be to tackle a rapid test plan once you've got a strategy in mind.

Management almost always wants test results quickly, even when we haven't seen the product yet. When faced with a time crunch, we can freeze, get angry, or see what we can do.

I've experienced paralysis and anger many times in my twelve years of testing, but the longer I stay in this business, the more I find myself "seeing what I can do." These days, I consider it a challenge when someone says "you have three days to test."

But I have to be careful of rushing right to the keyboard. There are all kinds of assumptions that I'll need to check; otherwise, I'll make a lot of progress in the wrong direction. Enter rapid test planning. For me, it involves a few exploratory testing "sessions" that result in lists of test ideas, issues (or questions), and risks. Exploratory testing is test design and execution happening at the same time, sometimes in the same second, with an emphasis on learning about the product and the project. An exploratory session could last thirty minutes to three hours. But the mission of the first session is usually this: What can I test right now?

In my work for a Seattle software testing lab, clients usually don't give us a lot of testing time (because they often don't have a lot of it to give), so our promise to them is to focus more on delivering information than creating written test cases that we plan to run later. That's why I often use rapid test planning to frame exploratory testing.

Think of an American-style football game. You have four quarters, or roughly one hour of clock time, to beat the other team by designing plays that advance the ball down one hundred yards of turf while the other team tries to stop you. Plays can involve throwing or running the ball until you reach the end of the field for a touchdown. When the other team has the ball, it's your turn to stop them from scoring. Each team creates plans for offense and defense, but when the game starts, those plans might change based on how the plays are going. That's exploratory testing in action.

But let's say my football team is the thing we're testing. In other words, how do I know if my team is any good? If I'm a coach being interviewed, I may be put on the spot to say something valuable. A sports reporter might ask a question designed to catch me off guard and get a good sound bite for his story:

"If you were playing the Seahawks today, what would be your game plan?"

Wanting to provide a great quote, but thinking quickly, I might answer in terms of the following five elements:

  • Structure: The team's composition
  • Function: The team's capabilities
  • Data: What the team might "process" (i.e., what we do with the ball)
  • Platform: What the team depends upon
  • Operations: How I'll use my team's skills

Using a mnemonic like Structure/Function/Data/Platform/Operations (SFDPO) is considered a heuristic: a fallible method for solving a problem. It's not meant to be exhaustive or foolproof, just enough to start you quickly on the road to a useful solution.

So if, an hour before a spontaneous project meeting, a stakeholder asks you to be prepared to answer questions about your plan to find bugs, you might take the same approach:

  • Structure: What is the software project composed of? (Code, files, specs, wireframes, user docs, media, etc.)
  • Function: What is it that the software can do? What are its features and subfeatures?
  • Data: What kinds of data can the software process and produce? (Inputs, outputs, file formats, settings, etc.)
  • Platform: What does the software depend upon? (Operating systems, browsers, hardware, third-party components)
  • Operations: How will the software be used? (Who the users are, what environments they work in, and what their common and extreme scenarios look like)
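If it helps to keep these prompts at hand, the five SFDPO questions can be captured as a simple checklist script. The sketch below is purely illustrative and not from the article; the prompts are paraphrased from the mnemonic, and the function name and wording are invented for the example:

```python
# Sketch: the SFDPO mnemonic as a quick checklist for kicking off a
# rapid test-planning session. Prompts paraphrase the mnemonic's five
# elements; everything else here is an invented illustration.

SFDPO = {
    "Structure": "What is the software composed of? (code, files, specs, docs)",
    "Function": "What can the software do? (features and subfeatures)",
    "Data": "What data does it process and produce? (inputs, outputs, formats)",
    "Platform": "What does it depend upon? (OS, browsers, third-party parts)",
    "Operations": "How will it be used? (user profiles, environments, scenarios)",
}

def rapid_plan_outline(product: str) -> str:
    """Render a one-page outline of test-idea prompts for a product."""
    lines = [f"Rapid test plan: {product}"]
    for element, prompt in SFDPO.items():
        lines.append(f"- {element}: {prompt}")
    return "\n".join(lines)

print(rapid_plan_outline("example app"))
```

Running this before a session simply prints the five questions under the product's name, which is often enough of a skeleton to start generating test ideas against each element.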

About the author

Jon Bach

Currently a managing consultant for Seattle-based test lab Quardev, Inc., Jon Bach has been in testing for fourteen years, twelve as a manager. His experience includes managing teams at Microsoft, HP, and LexisNexis. The co-inventor (with his brother James) of Session-Based Test Management, Jon frequently speaks about test management and exploratory testing. Jon is coauthor of Microsoft's Patterns and Practices book on acceptance testing (freely available online) and has written articles for testing magazines. Find him on Facebook or Twitter, or read his presentations, articles, and blog at jonbox.wordpress.com.