Balancing Exploratory and Automated Testing in Agile: An Interview with Matt Attaway

[interview]

MA: It’s not your mind state, and it’s too easy to wander off the path. They’re like, “Okay, here are the five paths that work.” Well, if you already know the five paths that work, there’s not a whole lot I’m going to bring to it by looking for something, because my job is to find the stuff that doesn’t work.

JV: Sure.

MA: So I feel like when we were in the waterfall world, we tended to get more complete features. Development had been beating on this thing for three months and then “threw it over the wall,” as the saying goes. And as a tester you actually tended to have some downtime. There was always a little gap between the hand-off and finishing up the last round of code, so you had some downtime. You could explore problems, or try to.

I find testing is kind of a destructive activity; you’re constantly breaking someone else’s stuff, so it’s nice to have time to build something and collect yourself before the next round comes. In agile you don’t have that. So it’s been a lot of adaptation on our part to this new constant flow of work: figuring out how to find those moments to pause even as the work keeps coming and coming, and how to do quality work at every stage.

JV: Right. And you were saying how you added automated testing, so how did you go about balancing automated testing with the exploratory testing that you obviously are experienced with?

MA: So I came to the conclusion that we had a really small testing team for the size of the products we were testing, the number of products, and the release cycles, so we could never afford to do something that wasn’t useful once it was done. For me, the thing that wasn’t useful once it was done was the test plan. It takes a fair amount of time: okay, there’s still stuff coming in, let’s figure out what it is and work those situations into test cases. While you’re writing it all up, it’s meaningful for us.

It’s a good kind of exploration of the problem space and gets it all out, but generally no one looked at the document afterwards, and agile stories evolve a lot. One week you’d get a story and start writing your test cases, and the next week they’d go, “Oh, we got rid of that design. Oh, that’s right, you weren’t at the meeting, because we didn’t invite you.”

JV: It’s agile, it’s fast paced.

MA: Exactly. And that’s important. I like the adaptive nature of agile workflows: people are always rethinking and reworking things, showing them to customers, and getting that feedback, which is great. But coming from a waterfall test organization where that’s how we did things (the test plan: we got the code and we tested the code), it just wasn’t working. And we still needed that traceability. You need to be able to come back to a feature and say, “Yeah, this is all the stuff we ran against it,” so that when we’re doing an analysis to figure out the root cause we can say, “Oh yeah, we just totally skipped that test stage,” or “Oh, we skipped that platform.” So I ended up doing two things.

One of them was to change the way we wrote up our test plans. What we started doing was using exploratory testing to explore the product and find bugs the way we always had, and then we’d keep almost like a diary of the test sessions we ran. You might spend eight hours with a product and do fifty or sixty different attacks, and at the end of it you’d have this big list of the attacks you had run and the platform variations. Then we could just paste that right into the issue. I believe the issue tracker is the one point of communication that developers and testers share.
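The interview doesn’t specify how those session diaries were structured, so the following is a minimal sketch of the idea in Python; the SessionDiary class, its fields, and the sample attack descriptions are all invented for illustration, not taken from any actual tooling:

```python
# Hypothetical sketch: capturing an exploratory session diary and
# rendering it as plain text that can be pasted into an issue.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SessionDiary:
    tester: str
    session_date: date
    platforms: list[str]
    attacks: list[str] = field(default_factory=list)  # each attack run during the session

    def log(self, attack: str) -> None:
        self.attacks.append(attack)

    def to_comment(self) -> str:
        """Render the diary as plain text suitable for an issue comment."""
        lines = [
            f"Exploratory session, {self.session_date} ({self.tester})",
            f"Platforms: {', '.join(self.platforms)}",
            "Attacks run:",
        ]
        lines += [f"  - {a}" for a in self.attacks]
        return "\n".join(lines)

diary = SessionDiary("tester1", date(2014, 1, 15), ["Windows 8", "Ubuntu 12.04"])
diary.log("Malformed UTF-8 in input fields")
diary.log("Concurrent edits to the same record from two clients")
print(diary.to_comment())  # paste (or POST) this text into the issue tracker
```

The payoff of keeping the diary as plain text is exactly what Matt describes next: it can live in the one tool everyone already reads.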

Theoretically, you might have a wiki, Confluence or MediaWiki or something, where you store design docs and communicate about them. But there is no guarantee either party is going to go to that, or to pretty much any other tracking system. Your issue tracker, though, that’s where your testers are going to file the issues.

JV: Everyone is going to look at that issue tracker.

MA: Exactly. So I started putting as much information in there as possible. Most issue trackers are really easy to run reports against and all that jazz, so afterwards I built metrics showing how many test cases we have for an epic, how many test cases for a story, and how many of those are automated. It was really easy to just go to the issue tracker and say, “We don’t need a special field for this; just put it in as text.”
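The interview doesn’t name the reporting mechanism, but the metrics Matt describes are simple aggregations over issue records. Here is a minimal sketch under assumed conventions: each test-case issue carries a parent story ID and an automated flag, and the dicts stand in for records exported from whatever tracker is in use:

```python
# Minimal sketch of the kind of report described: test cases per story
# and the fraction that are automated. Field names are assumptions.
from collections import defaultdict

issues = [
    {"id": "T-101", "story": "STORY-7", "type": "test-case", "automated": True},
    {"id": "T-102", "story": "STORY-7", "type": "test-case", "automated": False},
    {"id": "T-103", "story": "STORY-9", "type": "test-case", "automated": True},
]

per_story = defaultdict(lambda: {"total": 0, "automated": 0})
for issue in issues:
    if issue["type"] != "test-case":
        continue  # skip bugs, tasks, and other issue types
    counts = per_story[issue["story"]]
    counts["total"] += 1
    counts["automated"] += issue["automated"]

for story, counts in sorted(per_story.items()):
    pct = 100 * counts["automated"] / counts["total"]
    print(f"{story}: {counts['total']} test cases, {pct:.0f}% automated")
```

Rolling story-level counts up to epics would be the same aggregation one level higher, which is why keeping everything in the tracker makes these reports nearly free.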

JV: Was this any particular issue tracker tool that you were using?

MA: We actually had one built in that we used, and we still use it now; everything is hooked into it. You can work from either side and they’re integrated, which is pretty cool.

About the author

Jonathan Vanian

Jonathan Vanian is an online editor who edits, writes, interviews, and helps turn the many cranks at StickyMinds, TechWell, AgileConnection, and CMCrossroads. He has worked for newspapers, websites, and a magazine, and is not as scared of the demise of the written word as others may appear to be. Software and high technology never cease to amaze him.
