How Did I Miss That Bug? Managing Cognitive Bias in Testing: An Interview with Gerie Owen and Peter Varhol

[interview]

CP: OK. All right. You talked about rotating around. This leads into my question for Peter here. Peter, you've had various roles over your career, including technology journalist. Do you think the same tools for managing biases in testing can be used by people in different roles, regardless of their relation to the software industry or testing in general?

PV: The answer to that question is absolutely. We're all biased in how we do our jobs, how we set up strategies, how we collect data, how we evaluate that data, and how we make decisions. Whether you're a technology journalist, a product owner, a tester, or a developer, the important thing is to recognize that whatever stage we're at in our careers, and whatever we happen to be doing, our thoughts and decisions are influenced by those biases. The best thing we can do is not try to eliminate them; we know we can't. The best we can do in overcoming them, and compensating for them, is to recognize that they exist.

CP: Now, Gerie, I have a question for you. You specialize in monitoring and developing offshore test teams. Do you find that there is a different set of biases among offshore testers?

GO: Well, with cognitive biases, we're all human beings, and our brains are all structured the same way, so in that respect I think it's really the same. There are cultural differences, of course, in how our offshore teams approach testing and approach asking questions. Sometimes I think the biases perpetuate more because offshore teams tend to stick closely to the requirements and don't get into as much exploratory testing. That probably also has to do with the structure of some of our contracts with the offshore vendors.

CP: OK. When there's that combination of offshore and onsite testing, are the biases deterred, or are they magnified?

GO: The more eyes you get on a project, the more that helps to deter them, I would think. If you bring in a new offshore team alongside a team that's been working on the application onshore, your offshore testers may catch things like the curse of knowledge. The people who have been working on it for a long time have that curse of knowledge, whereas the offshore team brings a fresh pair of eyes. It could work the other way, too.

PV: That's one of our prescriptions for dealing with biases: let a fresh pair of eyes, somebody who has not been intimately involved in the project, look at your data, examine your conclusions, and criticize them.

CP: I don't want to give away too much of your presentation here. Could you give us another way of managing and overcoming those biases?

GO: That's interesting, because Peter and I have a philosophical difference here.

CP: Oh, OK. A little bit of controversy. All right, I like it.

GO: You know, the origin of bias (this is all in the presentation; it actually came out of Moneyball) is the conflict between System 1 and System 2 thinking. System 1 thinking is the intuitive, quick thinking. System 2 is the analytical.

When these conflict, you have biases. Now, the way I see it, testers need to be more intuitive. They need to do more exploratory testing and get into the emotional aspects: if something doesn't feel right about the application, do more exploring in that area. Peter is of the opposite opinion.

PV: First of all, I believe in alternating between the two so that neither one wears you out. Use intuitive thinking, but also use analytical thinking, and try to alternate between them; I'd describe that as the difference between exploratory testing and test automation. Automation is important from a repetitive standpoint, in that you would like to have repeatable results, while exploratory testing allows you to just say, "Well, what happens if I do this?" without necessarily including that in the script.

CP: Yeah. No, that makes plenty of sense. Are there certain biases you've found that have plagued your professional careers?

GO: I think those are the big three. The other one that tends to impact testers particularly, and probably everybody, is the planning fallacy: you tend to underestimate the amount of time a task is going to take. It's particularly critical for testers because our time is always getting crunched anyway. If we underestimate up front, we make it more difficult for ourselves.

About the author

Cameron Philipp-Edmonds

When not working on his theory of time travel, Cameron T. Philipp-Edmonds is writing for TechWell, StickyMinds, and AgileConnection. With a background in advertising and marketing, Cameron is partial to the ways that technology can enhance a company's brand equity. In his personal life, Cameron enjoys long walks on the beach, romantic dinners by candlelight, and playing practical jokes on his coworkers.