It is better to know some of the questions than all of the answers. – James Thurber
Recently, Lee Copeland and I completed a test process assessment for a major financial services company. Our assignment was to understand this organization’s current testing process and make recommendations for improvements.
The foundation of any successful assessment is interviewing a diverse cross section of the staff. We interviewed team members who were involved in current software testing projects. We spoke with people from different project types including new development, product updates, and emergency releases. We interviewed people working with different technologies such as mainframe, client-server, Web, and voice technologies, and we met with team members from different office locations. Listening to this varied group helped us build a picture of this organization's software testing process.
Although Lee and I have different professional experiences, our elicitation techniques are quite similar. We interviewed team members, carefully listening to their stories, and took detailed notes. Through their responses, we were able to piece together a clear and realistic picture of our client's testing process. As we reviewed our notes to analyze the situation, it occurred to us that we could also analyze our questions to learn more about how we perform interviews and how we could improve our own elicitation processes.
Our Elicitation Process
While we had a formal list of topics to cover, most of our questioning was exploratory. We wanted to learn about the client's testing realities, so we captured information without bias or interpretation. As we learned, we tried to uncover more of the narrative, asking further questions based on our findings: open-ended questions led to rich stories; focusing questions closed gaps by capturing important missing details; and closing questions helped wrap up the interview, surfacing additional stories and any important omitted details.
We asked interviewees to share stories about recent critical incidents. We sought examples of excellence or demonstrations of failings: "Tell a story about a project that worked really well"; "Tell us a story about a project that failed miserably." We also asked for stories of typical project experiences. These stories helped us identify several objects of testing, including plans, documents, reports, test cases, and other artifacts.
Once we understood the objects of testing, we could relate them to the actions of testing. "What do you do with these objects?" "How do you create them?" "How are they processed?" Often the nouns of the narrative told us what the objects of the testing project were, and the verbs told us what actions were done to these objects. We inquired about the sequence of actions and the evolution of objects, so we were able to compare and contrast process flow between similar stories from different sources. Differences led to rich comparative questions, which helped us ask test leads and managers "why?"
After analyzing the client's current practices and writing test process improvement recommendations, we returned to our interview notes and undertook a thorough analysis of our questioning styles. We made a list of the questions we had asked, organized them into categories, and sought to discover commonalities and flows. The sections that follow describe the types of questions we found effective.
Setting the Stage
Interviewees need comfort; the arrival of outside consultants can seem threatening. Interviewees may not be sure why they have been "summoned to appear," and they may worry that the things they say will be used against them at some future time.
We began by putting the interviewees at ease by setting the stage for our discussion. We made sure they knew why we were there. We explained our commission from their management, what approach we were using, and what we were trying to learn from the information we collected. We also explained why they were there—that they had been selected because of their in-depth knowledge of the organization's processes.
During the interview, we took notes as the interviewees related their stories. At the beginning of the interview we mentioned that we were going to take notes so that we could capture and remember the details of their stories. We also stated that they could and should speak freely about their organization—that their names would not be attached to any negative comments they made. It is absolutely vital that this promise be strictly honored.
Our handwritten notes were of two types: notes directly related to the test process improvement model we were using, and free-form notes recording anything interesting we learned. Rob also used mind maps to create a visual outline of their stories. Mind maps allowed us to build a visual understanding of the testing process described by each interviewee and helped us detect missing actions and objects, as well as inconsistencies and asymmetries in each story.
We often began the interview with these questions: "Do you know who we are and why we are here?" and "Do you know why you are here?"
Interviewees would tell us what they thought we were doing, then we discussed and clarified our purpose before diving into the interview. We wanted to show respect to the interviewees, so before starting in with questions, we thanked them for taking time out of their chaotic schedules to meet with us.
"I didn't expect a kind of Spanish Inquisition" is a classic line from a Monty Python sketch. We didn't want interviewees to feel they had been dropped into the middle of an inquisition, so, in order to build rapport, we often began by asking questions about the local community—what to see in town, interesting cultural activities, sporting events, entertainment, and, of course, fishing. We asked where we could find good catfish dinners and Indian cuisine. We also asked where test consultants could do their laundry—always a great ice breaker. These questions were good natured and never scripted; we sincerely wanted to learn. Note that it is generally not a good idea to ask personal questions of interviewees. While acceptable to some, others might be offended, and starting an interview this way is a sure path to disaster.
We encouraged interviewees to share specific, recent experiences to help us learn what they do. There was always a risk that interviewees would start quoting process manuals even though the documents may rarely be followed. Our goal was to learn what really happens. Our questions were designed to elicit a narrative. Many testers have stories of heroic initiatives or terrifying failures, and relating such tales can help expose important evidence.
We let the interviewee tell his story with minimal interruption, but occasionally redirection was needed to get back on topic. We were interested in the facts and avoided jumping to conclusions. Interviewers must avoid interrupting the story with suggestions or recommendations; patience is in order. Once the narrative was told, we used non-judgmental, clarifying questions to fill gaps or explore new ideas.
We used these tactics to learn about interviewees' experiences:
Some interviewees omitted relevant stories or experiences. A few questions were useful for learning more and often led to new or related stories:
To find out what really happens on projects, we avoided asking questions that required interviewees to judge their peers or that would place blame on individuals or teams. Instead, we focused on behaviors and deliverables rather than on the individuals.
Interviewees occasionally need guidance to help them advance their narratives. Developers and testers can get absorbed in describing a single activity, losing sight of where it fits into their stories. Some questions helped interviewees become better storytellers; they learned to question themselves as they spoke. As interviewers, we needed to prime the pump a few times, but once the juices were flowing, a rich narrative often followed. Some questions that helped move the conversation along include:
Filling gaps with an occasional clarifying question can help build a better understanding of objects, actions, roles, responsibilities, and the relationships between them. Mind mapping the interview can really help find holes in the story. Clarifying questions can also remove ambiguity. Some clarifying questions we used are:
We used these tactics when the interviewee pivoted to a new topic or prematurely concluded the story:
When we were concerned that an interviewee was overgeneralizing, a clarifying question helped us understand scope. For example:
Recapping and Reflecting
It is vital that we understand key concepts from the interviews, so we ask interviewees to validate our interpretation by restating the story from our notes. This recap often leads to additional clarification.
We sometimes retell the story in a chronological order to make sure all events are understood in sequence. We ask interviewees to identify missing activities or gaps in time:
Interviewees sometimes stray to a part of the story that contains a lot of details (some even interesting) but does not advance the narrative. We bring the interviewee back to the main thread of the discussion with a gentle nudge: "We were talking about _____."
Maintaining a Connection
Keeping the interviewee engaged is all about making a genuine connection, and that requires the interviewer to be sincerely interested in the interviewee's story.
Statements like the following can help the interviewee feel more comfortable:
We used two great questions to help close the interview. In doing so, we discovered a number of important pain points. The questions are:
At the end of each interview, we always thanked the interviewee for sharing his time, knowledge, and story. We expressed our sincere appreciation and explained that his comments had helped us put a few more of the organizational puzzle pieces together.
Between each interview we took some time to review what we’d heard, document our understanding, and plan for the next interview. We always used the results of previous interviews to develop questions for future interviews. We tried to identify gaps in what we heard and sought to fill in those gaps in subsequent discussions. Our questioning style was an “exploratory” one in which we sought to understand what was being shared, and what we had not yet heard, seeking to add pieces to the puzzle and helping us develop a better mental model of the current situation.
Questions Make a Difference!
Perhaps the most famous question in literature is from Shakespeare's Hamlet:
To be, or not to be: that is the question:
Whether 'tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles,
And by opposing end them?
Lee and I may have been graced with our fair share of inquisitiveness, tact, and patience. But I doubt we have the eloquence of Shakespeare. Asking the right questions can elicit valuable information and focus us on fundamental truths and core values that really make a difference. Asking the questions right makes the interviewee comfortable in sharing his real and relevant experiences. Strategic questioning exposes rich stories whose narratives reveal what is really happening. People and their experiences help us learn how processes can be improved. Quality questioning involves concurrent exploration, learning, and adaptation. In your future work, don't just focus on the answers, also focus on the questions.