Alan Page has done his share of hands-on testing and team management in his years at Microsoft (he's also the co-author of How We Test Software at Microsoft). And, in that time, he has learned that what you know isn't nearly as important as what you can figure out. In this interview with TechWell editor Joey McAllister, Alan discusses the importance of honing your critical-thinking skills and offers some tips for doing so.
Joey McAllister: You recently wrote about the importance of thinking critically when developing software and noted that some young programmers and testers fall into a trap of being able to "recite a textbook" but being unable to think critically about a problem. Why do you think they fall into that trap?
Alan Page: There are probably a lot of reasons for this, but there’s one really important thing to point out first. I don’t think the lack of critical thinking is confined to testers and programmers. Perhaps it’s just most pronounced there, since critical thinking is so important in those roles.
I suppose one reason for this is too much emphasis in schools on acing multiple-choice tests and reciting facts rather than on problem analysis and discussion. Frankly, in these days when I can discover nearly any fact in a few seconds (depending on my Internet speed), facts (for me, at least) don’t carry nearly as much weight as critical thinking and analysis.
Learning also requires frequent discovery of new information and application of those new ideas. In other words, it’s not enough to just learn (no matter how much you know), and it’s not enough to just do (no matter how productive you are). Long-term success in any knowledge work field requires a balance between learning and doing.
Joey McAllister: What are some ways in which you’ve developed your own critical-thinking skills?
Alan Page: I’ve told this story about learning before, but I don’t think I’ve ever written it down. Once upon a time, I decided that I was serious about testing, so I bought and read a book on software testing. It had some great ideas and filled in several gaps I didn’t know I had. I spent weeks thinking about what I learned and figuring out how to apply it to what I was doing. For a short time, I felt like I had the knowledge of software testing firmly in my grasp.

Then, I read another book on testing. This book was also good. It filled in more gaps in my knowledge and made more concepts click, but it also contained ideas that directly conflicted with those from the first book. I struggled with this for a few more weeks and then read a third book on software testing. This one, as you’d expect, had more ideas and filled more gaps (and raised more conflicts), but it set off a big trigger for me. I began to form my own opinions about software testing. I reread each of the books and found ideas and approaches I liked and others that I thought were flat-out wrong.

For some reason, I wasn’t able to take my brain out of “consumption-only” mode and think critically about the ideas until the conflicting ideas grew to a loud-enough roar. Since then, I’ve read over a dozen books on software testing and at least a hundred on related topics (engineering, leadership, innovation, etc.).