won't mention any in particular. But there are estimating tools where you put in what I would consider somewhat risky, uncertain variables in the first place, and they crank through this elaborate equation, and out the tail end comes a work breakdown structure that says it will take you 42.36 hours to do acceptance testing. And I always look at those and think, "Oh, my gosh, what if I smoked? What if I went out for 15 minutes? I'd blow this whole 18-month project."
CAROL: And I think we sometimes do it to ourselves: when the numbers come out to two decimal places, we assume they must be right.
TOM: Well, a basic rule of engineering is that you shouldn't allow precision that is greater than accuracy. And yet, unfortunately, while people understand that in the abstract, they violate it all the time. For instance, the British project to automate the Bond Exchange in London was budgeted with an amount that went down to the penny. So, 638,477,912 pounds and 61 pence. Something like that.
CAROL: Oh, my gosh!
TOM: But when they were done, it had overrun by more than half. And they didn't actually finish. And not only that, but they couldn't say within a hundred million pounds what they had actually spent. So that is the kind of accuracy the project demonstrated, and the precision of the original costing was ludicrous compared to that demonstrated accuracy. In fact, if you really did declare your uncertainty consistent with the accuracy you had demonstrated in the past, it wouldn't be unreasonable to use safety factors on the order of a factor of 2 in either direction.
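Tom's factor-of-2 rule can be made concrete. A minimal sketch (the function name and the sample budget figure are illustrative, not from the conversation) of stating an estimate as a range consistent with demonstrated accuracy, rather than as a falsely precise point value:

```python
def estimate_range(point_estimate, accuracy_factor=2.0):
    """Return (low, high) bounds for an estimate whose demonstrated
    past accuracy justifies only a multiplicative uncertainty factor.
    With the default factor of 2, the true value is declared to lie
    anywhere between half and double the point estimate."""
    return point_estimate / accuracy_factor, point_estimate * accuracy_factor

# A roughly 640-million budget, declared with factor-of-2 uncertainty
low, high = estimate_range(640_000_000)
print(f"Budget: somewhere between {low:,.0f} and {high:,.0f}")
```

The point is not the arithmetic but the discipline: the declared range, not the point value, is what matches the accuracy the organization has actually demonstrated.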
CAROL: What would you say the tools of risk management are? What kinds of things can we rely on? What can we look at?
TOM: Well, the basic, the most important tool is a census: a simple list of the outstanding risks, which doesn't have just one or two things but most typically has 20 or 30. If you've just got one or two things, you know, "we might be late" or "we might overrun the budget," you're not getting at the underlying risks, the causal risks. You're getting at the resultant consequences of those risks, but not at the things that cause them to be risky. So a risk list that has 20 to 30 things, including the core risks common to all software projects, each one of them assessed to some extent, plus all of the risks that are unique to your project. Just a list. That would be number 1.

Number 2 would be risk diagrams for each one of those risks. There ought to be an explicit statement of how uncertain you are about that risk and its causal factors, showing what the most optimistic and the least optimistic situations would be. So, an explicit declaration of uncertainty.

Beyond that, there would be risk brainstorms, or risk identification sessions: a basic brainstorming exercise that you go through on a fairly regular basis to give people a chance to articulate risks they hadn't thought of before, so that there is an ongoing process.
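The census and the per-risk uncertainty declaration Tom describes can be sketched as a small data structure. This is a hypothetical illustration, not a tool from the conversation: the field names, example risks, and impact figures are all invented for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One entry in the risk census, with an explicit uncertainty
    range (Tom's 'risk diagram' in miniature) rather than one number."""
    name: str
    optimistic_weeks: float   # best-case schedule impact
    likely_weeks: float       # most likely schedule impact
    pessimistic_weeks: float  # worst-case schedule impact

# Illustrative entries; a real census would typically hold 20 to 30.
census = [
    Risk("Requirements churn", 0, 4, 16),
    Risk("Key staff turnover", 0, 2, 10),
    Risk("Third-party component slips", 1, 3, 12),
]

def total_exposure(risks):
    """Sum the optimistic and pessimistic bounds across the census,
    making the project-level spread explicit instead of hiding it."""
    best = sum(r.optimistic_weeks for r in risks)
    worst = sum(r.pessimistic_weeks for r in risks)
    return best, worst

best, worst = total_exposure(census)
print(f"Schedule exposure: {best} to {worst} weeks")
```

Keeping the three bounds per risk, instead of collapsing each to a single expected value, is what preserves the "explicit declaration of uncertainty" when the list is rolled up.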
CAROL: What are the...I know people are sitting out there saying, "Okay, now we've heard that there are risks, we've heard that there are common risks." And they're waiting for me to ask you the pertinent question, which is, "What are those