Stop the Wishful Thinking: Software Estimation Reality Check

Summary:

Daryl Kulak tackles the most common beliefs in software development regarding estimating and offers methods to help developers deal with the demands of businesspeople.

"How long's it gonna take?" We have to be able to answer that question accurately, right? Not necessarily. In this article, we'll examine the long-held beliefs in software development regarding estimating.

We Are Such Bad Estimators!
Show me a development shop that estimates its work confidently and accurately and I’ll show you a...what? Three-headed lizard that plays gin rummy and has a .350 batting average. We all hang our heads in shame at our poor estimating. Our business executives just want to know how long it will take and how much it will cost. Why can’t we tell them?

In a previous article, I posited that we need to ask different questions in the agile world. In the waterfall world, we try to answer "When will it be done?" but with agile we change the question to "How much can you do by this date?" That's fine if it's you, the agile team member, asking the question. But what about when your business stakeholders are asking the questions (as they always are)? The trick is all in the level of precision we strive for.

With Estimates, More Precision Is Not Helpful
As engineers, we tend to think that increased precision must be good. But with spongy things like estimates, more precision is not helpful, and it can actually be troublesome. Here's how: When you say that a given product will cost "around $500,000," you are implying that you either haven't put much thought into it or you have uncertainties that keep you from being able to pin it down. But when businesspeople hear you estimate numbers like "$513,000," they assume that you've got it all covered and you are signing in blood that you can get it done for that amount. In fact, they assume that even if you say, "It's $513,000, but I'm not signing in blood that I can get it done, OK?" The power of that precise number outweighs any disclaimer you could possibly state.

This is where the agile tool called "storypoints" comes into the picture. Storypoints are meant to be a more vague unit of measure than hours or dollars. If a development team is equating some number of hours to a storypoint, it is missing the point (pun intended). Saying "Thirty hours equals one point" is ridiculous. If you're doing that, just use hours, for goodness' sake. You haven't gained anything from storypoints except the ability to claim you're doing "agile." The power of the storypoint in estimating user stories is that it is vague. Keep that power.

Let's play out a scene, first using hours, then storypoints.

Dev One: I think this card will take six hours.
Dev Two: Really? Six hours? You’re slow. I could do it in three hours, easily.
Manager: Are you remembering to count system testing in that time?
Dev One: Well, maybe I could get it done faster. And no, that doesn't include test.
Manager: Hey Tester, how long for you to test this story?
Tester: I'd say two hours, unless Dev Two does it, then testing always takes longer.

This takes forever as people argue over the specifics, the task's definition, the scope, and so on. This type of discussion is extremely wasteful.

But, we do need an estimate, so what if we use a storypoint scale instead? Let's say, one point equals a simple story, two points equals medium, and three points equals complex. The points are only relative to each other, that is, a two-point story should be more complex than a one-pointer and less complex than a three-pointer.

Now let's listen in again to our team room:

Dev One: I say one point.
Dev Two: I say two points.
Dev One: Why so high?
Dev Two: Because we gave two points to story number two, and this one is more complex.
Dev One: Yeah, you're right. Two points.

Using a relative storypoint scale cuts the disagreements and clarifications down by probably 90 percent. That's less waste.

Another technique, shown to me by Tim Wingfield, is to use a long table and place the user stories on index cards face up, one beside the other. As you pull each card off the pile, slide it in wherever it fits, so the cards end up in complexity order. Then you simply say the bottom third are "one-pointers," the middle third are "two-pointers," and the top third are "three-pointers."
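If you'd like to see the mechanics of that split, here is a minimal sketch in Python. The story names and the assign_points helper are hypothetical, purely for illustration; the input is assumed to already be in complexity order, the way the cards end up on the table.

    def assign_points(ordered_stories):
        """Split an ordered backlog into thirds: 1, 2, or 3 points.

        ordered_stories: story names, simplest first (as laid out on the table).
        """
        n = len(ordered_stories)
        third = max(1, n // 3)
        points = {}
        for i, story in enumerate(ordered_stories):
            if i < third:
                points[story] = 1      # bottom third: one-pointers
            elif i < 2 * third:
                points[story] = 2      # middle third: two-pointers
            else:
                points[story] = 3      # top third: three-pointers
        return points

    print(assign_points(["login", "search", "checkout"]))
    # {'login': 1, 'search': 2, 'checkout': 3}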

So, using storypoints, we can cut the fat out of our estimation process, and we can also give a more vague estimate to our product owner instead of using precision to convey a level of confidence we do not have.

Storypoints can give an amazingly accurate picture of the forthcoming backlog effort, as long as you are gathering enough data points.

Short Iterations Provide Lots of Data Points
Which brings me to why I love short iterations. By short, I mean one week max. As a team is burning down storypoints, you will see lots of variation in the points burned from one iteration to the next (they're vague, after all!), but eventually you'll get an average iteration burn rate that is meaningful. If you are using two-week iterations, it will take twice as many weeks to get the same number of data points as if you were using one-week iterations. More data points mean you can trust your averages sooner.
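To make that concrete, here's a small Python sketch; the weekly velocity numbers are invented for illustration. It shows how the running average bounces around early on and settles down as data points accumulate.

    # Hypothetical storypoints burned per one-week iteration.
    weekly_velocities = [8, 14, 9, 12, 7, 13, 10, 11]

    def running_average(velocities):
        """Yield the average velocity after each iteration."""
        total = 0
        for count, points in enumerate(velocities, start=1):
            total += points
            yield total / count

    for week, avg in enumerate(running_average(weekly_velocities), start=1):
        print(f"After week {week}: average velocity {avg:.1f} points")

A team on two-week iterations would have only four data points after those same eight weeks, and a much shakier average.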

So, let's summarize thus far. Let's use storypoints to estimate storycards because points are nicely vague. Let's burn down points and base our future estimates (burndowns) on true history. The only meaningful track record for a team is what this team has accomplished recently, with this set of technology, in this business domain, with this product owner. Any other "estimates" are just wishful thinking.

I probably don't need to tell you this, but don't compare storypoint burndowns (velocity) between teams. In a previous article, I explained the dangers of this and other "mechanical agile" practices.

But the fact remains: businesspeople need estimates, and they often need them before we have begun iterating. What to do?

Shirt-Size Project Budgets
With an internal IT shop, the businesspeople come to us and say, "We've identified this potential project; tell us how much it will cost and then we can judge the ROI and prioritize it with the other potential projects." Software product companies deal with a similar statement: "Tell us how much it will cost to build this new product."

So, here we are in a quandary. I've already stated that the only way to estimate something is to base it on the actual history of "this" team working on "this" product. In this case, we do not have that. Yes, maybe we have a team that has worked together, but that was on a different product that didn't have a mobile component, or where we used an older version of Hibernate, so the estimate is not transferable. (You think it is, but it's not.)

What shall we do? Again, we must be vague. Should we give the stakeholders an estimate in storypoints? No. Storypoints are not meaningful to business people outside the team room. They want hours and dollars.

What if we decided that every project had to be either a small, medium, or large project and then attached real dollar figures to each "shirt size"? For instance, a small project might always cost $100,000, a medium one $250,000, and a large one $750,000. The specific dollar figures are not important, just the concept of shirt sizing.

What does this get us? Well, it certainly shortens the estimation cycle. With a brief look at the business problem and a cursory thought toward the architecture, we could throw each project into one of these "boxes" quite easily. But would we be right?

It doesn’t matter if we're right. This goes back to the "different question" in agile. Once we have a shirt size for our new project—small, medium, large—then we can ask the agile question, "How much can you do for this amount of money?" Then, it becomes a process of fitting "enough" requirements into the project given the budget’s constraint. Use agile's iterative, incremental nature to finish exactly on time and exactly within the budget, trimming the tail (thank you, Alistair Cockburn) of the requirements (storycards) to suit the budget.
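Here's a minimal Python sketch of that fitting exercise. The shirt-size budgets, the per-iteration cost, the assumed velocity, and the backlog are all hypothetical; the point is the shape of the decision, not the numbers.

    # Hypothetical shirt-size budgets and a hypothetical fully loaded
    # team cost per one-week iteration.
    SHIRT_SIZES = {"small": 100_000, "medium": 250_000, "large": 750_000}
    COST_PER_ITERATION = 25_000
    AVERAGE_VELOCITY = 2  # points per iteration, from this team's own history

    def trim_the_tail(shirt_size, backlog):
        """Keep the highest-priority stories that fit the budget.

        backlog: (story, points) pairs ordered by priority, highest first.
        Returns the stories that fit and the number of iterations funded.
        """
        iterations = SHIRT_SIZES[shirt_size] // COST_PER_ITERATION
        capacity = iterations * AVERAGE_VELOCITY
        kept, used = [], 0
        for story, points in backlog:
            if used + points > capacity:
                break  # trim the tail: everything from here down waits
            kept.append(story)
            used += points
        return kept, iterations

    backlog = [("login", 2), ("search", 3), ("checkout", 3), ("reports", 3)]
    print(trim_the_tail("small", backlog))
    # (['login', 'search', 'checkout'], 4)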

But there are some products that require more work in order to achieve even the minimal marketable features, right? So, put those projects into the "large" shirt size. If we can't even accomplish the minimal marketable features for $750,000, then something is really wrong. With agile, we want to deliver software into production frequently. If you cannot slice off three-quarters of a million dollars worth of features, you need to take another look at the project, because you're seeing it wrong.

So, we now look at the potential projects the business brings us, do minimal investigation, quickly shirt-size each one, and give the businesspeople their estimate. Once the project hits a team room, the team knows what shirt size this is and how many iterations that will translate into (it is good if your shirt sizes neatly fit iteration lengths), and team members can ask themselves, "What can we do within this budget?"
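To illustrate that translation with hypothetical numbers: if the team's fully loaded cost were $25,000 per one-week iteration, a $100,000 small project would fund exactly four iterations, a $250,000 medium one ten, and a $750,000 large one thirty, so each shirt size maps cleanly onto an iteration count.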

If the shirt size turns out to be too small, the team (with the product owner's guidance) still tries to put the app into production as is. If that is not possible, then all the usual budget increase processes need to come into play. But what if the shirt size was too big?

Here is where "product owner impatience" becomes a very helpful thing. Most product owners I've met are anxious to get the app into production. In fact, a very strange thing has happened on my projects since switching to agile a decade ago. My waterfall product owners always increased scope, and my agile product owners always decrease scope. For some reason, with agile they are much more focused on hitting that date and beating that budget. I think it has to do with the cadence of the weekly demos, where the product owners get more excited as they see more and more of the product coming together. They become impatient to get the features they can tangibly see and interact with today, and grow less enamored with the features still sitting in the backlog. This impatience keeps the team from running longer than necessary. Again, weekly demos will get twice the excitement value of every-two-week demos.

Let impatience grow and let urgency flourish; that's my thesis. Well, actually, it's much more than a thesis, as I've used these concepts multiple times on the projects I've worked on. My teams have benefited from them; I hope yours do too.


User Comments

Chris Riesbeck:

I've seen exactly the same thing about product scope in my agile courses at Northwestern. I teach an undergrad course on agile for developers in parallel with a 5-week masters course on agile for clients. I insist on weekly delivery, a slice up and running from nothing in the very first week, and user testing from week two onward. I've seen exactly the same result: with working slices in hand, and some real numbers on velocity, the clients de-scope quickly down to their core value prop with very little push on my part.

July 18, 2013 - 10:55am
