Management Myth #1: The Myth of 100% Utilization

This article also appeared in the May/June 2012 issue of Better Software magazine.

A manager took me aside at a recent engagement. “You know, Johanna, there’s something I just don’t understand about this agile thing. It sure doesn’t look like everyone is being used at 100 percent.”

“And what if they aren’t being used at 100 percent? Is that a problem for you?”

“Heck, yes. I’m paying their salaries! I want to know I’m getting their full value for what I’m paying them!”

“What if I told you you were probably getting more value than what you were paying, maybe one and a half to twice as much? Would you be happy with that?”

The manager calmed down, then turned to me and said, “How do you know?”

I smiled, and said, “That’s a different conversation.”

Too many managers believe in the myth of 100 percent utilization. That’s the belief that every single technical person must be fully utilized every single minute of every single day. The problem with this myth is that there is no time for innovation, no time for serendipitous thinking, no time for exploration.

And, worse, there’s gridlock. With 100 percent utilization, the very people you need on one project are already partially committed on another project. You can’t get together for a meeting. You can’t have a phone call. You can’t even respond to email in a reasonable time. Why? Because you’re late responding to the other interrupts.
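The gridlock isn't just an impression; it's what queueing theory predicts. As a rough illustration (not from the original article), in the simple M/M/1 queueing model the average wait grows as utilization / (1 − utilization), so delays explode as a person approaches 100 percent busy:

```python
def mm1_wait_factor(utilization):
    """Average queueing delay as a multiple of the task's own service time,
    for a single server with random arrivals (the M/M/1 model)."""
    if not 0 <= utilization < 1:
        raise ValueError("utilization must be in [0, 1)")
    return utilization / (1 - utilization)

# How long new work waits, relative to how long it takes to actually do:
for busy in (0.50, 0.75, 0.90, 0.99):
    print(f"{busy:.0%} utilized -> waits {mm1_wait_factor(busy):.0f}x its own duration")
```

At 50 percent utilization, a new request waits about as long as it takes to do; at 99 percent, it waits roughly a hundred times as long. That is the late-responding-to-everything gridlock described above.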

How Did We Get Here?
Back in the early days of computing, machines were orders of magnitude more expensive than programmers. In the 1970s, when I started working as a developer, companies could pay highly experienced programmers about $50,000 per year. You could pay those of us just out of school less than $15,000 per year, and we thought we were making huge sums of money. In contrast, companies either rented machines for many multiples of tens of thousands of dollars per year or bought them for millions. You can see that the scales of salaries to machine cost are not even close to equivalent.

[Figure 1]

When computers were that expensive, we utilized every second of machine time. We signed up for computer time. We desk-checked our work. We held design reviews and code reviews. We received minutes of computer time—yes, our jobs were often restricted to a minute of CPU time. If you wanted more time, you signed up for after-hours time, such as 2 a.m. to 4 a.m.

Realize that computer time was not the only expensive part of computing. Memory was expensive, too. Back in those days, we had 256 bytes of memory and programmed in assembly language. We had one page of code. If a routine ran longer than one page, you branched at the end of the page to another page that had room, which you then had to swap in. (Yes, often by hand. And, no, I am not nostalgic for the old days at all!)

In the late '70s and the '80s, minicomputers helped bring the scales of developer pay and computer price closer together. But it wasn't until minicomputers really came down in price and PCs started to dominate the market that a developer became so much more expensive than a computer. By then, many people thought it was cheaper for a developer to spend time one-on-one with the computer, not in design reviews or code reviews, or discussing the architecture with others.

User Comments

Anonymous

Great points, Johanna.

Most places I've seen target 6 hours (75%), although I'd have to agree that 50-60% is a better target for most development teams and could vary slightly by developer.

You touched on allowing time for thinking to occur in the day-to-day cycle. I think you must also allocate time for research. For development teams to continue to innovate and provide alternative solutions to ever more challenging problems, they must have ample time to research new technologies.

I like to tell my teams to focus on three initiatives:

1. Coding
2. Thinking
3. Research

January 24, 2012 - 1:44pm
Johanna Rothman

I like the way you separate research from thinking, Mike. Good idea! Make it transparent...

January 24, 2012 - 4:14pm
Doug Smith

I love the myth of 100% utilization and I have seen it just about every place I have worked.

April 19, 2016 - 12:04pm
Johanna Rothman

Doug, thank you. Yes. I have seen it also, almost everywhere. 

Part of the myth is that we, the workers, think we can work long hours and that's okay. I don't know about your university experience, but mine was full of people who said, "I can solve any problem inside a semester with a sufficient number of all-nighters." In a sense, we are trained to think this way.

I first stopped thinking this way when I took summer school classes at university. I had one class at a time, for two, three, or four weeks. I found I was able to focus on one thing for the entire time and finish it. I was much more productive. I suspect many people have not had my experience.

April 19, 2016 - 12:51pm
