Assuming the Worst about Requirements

Requirements are under suspicion. Read between the lines of software and project management journalism and you'll hear nearly everyone lamenting the sad state of requirements. Managers plan only eight percent of project time for requirements. Developers worry about Zen-like requirements that lack sufficient detail to produce serviceable code. Testers think they have to backfill requirements. So, what were the analysts doing?

Have you ever seen the comedy Office Space? In a scene right out of Dilbert, the person responsible for requirements is interviewed by two "rightsizing" consultants. "Tell us what you do," they ask. He replies that he talks to the customer to find out what they want and then relays that to the people who write the code. A wink between the consultants tells you this analyst is history. After all, can't the customer talk directly to the people who code the software? Maybe yes and maybe no. Sometimes developers are responsible for everything from requirements to unit testing. At the opposite end of the spectrum, no one person or group has the specific task of requirements definition: marketing talks to the customer, architects talk to marketing, and developers talk to architects. Apart from these role-and-responsibility clashes, what else ails requirements?

How Requirements Become Suspect
Requirements often fail because they are too brief, too vague, too confusing, or all three. To some, this implies that even the meager time devoted to requirements is overspending. The opposite is more often true: there isn't enough time. However, misallocation of time is only a symptom; less obvious causes are more likely to lead to poor results:

We believe in miracles. Ever hear a story about how some genius sold his idea based on a napkin sketch? I think sometimes we expect our analysts to do the same. Hey, once the sketch is down, we're done! Time to hand the napkin over to production.

We get carried away by sports analogies. In some organizations, the requirements stage feels like the warm-up before a game. We spend just enough time to loosen up, because we are saving time for the real work. So the requirements phase becomes all about establishing the customer relationship and maintaining a friendly mood.

We believe in components. This is good; once we really have requirements, components can buy us time and produce reliable results. Without clear requirements, though, we risk assembling a van instead of the sedan that was requested: we heard "vehicle for the family," started thinking about which components to use, and went to work.

Assumptions Are Silent Expectations
You've heard the truism "To assume makes an ass out of u and me." We make assumptions all the time in our jobs, and sometimes it works. A more reliable path, though, is to make our assumptions public, where they can be either rejected or, when reasonable, cultivated into clearer expectations.

Here are common software development assumptions that can work for good or ill:

Experience is everything. When an experienced analyst starts a project, she or her manager might assume she already knows all the system issues and is asking the right questions. In the worst case, the analyst doesn't probe enough, thinking this is like every other system she has built, and the manager believes her experience lets him cut the time needed for requirements.

Communication is free. Some people believe that school taught us everything we need to know about communication. Sometimes even a Ph.D. in English can't guarantee that! By recognizing our own filters and biases, we can work to remove the barriers and distortions they create.

Anyone can do analysis. Maybe this is true of some people. Others may be good at analyzing software problems but not at analyzing business requirements. The pursuit of one discipline in college tends to produce vertical thinkers: deep reasoning down a specific path to arrive at a conclusion. To be good at analyzing anything, you also need lateral thinking: broad, pattern-based thinking that generates new patterns by changing perspective.

Software is easy to change. Hardware engineers used to believe that software could be changed "for free." If the software specs were vague, it didn't matter, because the software could always be changed later. Any organic software development effort run amok will slip to the dark side of this assumption: a costly maintenance cycle.

What we hear is what is needed. Taking a request at face value can cause problems in two ways. First, we believe the meaning of the request is clear. Second, because of that belief, we don't ask questions. Asking questions gives us an opportunity to verify a request, and questions can also test whether the customer's request would really solve their problem.

Reversing the Trend
Analysts could use a little respect, not to mention reasonable schedules for producing decent requirements. What can we do to change current opinion and practice? Clearly convey your perception of the product so others can see what you have and what is missing; this also builds analyst and customer confidence. Here are three recommendations:

 1. Produce something demonstrable. There are techniques for every cultural style:

    • Your organization might build a simulation based on system requirements, or an executable model that runs a simulation directly from model content
    • IT departments have been using prototypes to good effect for years
    • A commercial software venture might use Agile or XP techniques where small cycles of requirements-gathering-to-code produce working parts of the final system in incremental stages
    • Anyone can produce visuals—models or pictures that help speak to what we know and what we don't

 2. Iterate. Once you've demonstrated something, you aren't done. You may have new information that needs to be incorporated into your requirements document.

 3. Keep a record. Include your findings, tasks, process, key insights, and timeline. Keep a written record of requirements in a form that is useful for technical staff—you'll need to hand this off. Take control by developing your own plan with estimates. Track actuals. Write down events, observations, setbacks, and breakthroughs in a journal. This will let you recall the story behind the numbers, so that you can review them in a personal or project retrospective. Through records, you can better understand your strengths and weaknesses and just might improve your skills.
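The "plan with estimates, track actuals" advice above can be as lightweight as a small script. Here is a minimal Python sketch of that idea; the record structure and field names are illustrative assumptions, not anything prescribed by the article.

```python
# Illustrative sketch only: one way to log requirements-task estimates vs. actuals
# so a retrospective has numbers to discuss. Names and fields are assumptions.
from dataclasses import dataclass

@dataclass
class TaskRecord:
    name: str
    estimate_hours: float
    actual_hours: float

    @property
    def variance(self) -> float:
        """Positive means the task ran over its estimate."""
        return self.actual_hours - self.estimate_hours

def summarize(records: list[TaskRecord]) -> dict:
    """Roll up estimates and actuals across all recorded tasks."""
    total_est = sum(r.estimate_hours for r in records)
    total_act = sum(r.actual_hours for r in records)
    overrun = 100.0 * (total_act - total_est) / total_est if total_est else 0.0
    return {"estimated": total_est, "actual": total_act, "overrun_pct": overrun}

# Hypothetical journal entries for a requirements phase
records = [
    TaskRecord("Interview customer", estimate_hours=4, actual_hours=6),
    TaskRecord("Draft use cases", estimate_hours=8, actual_hours=8),
    TaskRecord("Review with developers", estimate_hours=2, actual_hours=5),
]
print(summarize(records))
```

Pairing numbers like these with journal notes about why a task overran is what lets you recall the story behind the figures at retrospective time.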

AgileConnection is a TechWell community.
