Kent McDonald writes on how to manage business analysts without measuring them. You can do so if you view management as helping business analysts improve their skill sets and become productive members of their teams. If, however, you view business analysts as "resources," you will more than likely find individual measurements quite useful.
There are many people who believe you can't manage what you don't measure. That thinking carries over to the people working on delivery teams, especially in organizations that tend to refer to people working on projects as "resources." As a result, I get asked quite frequently how business analysts can be measured. There is no clear-cut answer to this quandary, but I wanted to take a shot at answering it with a combination of my own thoughts and the observations of others.
Why Do You Want to Measure Business Analysts?
The first question I ask when I hear that someone wants to measure business analysts is "What are you trying to accomplish?" Some common reasons for measuring business analysts include: to assess how well they are doing; to aid in performance appraisals; and to identify areas in which they can improve. Considering that all other employees are measured, it seems to make sense to measure business analysts as well. The answer to this question often exposes some of the core assumptions that an organization’s leadership has in relation to its employees.
Organizations that view business analysts as another set of "resources" will often look for ways to measure how efficient and effective individual business analysts are and ways they can be improved. Measurement systems in these types of organizations typically focus on measures of output, such as the number of changes made to the requirements after the original business approval, the number of defects found in testing or production, or the completion of requirements by the target date. These types of measures also point to an assumption that the purpose of a business analyst is to write requirements. In other words, requirements documents are a business analyst's end product.
I am not a fan of this perspective on measuring business analysts for several reasons.

First, the purpose of an analyst is not to produce requirements. The real purpose of her role is value management—making sure her delivery team is delivering the right thing. Requirements are merely a means of conveying that information and should not be viewed as an end in and of themselves. Jeffrey Davidson drives this point home in his post "Measuring Requirements May Reinforce Bad Behavior."

Second, measuring things such as changes to requirements and defects found drives suboptimal behavior. Analysts focus on producing perfect requirements and resist making changes. This does not reflect the real world, where the team, including the analyst, does not know everything at the point where requirements are written, so it is unrealistic to think that changes will not need to be made.

Third, measuring individual analysts on these types of metrics discourages collaboration. If I were an analyst and knew I was being measured on how quickly I get requirements done and how many errors were found after the fact, I would be less inclined to help the rest of my team when there is a backup in some other part of the process, such as testing. Metrics such as the timeliness of requirements lead to an increase in the use of the phrase "That's not my job." In addition, when gaps or errors are discovered (and they will be, no matter how careful an analyst is), I am more likely to spend a great deal of effort trying to classify those errors in a way that says "it's not my fault."