We outlined the success criteria for every agile team as follows:
- Capture velocity metrics
- Demonstrate increments with working software
- Deliver dynamic requirements
- Duplicate knowledge
The first three of the four criteria were directly linked to the Scrum process framework and reinforced the knowledge transfer from the courses and coaching. It is important to note that we did not phrase the criteria as, for example, “increase velocity by 10% each iteration.” In fact, we did not use the velocity number in any communication other than with the project team itself. A team comparison or performance analysis was never intended and would have been wrong anyway. We wanted to make sure each team knew its velocity and used it as an instrument, and we trusted that each team would track it. That reiterated the importance of metrics in planning and estimating and gave teams the tool to become more predictable. A higher degree of predictability was in fact a goal of executive management when they introduced Scrum: before agility, dates slipped or requirements were under-delivered.
Demonstrating working software had a direct impact on the deliverables (the definition of done). Demonstrating product features at the end of every iteration also marked a real departure from mini-waterfalls (feature completion vs. phase completion).
The third criterion addressed changing requirements: we expected teams to be flexible in their requirements management. We purposely did not ask the teams to deliver what they had planned, but to deliver what was needed, even if the requirements changed. It was important that the A-team success criteria could not conflict with the goals of the product strategy.
The fourth was the only criterion not built into the Scrum process itself, but it was important from an organizational maturity perspective. We wanted to encourage every team to share successes and challenges with other teams. We did not prescribe how or when, but we wanted to increase inter-project communication. For example, one team might decide to write an experience report on our internal agile blog, while another offers a quick lunch & learn. All that mattered was that the teams were communicating and exchanging experiences with each other; the exchange could be formal or informal.
After three months of successful sprinting, a team would receive the A-team award. Although we asked the teams to provide their dates and iteration info, the A-team certificate was issued when the team signaled readiness; the information was collected only for the certificate and to support the award ceremony. We held public ceremonies for winning teams, with funky certificates signed by executives. The A-team award was not limited to a single win: every team could earn multiple awards over time, so a team could ideally collect four awards a year. We also did not define the prize for each award. That allowed us to be very basic during economically challenging times, but gave us room to grow in the future.
We created awareness of the program through internal campaigns, and the word “A-team” quickly became part of everyday vocabulary.
Although we took a very AOL-centric approach with the A-team program, I hope this article generates thoughts and ideas for recognizing the maturity of agile practices in other organizations. Remember, this program supported the roll-out of Scrum and was deliberately broad; more experienced agile organizations might need to adjust the success criteria to make them more meaningful.
We know from requirements management that feedback is very important. Think about giving feedback to your teams not only about the product, but also about how they are doing as a team. The A-team award was our way of giving that feedback and valuing Scrum.