the closest fifteen minutes (let alone the number of hours), yet I routinely encounter professionals who cite hour estimates to at least one decimal place (doesn't this imply that the estimate is accurate to the nearest tenth of an hour, or six minutes?).
6. Use common sense and statistics to correlate collected data, and question figures that seem out of line. Don't accept data purely at face value without verifying its consistency and accuracy. Many companies collect work-effort data on completed projects, but the definition of project work effort can vary widely across teams (e.g., whether overtime is recorded, which resources are included, the work breakdown structure, and the commencement and finish points). Be careful not to compare data that merely appears comparable because of common units (e.g., hours) but is actually based on different measurement criteria. For example, two projects may each report 100 development hours, but one included overtime and user-training hours while the other did not; although the units are the same, the hours are not comparable. "Project hours" has no industry-wide definition and can vary widely, so ensure that your organization establishes a consistent definition for collecting and reporting project hours on every project within the scope of data collection. Further information and tips on ensuring consistent project-effort tracking will be presented in an upcoming article.
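The "question figures that seem out of line" advice above can be automated with very simple statistics. The sketch below (with invented project names and productivity numbers, assumed purely for illustration) flags effort figures that fall outside an interquartile-range band, so they can be checked for inconsistent measurement criteria before being compared:

```python
# Illustrative sketch with hypothetical data: flag effort figures that seem
# out of line using a simple interquartile-range (IQR) check before
# accepting them into a comparison baseline.
import statistics

# Hours per function point for completed projects (invented numbers).
productivity = {
    "Project A": 9.2,
    "Project B": 10.5,
    "Project C": 8.8,
    "Project D": 31.0,  # suspicious: may include overtime or training hours
    "Project E": 11.1,
    "Project F": 9.7,
}

values = sorted(productivity.values())
q1, _, q3 = statistics.quantiles(values, n=4)  # quartiles of the sample
iqr = q3 - q1
low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # conventional 1.5*IQR fences

# Anything outside the fences is not "wrong" -- it is a prompt to verify
# how that project's hours were actually measured.
suspect = {name: v for name, v in productivity.items()
           if not (low <= v <= high)}
for name, v in suspect.items():
    print(f"{name}: {v} hrs/FP is outside [{low:.1f}, {high:.1f}] -- "
          "verify its measurement criteria before comparing")
```

A flagged project is not necessarily bad data; it may simply have counted overtime or training hours that its peers excluded, which is exactly the kind of definitional mismatch this tip warns about.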
These are a few of the factors, both human and technical, that can lead to software measurement success. There is a great deal to be gained by tracking and controlling software development through measurement, if only companies would consider what the various measures can provide rather than seeking a non-existent silver bullet to solve all of their measurement needs.