it gets down to costs. First it offers the observation we have all heard before: the sooner you find defects, the less they cost to fix. The report's example assigns a 1X cost to fixing a defect found during requirements and analysis, 5X during coding and unit test, 10X during integration and system test, 15X in beta test, and 30X after release. I've seen much higher multipliers, as high as 1000X in production, but whatever. The point is made.
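Those multipliers are easy to turn into a back-of-the-envelope model. Here is a minimal sketch in Python; the $1,000 base cost and the per-phase defect counts are invented for illustration, not figures from the report:

```python
# Back-of-the-envelope cost model using the report's escalation multipliers.
# BASE_COST and defects_found are made-up illustrative numbers, not report data.
BASE_COST = 1_000  # assumed cost to fix one defect caught in requirements/analysis

multipliers = {
    "requirements/analysis": 1,
    "coding/unit test": 5,
    "integration/system test": 10,
    "beta test": 15,
    "post-release": 30,
}

# Hypothetical count of defects found in each phase.
defects_found = {
    "requirements/analysis": 20,
    "coding/unit test": 40,
    "integration/system test": 25,
    "beta test": 10,
    "post-release": 5,
}

# Total repair cost: each defect costs the base amount scaled by its phase multiplier.
total = sum(BASE_COST * multipliers[phase] * defects_found[phase]
            for phase in multipliers)
print(f"Total repair cost: ${total:,}")  # prints "Total repair cost: $770,000"
```

Even with most defects caught early, the handful that escape to beta and production dominate the bill, which is the report's point in miniature.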
Then, however, it attaches some real money. In the transportation manufacturing sector, the surveys indicated a cost of 268.4 hours of repair per major bug, $604,900 in lost data, and a 1.5-month delay in time to market. Then, somehow, it arrives at a cost per bug of $4,018,588. Don't ask me to explain this last number; I can't, and they don't. But it is impressive!
In the financial services sector, those numbers are 18.4 hours of repair per major bug, $1,425 in lost data, and a 1.5-month delay in time to market, arriving at a cost per bug of $3,293. We can only assume that the huge difference between the two industries reflects the difference between repairing defects in manufactured goods versus in electronic data: the former is far more costly to fix and has a much longer useful life.
Finally, the report boils it down to the cost per developer/tester of inadequate testing: $69,945, about half of which could be avoided through "feasible" improvements. Wow! I see some raise requests on the horizon. But guess what? Only 36% of those costs fall on developers; the 64% majority fall on users. Maybe we have been pitching quality to the wrong side of the house all along, and the latest trend of doing only user-funded IT projects will work in our favor.
All in all, this report's greatest value is in validating what we have always known and attaching real money to it. Whether or not you buy these numbers (or your management does), there is value in walking through the way they were derived, so that you can make your own business case. And get some real money.