During the last six to eight years, I've seen companies purchase a variety of sophisticated tools supporting software development, some of which claim to integrate the strategic activities and artifacts of requirements, design, production, test, and deployment into releasable software products. Any tool that can do all that is certainly worth considering for implementation in the software engineering environment.
Such tools can make life easier for many while assuring customers of product integrity. But why do these tools rarely find their way into that environment? Are they too complex? Do they really have what it takes to do the things they claim? Are organizations fully prepared to implement and use such tools? Are people prepared and committed to making some very significant changes in the way they do software development? All are interesting questions.
Perhaps, and most likely, the reason such tools are not successfully implemented is an overall lack of training in, or exposure to, the specific techniques upon which the tools are based. Training budgets tighten, and companies hire people they feel "should have" current knowledge of newly released tools - even before those tools are fully tested and proven worthy. Also, people are typically resistant to change; thus, when introduced to a tool that requires them to learn "something new" or to produce artifacts and work products supporting a different methodology, there is usually some initial resistance. The following example shows how people tend to resist change:
Consider a group of software developers who are used to interpreting functional requirements from a requirements specification document or similar work product, who have little knowledge of object-oriented methodology, and who are suddenly required to do object modeling. That is, management decides that object modeling is sexy and wants to change the way system or software design is accomplished. Tools are purchased, and the company methodologist or technologist gives a presentation to roll out the new design method. The next day, the analysts and the design group look at each other and try to sort through what they have "learned" and apply it to what needs to be built.
It's hard to lay blame on anything specific that companies are or are not doing when it comes to automating, but it boils down to training, acceptance, gradual change, guidance, mentoring, patience, and some degree of enforcement. A clearly understood system or software development lifecycle (SDLC) model is key to understanding what, why, and when certain activities occur and certain work products are produced throughout the lifecycle. Without knowledge of SDLC elements (phases, milestones, and baselines, along with their associated inputs, processes, and outputs), automating processes for the sake of automating is a death march - that is, there is a high degree of certainty that the automation will not succeed.
At many organizations, configuration management (CM) activities are "half-vast" at best, and clearly chaotic. This may be because the wrong people have been assigned to perform CM tasks, or because the project manager is clueless or knows very little about the benefits of CM. Then at some point, perhaps out of utter frustration, some genius gets the bright idea to "automate," thinking that this will solve the dilemma - and so the organization focuses on automating its chaos! Am I the only one who has witnessed this undertaking?
If you haven't yet read the many articles written on this topic, please do yourself a favor and research why an elaborate analysis should be performed before acquiring a solution to automate CM processes. When considering automation, ask the following questions (and more