Your solution must be current, but it must also be able to evolve so that it can provide support for decades, even if your vendor is bought out, as will more than likely happen.
Any of these seem like things you'd like to improve in your shop? If you're moving to an ALM solution, make sure you've got these things covered.
The 3 Fundamentals
So what happens when you try to glue together the best-of-breed tools for all of the ALM functions? Pretty much all of these architectural requirements lose out. How can a solution be low maintenance when you need ten different administrator training courses, one for each of the tools? Ditto for customization. Go down the list. Yes, you can glue in traceability if you know the process and data ahead of time, and as long as the glue doesn't have to change whenever any one of the tools is upgraded. But there are three fundamentals of ALM that must always be the starting point (a small sketch of what the first one buys you follows the list):
1. A common repository for all management functions
2. A common process engine/capability across functions
3. A common user interface architecture
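Here's a minimal sketch of the first fundamental, using an in-memory SQLite database as a stand-in for the common repository - every table and column name below is hypothetical, not any vendor's schema. When every ALM function writes to one repository, a traceability link is just stored data, and one query walks the whole life cycle:

    import sqlite3

    # One repository (in-memory SQLite here) holds every ALM artifact, so a
    # traceability link is just a stored reference, not cross-tool glue.
    repo = sqlite3.connect(":memory:")
    repo.executescript("""
        CREATE TABLE requirement (id INTEGER PRIMARY KEY, title TEXT);
        CREATE TABLE change_req  (id INTEGER PRIMARY KEY, req_id INTEGER, summary TEXT);
        CREATE TABLE test_run    (id INTEGER PRIMARY KEY, change_id INTEGER, verdict TEXT);
    """)
    repo.execute("INSERT INTO requirement VALUES (1, 'Lock account after 3 failed logins')")
    repo.execute("INSERT INTO change_req VALUES (10, 1, 'Add lockout counter')")
    repo.execute("INSERT INTO test_run VALUES (100, 10, 'PASS')")

    # One query spans the life cycle: requirement -> change -> test verdict.
    for row in repo.execute("""
        SELECT r.title, c.summary, t.verdict
        FROM requirement r
        JOIN change_req c ON c.req_id = r.id
        JOIN test_run t   ON t.change_id = c.id
    """):
        print(row)  # ('Lock account after 3 failed logins', 'Add lockout counter', 'PASS')

The point isn't SQL; it's that the links live as data in one store, where a common process engine can act on them and a common user interface can display them, and where they survive any one tool's upgrade.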
Notice the word that's common across these fundamentals? Nobody disagrees with the fundamentals, but not everyone embraces them. Perhaps the version control engine doesn't need a database. The requirements tool has its own. And so forth.
There have been a couple of efforts to create ALM backplanes, reaching back into the early '90s. They have failed, because their premise was that existing tools could simply be plugged into the backplane. And that premise holds - as long as all of the tools share a common repository, a common process engine and a common UI architecture. And no, SQL does not mean a common repository, a common scripting language does not mean a common process engine, and KDE or .NET does not mean a common UI architecture. Still, a backplane can work if the framework itself includes the repository, process engine and user interface architecture. It's just that existing tools don't fit the mold. (And some vendors won't sign up to a framework because they want to sell their exclusive solution.)
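Here's a toy illustration of why "we both use SQL" is not a common repository - the two schemas and key formats below are made up, but the pattern is what backplane glue tends to look like. Each tool keeps its own private tables, so every traceability hop is bespoke code that knows both schemas and breaks when either one changes:

    import sqlite3

    # Two tools that both "use SQL", each with its own private schema and keys.
    req_tool = sqlite3.connect(":memory:")
    req_tool.execute("CREATE TABLE reqs (req_key TEXT PRIMARY KEY, text TEXT)")
    req_tool.execute("INSERT INTO reqs VALUES ('RQ-7', 'Lock account after 3 failed logins')")

    cm_tool = sqlite3.connect(":memory:")
    cm_tool.execute("CREATE TABLE issues (num INTEGER PRIMARY KEY, ext_ref TEXT, title TEXT)")
    cm_tool.execute("INSERT INTO issues VALUES (42, 'RQ-7', 'Add lockout counter')")

    # The "backplane glue": hand-written code that knows both schemas and both
    # key formats. There is no real JOIN across the tools; every traceability
    # hop is bespoke, and it breaks the day either tool renames a column.
    def trace(req_key):
        req = req_tool.execute(
            "SELECT text FROM reqs WHERE req_key = ?", (req_key,)).fetchone()
        issues = cm_tool.execute(
            "SELECT num, title FROM issues WHERE ext_ref = ?", (req_key,)).fetchall()
        return req, issues

    print(trace("RQ-7"))

Multiply that trace() function by every pair of tools in the suite, and by every upgrade of any one of them, and you have the maintenance bill that killed the backplanes.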
If you don't have commonality in these areas, how is your multiple-site capability going to span all of the ALM functions? How are you going to improve ease of use when there are several different user interfaces and behaviors to learn? How are roles and traceability going to cut across all of the functions if you have differing process engines, repositories and user interface architectures? How are you going to customize when you have to assess the impact across a dozen different tools? And what is your response time going to be like when you're trying to navigate traceability links?
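On that last question, a back-of-the-envelope model makes the point; the millisecond figures below are assumptions for illustration, not measurements of any product. If every traceability hop in a glued suite costs a cross-tool round-trip, navigation time grows with the length of the chain, while a common repository resolves the whole chain in one query:

    # Assumed, illustrative costs only - not measurements of any product.
    TOOL_HOP_MS = 250    # one cross-tool API round-trip in a glued suite
    REPO_QUERY_MS = 20   # one query against a common repository

    def glued_ms(hops):        # requirement -> change -> build -> test ...
        return hops * TOOL_HOP_MS

    def common_repo_ms(hops):  # the whole chain is a single join
        return REPO_QUERY_MS

    for hops in (3, 6, 12):
        print(f"{hops} hops: glued {glued_ms(hops)} ms vs common repo {common_repo_ms(hops)} ms")

Crude as it is, the model shows why link navigation that feels instant in a unified repository can crawl across a dozen glued tools.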
That's why both MKS (perhaps with some exceptions) and Neuma, two Canadian companies, have put architectures in place that include the repository and process engine across the ALM suite. The result is that both of these solutions are not just good ALM tools; they are also easily and extensively customizable. When the framework is common, R&D goes into the framework so that all of the functions benefit. In fact, Neuma is expected, later this year, to introduce a CM and/or ALM Toolkit product that consists primarily of its framework. That way, even the NIH (not invented here) syndrome can be addressed, and those who like to fine-tune process, and extend it throughout the business, have a strong starting point.
What Do I Get From 3rd Generation ALM?
So ALM is the 3rd generation