very expensive walk through a cleared minefield.
Please, don't get me wrong. This kind of checking can be helpful, but it is not sufficient. The teams I work with also do exploratory testing around newly developed features, compare test runs against random data from the previous build, and do performance and unit tests. Most of the successful teams take a "balanced breakfast" approach.
Takeaways, No Matter the Method
Today we are talking about specification by example, ideally examples that a business person can understand. These examples can come from the whole team, with the programmers writing the automation. This leaves the testers with a collection of power functions they can change as simply as clicking Copy > Edit. Sometimes, testers can use these functions to reproduce bugs found in the field. I have even seen cases where the testers did the reproduction in the tool, then handed the failing case to the programming staff saying, "Here's the bug. The story is: Make this failing check pass without introducing any regression error."
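To make the idea concrete, here is a tiny sketch in Python of what a business-readable example table driving automation might look like. The discount rule, the function name, and the numbers are all invented for illustration; the point is that the table of examples is what a business person reviews, while the loop around it is the automation a programmer wires up.

```python
# A hypothetical business rule, agreed on with examples:
# orders of $100 or more get 10% off.
def discounted_total(order_total):
    """Apply the (invented) discount rule: 10% off orders of $100 or more."""
    if order_total >= 100:
        return round(order_total * 0.90, 2)
    return order_total

# Business-readable examples: (order total, expected total after discount).
# A tester can add a row as easily as copying and editing an existing one.
EXAMPLES = [
    (99.99, 99.99),    # just under the threshold: no discount
    (100.00, 90.00),   # at the threshold: 10% off
    (250.00, 225.00),  # well over the threshold: 10% off
]

for order_total, expected in EXAMPLES:
    actual = discounted_total(order_total)
    assert actual == expected, f"{order_total}: expected {expected}, got {actual}"
```

A failing row in a table like this is exactly the kind of reproduction a tester can hand to a programmer: "Make this failing check pass."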
It takes a lot of work to get here. All that programmer time spent writing automation could go toward new features, and the testers could be doing exploratory testing that has immediate value. Some checks turn green and are forgotten, or future changes introduce different behaviors that are perfectly correct, yet cause the checks to fail. All of these things are tradeoffs.
This morning, I reviewed the tests: pruning old ones, asking "What if?" and adding comments and links to our online story-tracking system right within the checks. All of that rested on many hours of work creating the automation system. Nothing is free, but would I go back to that old world where everything was done by hand?
Let's just say that it is hard to imagine.
That's my story. Where are you on the continuum? What is your team doing?