Orders of Magnitude in Test Automation


To spark some conversations around automated test coverage, I developed the following heuristic to tip me off when we might need to take a look at what and how we’re automating. Often, I can turn up a couple of gaps or inefficiencies. I look for specific orders of magnitude in my tests. Ideally, I’d like to see:

  • 1000s of unit tests
  • 100s of automated customer acceptance tests
  • 10s of GUI-level automated tests
  • 1s of production monitoring scripts

 

Figure 1: Orders of magnitude in test automation

These orders of magnitude are all relative. If I had tens of thousands of unit tests, I’d expect to see thousands of automated customer acceptance tests, and so on down the line. Conversely, if you only have hundreds of unit tests, you’re just getting started: I wouldn’t expect more than a handful of acceptance tests and none of the other automation yet. As your code base grows, the number of tests required to maintain coverage will likely grow as well, so the specific numbers aren’t as important as the relative sizing.
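To make the relative sizing concrete, here is a minimal sketch of how you might check these ratios for your own suites. The layer names and counts below are hypothetical, and the expected ratio of ten between adjacent layers is just the heuristic’s default; adjust it to whatever your team agrees on.

```python
# ratio_check.py -- a minimal sketch of the order-of-magnitude heuristic.
# The counts below are hypothetical; substitute your own suite sizes.

TEST_COUNTS = [
    ("unit tests", 4200),
    ("automated customer acceptance tests", 380),
    ("GUI-level automated tests", 25),
    ("production monitoring scripts", 3),
]

EXPECTED_RATIO = 10  # each layer roughly 10x the one below it


def check_ratios(counts, expected_ratio=EXPECTED_RATIO):
    """Flag adjacent layers that are far from the expected ratio."""
    for (upper_name, upper), (lower_name, lower) in zip(counts, counts[1:]):
        if lower == 0:
            print(f"MISSING: no {lower_name} at all -- worth a conversation")
            continue
        ratio = upper / lower
        # More than an order of magnitude off in either direction?
        if ratio > expected_ratio * 10 or ratio < expected_ratio / 10:
            print(f"CHECK: {upper_name} ({upper}) vs {lower_name} ({lower}): "
                  f"ratio {ratio:.1f}, expected ~{expected_ratio}")
        else:
            print(f"ok: {upper_name} ({upper}) -> {lower_name} ({lower}), "
                  f"ratio {ratio:.1f}")


if __name__ == "__main__":
    check_ratios(TEST_COUNTS)
```

A script like this isn’t the point, of course; the point is that any layer it flags is a prompt for a conversation, not a verdict.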

For some teams, even asking whether they have these four types of automation in place can be revealing. It’s not uncommon for teams to have only one or two types of automation, ignoring the others. In my experience, few teams think about coverage holistically across all four areas at the same time. When you start looking at these numbers, you’ll often spark much-needed conversations around “How much is enough?” and “What are those tests actually testing?”

Bringing Balance to Your Automation
I find looking at orders of magnitude to be a great place to start the conversation, but once the team begins talking about coverage, the discussion quickly moves to code, data, scenario, or some other aspect of coverage. We often start to look at relative metrics: How long does it take to run each set of tests? How often are they run? What types of issues (if any) does each set of tests actually find?
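If your test runners emit JUnit-style XML reports, a short script can pull those relative metrics together in one place. This is a sketch under assumptions: the report paths per layer are invented, so point them at wherever your CI actually publishes results.

```python
# suite_metrics.py -- a sketch that tallies test count and runtime per layer
# from JUnit-style XML reports. The file paths are hypothetical; point them
# at wherever your CI publishes results.
import xml.etree.ElementTree as ET

REPORTS = {
    "unit": "reports/unit.xml",
    "acceptance": "reports/acceptance.xml",
    "gui": "reports/gui.xml",
}


def summarize(path):
    """Return (test count, total seconds) from a JUnit-style report."""
    root = ET.parse(path).getroot()
    # Reports may use a <testsuites> wrapper or a bare <testsuite> root;
    # iter() walks both shapes, including the root element itself.
    tests, seconds = 0, 0.0
    for suite in root.iter("testsuite"):
        tests += int(suite.get("tests", 0))
        seconds += float(suite.get("time", 0))
    return tests, seconds


if __name__ == "__main__":
    for layer, path in REPORTS.items():
        tests, seconds = summarize(path)
        print(f"{layer:>12}: {tests} tests in {seconds:.1f}s")
```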

Facilitating a conversation around your automation is the goal. If you have 1,000 GUI-level tests but no automated customer acceptance tests—why? It’s likely that most of the bugs that could be found by those GUI-level tests could easily be found at the customer acceptance level. The customer acceptance tests would likely be run more often and would cost less to maintain. If you don’t have any GUI-level tests, are you really covering some of the core user interface functionality that may need to be tested on a regular basis? 

As you start to identify areas where you may be too light or too heavy based on the heuristic, look for the following opportunities:

  • Areas that you’ve completely ignored. For these areas, you might start to build out coverage.
  • Areas where you’re more than an order of magnitude off. Ask yourself why. Did we do that on purpose? Do those tests provide the right type of value? Does that area really have too much coverage or are all the other areas light on their coverage?
  • Tests that you’ve written at a “higher” level (or closer to the user interface) that could be rewritten and maintained more effectively at a “lower” level (or closer to the code); see the sketch after this list.
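
As one illustration of pushing a test down a level, here is a hypothetical check of a signup rule written first against the browser and then against the service underneath it. The element IDs, URLs, endpoint, and status code are all invented for the example; substitute your application’s real interfaces.

```python
# push_down.py -- a hypothetical example of moving one check down a level.
# Element IDs, URLs, and the /api/signups endpoint are invented for
# illustration; substitute your application's real interfaces.

# Higher level: a GUI test that exercises the duplicate-email rule
# through the browser. Slow to run and brittle to maintain.
def test_duplicate_email_rejected_via_gui():
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get("https://example.test/signup")
        driver.find_element(By.ID, "email").send_keys("taken@example.test")
        driver.find_element(By.ID, "submit").click()
        error = driver.find_element(By.CLASS_NAME, "error").text
        assert "already registered" in error
    finally:
        driver.quit()


# Lower level: the same business rule checked at the service boundary.
# Faster, cheaper to maintain, and it catches the same class of bug.
def test_duplicate_email_rejected_via_api():
    import requests

    response = requests.post(
        "https://example.test/api/signups",
        json={"email": "taken@example.test"},
    )
    assert response.status_code == 409  # conflict: email already registered
```

Keeping a thin set of GUI tests for the core user journeys while covering business rules this way is usually the cheaper mix.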

As you start to address the gaps and the areas where you might have too little or too much coverage, engage the rest of the team in a discussion around the best way to approach the problem. It may be that, because of the type of product you’re building, this heuristic won’t work for you. That’s OK. Work with the team to figure out what mix would better represent what you’d like to work toward. These numbers can become one of many factors you use to get better alignment in your test automation.

About the author

Mike Kelly

Mike Kelly is a partner at DeveloperTown, a venture development firm. He regularly writes and speaks about topics in software testing and agile development, but spends most of his time working with teams to deliver ridiculously high quality solutions faster than they could without rigorous testing practices. You can learn more about Mike and his other works on his website http://www.MichaelDKelly.com.
