The Value of Positive Testing

Member Submitted
Summary:

Is there any value to positive testing? Some experts say No! I think there is--and here's why...

There is a school of thought in software testing that dismisses the value of positive testing. This school basically holds that any test that does not produce a defect is not a good test. I respectfully disagree.

Software tests can be divided into two categories: positive tests and negative tests. A positive test is used primarily, if not solely, to validate that a given system, function, operation, etc. works as designed when a user enters the right data, in the right place, at the right time, clicks the right buttons, etc. Negative tests purposely try to break the system to verify that it responds as expected and fails gracefully when it gets the wrong data, in the wrong place, at the wrong time. It’s with negative tests that we really earn our stripes as testers. A negative test should cause an error. It's expected to cause an error. If it causes an error, and the error is handled correctly, the test passes. So to recap–a positive test should not cause an error. Does that make it an invalid test? Absolutely not!
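To make the distinction concrete, here is a minimal sketch in Python with pytest, using a hypothetical save_record() function of my own invention (not from any particular system): the positive test expects good data to succeed without error, while the negative test expects bad data to be rejected cleanly.

    import pytest

    def save_record(name, age):
        """Hypothetical system under test: saves a record when the input is valid."""
        if not name or age < 0:
            raise ValueError("invalid record data")
        return {"name": name, "age": age}

    def test_save_record_positive():
        # Positive test: right data, entered the right way, so we expect
        # success and no error.
        record = save_record("Pat", 42)
        assert record == {"name": "Pat", "age": 42}

    def test_save_record_negative():
        # Negative test: wrong data, so we expect a graceful, handled failure.
        with pytest.raises(ValueError):
            save_record("", -1)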

The ultimate goals of positive and negative tests are completely different. To my thinking–we need to verify that something works correctly before we try to break it. If we don't know that it works correctly, then how can we know when it doesn't? Positive tests answer that question. If the system doesn’t work as it’s supposed to when everything is correct, all other tests, especially negative tests, are really irrelevant.

Let me state right up front–I'm not a professional software developer–not even close. I never claimed to be one. But I'm no slouch either! Let's just say–I like to dabble. Given enough time and the right reference books, I can build a class, or a Web page, or whatever. From my limited experience, I like to make sure something works first, and then I focus on what happens when it doesn’t. It's also a much easier way to build. Call me crazy, but that just makes sense to me.

Imagine my surprise when I was recently told that some negative functionality would be delivered before the positive functionality. I was perplexed! Surely, they could not be serious! They were. The results were predictable. I couldn't create and save a basic record, but the error handling was really nice. In a nutshell, the system could not do what it was designed to do, but it looked really good.

It reminds me of a car I bought in high school. It looked really nice in the driveway. It had to sit in the driveway–it rarely ran. But I washed and waxed it every weekend! It also had an awesome 8-track stereo system! I spent hours sitting in the driveway listening to my Led Zeppelin and Aerosmith 8-tracks. Life was good! I remember a friend once telling me that the car was over-rated and didn’t perform anywhere near the way it was rumored to perform. I just wanted it to start and take me to the store. How fast I got there was really irrelevant at this point. Apparently it also got really bad gas mileage. In the late 70’s, with gas prices hovering around 80 cents a gallon, this was pretty important. Of course, sitting in my driveway it got great gas mileage!

Automotive magazines would advertise all kinds of devices to boost miles per gallon. Again, my "friends" would encourage me to purchase all of these gadgets. "They will pay for themselves after just a couple of tanks," they would say. I couldn't have cared less. I just wanted the stupid thing to run, and then I’d worry about how much gas it used.

I view positive tests in much the same way. Show me it works like it’s supposed to and then we'll worry about what happens when it doesn't or how well it performs (or doesn't).

I always write and run positive tests first. Once the system can pass the positive tests, the fun starts. Now I get to be creative and break it (insert evil laugh here). Equivalence Class testing, Boundary Value testing, etc. are all great test techniques, but they are effectively useless if the system isn’t functioning correctly to begin with. If the system is failing with valid data, it doesn't really make sense to test with invalid data–yet. Unless, of course, the system accepts the invalid data–that would be bad. But that's why we need to test both positive and negative scenarios. Test the positive first to make sure the system responds correctly to good data, correct sequences of operation, correct user actions, etc. Then–and only then–can we validate what happens when entering invalid data, incorrect sequences, or incorrect user actions.
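As a rough illustration of that ordering (again in Python with pytest, with a made-up validate_age() rule that accepts ages 0 through 120), the valid equivalence class and its boundary values come first; the out-of-range cases only matter once those pass.

    import pytest

    def validate_age(age):
        """Hypothetical rule under test: ages 0 through 120 inclusive are valid."""
        if not 0 <= age <= 120:
            raise ValueError("age out of range")
        return age

    # Positive cases first: the valid equivalence class plus its boundary values.
    @pytest.mark.parametrize("age", [0, 1, 60, 119, 120])
    def test_valid_ages_accepted(age):
        assert validate_age(age) == age

    # Negative cases come later: values just outside the boundaries and
    # clearly invalid inputs, which should be rejected with an error.
    @pytest.mark.parametrize("age", [-1, 121, 1000])
    def test_invalid_ages_rejected(age):
        with pytest.raises(ValueError):
            validate_age(age)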

My preferred sequence of tests–run all the positive tests first, and if they’re successful–jump into the negative tests. If the positive tests don't pass I halt all testing until they do.
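One way to enforce that ordering, assuming the tests are tagged with pytest markers such as @pytest.mark.positive and @pytest.mark.negative (my own convention, not anything built in), is a small driver script that gates the second stage on the first:

    import sys
    import pytest

    # Stage 1: run only the tests marked "positive". A non-zero exit code
    # means something basic is broken, so halt all further testing here.
    if pytest.main(["-m", "positive"]) != 0:
        sys.exit("Positive tests failed; halting all testing until they pass.")

    # Stage 2: the negative tests run only once the positives are green.
    sys.exit(pytest.main(["-m", "negative"]))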

An additional benefit of positive testing–smoke tests! When you receive a new code drop or build, what better way to validate the core system functionality than to run through your suite of positive tests? Positive tests are my first automation candidates. They are typically tests that are quick and easy to run. My smoke tests will usually consist of the entire library of positive tests, or a large subset of them (the critical ones at least). I like to target no more than 30 minutes to run a valid, end-to-end smoke-test. With a good test automation tool you can achieve a lot of testing in 30 minutes. I like to run an automated smoke test with every new build, on every environment. If we're doing daily builds, I run a daily smoke test. When the smoke test passes I can be reasonably sure I have a good system to begin more in-depth testing. I can accept the build, and start my test clock. If it fails–I can kick it back.
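For the smoke test itself, here is a hedged sketch of how that build check might be wired up, again assuming the same "positive" pytest marker and treating the 30-minute target as a soft budget; the exit code is what accepts the build or kicks it back.

    import sys
    import time
    import pytest

    SMOKE_BUDGET_SECONDS = 30 * 60  # target: a full smoke run in about 30 minutes

    start = time.monotonic()
    # Reuse the positive suite (tagged "positive") as the smoke test for this build.
    result = pytest.main(["-m", "positive"])
    elapsed = time.monotonic() - start

    if elapsed > SMOKE_BUDGET_SECONDS:
        print(f"Smoke run took {elapsed / 60:.1f} minutes; consider trimming "
              "to the critical subset of positive tests.")

    # A passing smoke test accepts the build and starts the test clock;
    # a failure kicks the build back to development.
    sys.exit(result)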

For a bit of extra incentive–consider the doughnut factor. If the smoke test passes, I buy doughnuts for the team. If it fails, the development team buys the doughnuts. I hear bagels work too.
