Quality and the Internet Appliance

  • Appliance configuration options
  • Documentation, tutorials, and help screens

Having identified the quality problems that might affect customers, the test engineers set about developing test cases to provoke these failure modes. The engineers again worked cross-functionally to determine how the product should behave in typical and atypical scenarios. These discussions allowed us to define over 1,000 test conditions.
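To make this concrete, the short sketch below shows one way such a catalog of conditions can be enumerated by crossing failure modes with usage scenarios. The specific modes and scenarios named here are illustrative assumptions, not the project's actual lists.

    from itertools import product

    # Illustrative failure modes and scenarios; the project's actual catalog
    # was developed cross-functionally and ran to over 1,000 conditions.
    FAILURE_MODES = [
        "corrupted OS image",
        "corrupted application image",
        "connection dropped mid-download",
        "power loss during update",
        "local storage full",
    ]
    SCENARIOS = [
        "first boot after unpacking",
        "routine nightly update",
        "user-requested update",
        "update after a long period offline",
    ]

    # Cross every failure mode with every scenario to produce numbered conditions.
    test_conditions = [
        {"id": f"TC-{i:04d}", "failure_mode": mode, "scenario": scenario}
        for i, (mode, scenario) in enumerate(product(FAILURE_MODES, SCENARIOS), start=1)
    ]

    for condition in test_conditions[:3]:
        print(condition["id"], "-", condition["failure_mode"], "during", condition["scenario"])

In practice each generated condition would then be fleshed out into a full test case with setup steps and expected results.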

Two areas of testing presented unique challenges. First, the appliance exchanges datasets with the server farm. These datasets may contain executable software, such as the operating system, the browser, the address book, or the e-mail applications; non-executable data (called "content") in the form of e-mail, address book entries, news, or weather; or any combination of these sets of bits. Corruption or loss of any of these items could impair the operation of the device. Indeed, damage to a new OS or application could render the appliance entirely and permanently unusable.

To test this possibility, we put the appliances through extensive update tests. These tests covered storage limitations, power- and connection-failures, deliberately corrupted operating systems and applications, and other error-forcing situations, as well as normal uploads and downloads. We tested tens of thousands of executable (OS and application) update events and data (content, address book entry and e-mail) update events during the System Test effort.
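The sketch below illustrates the flavor of one such error-forcing update test: deliberately truncate a downloaded image, as a power or connection failure would, and confirm that verification rejects it so the previously installed software stays intact. The function names and the checksum-based verification are assumptions for illustration, not the appliance's actual update protocol.

    import hashlib
    import random

    def verify_image(image: bytes, expected_sha256: str) -> bool:
        """Reject any image whose checksum does not match the update manifest."""
        return hashlib.sha256(image).hexdigest() == expected_sha256

    def simulate_interrupted_download(image: bytes) -> bytes:
        """Model a power or connection failure by truncating the payload."""
        cut = random.randint(1, len(image) - 1)
        return image[:cut]

    def test_update_survives_interruption(good_image: bytes, manifest_sha256: str) -> None:
        damaged = simulate_interrupted_download(good_image)
        # A truncated image must never pass verification; the appliance should
        # keep running its previously installed OS and applications.
        assert not verify_image(damaged, manifest_sha256)

The same pattern extends to the other error-forcing situations, such as deliberately corrupted images or exhausted local storage.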

One aspect of this kind of testing we especially recommend is end-to-end testing. For example, if the device ships directly from the factory with the software installed, make sure the first "boot, connect, and update" sequence works properly. Errors here will render the appliance dead on arrival, a decidedly negative customer experience of quality.
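As a rough illustration, a first-boot smoke test might be scripted along these lines; the appliance interface and its method names are hypothetical stand-ins, since the real check was run against physical units exactly as they left the factory.

    def first_boot_smoke_test(appliance) -> None:
        # All methods on `appliance` are hypothetical placeholders for the
        # real harness that drove factory-fresh units end to end.
        assert appliance.power_on(), "unit must boot on the factory-installed software"
        assert appliance.dial_and_connect(), "unit must reach the server farm on the first attempt"
        update = appliance.run_initial_update()
        assert update.succeeded, "the first update must complete without user intervention"
        assert appliance.reboot_and_verify(), "unit must come back up on the updated software"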

The second challenge involved usability. While usability is an important quality concern for most PC software and hardware vendors today, they have standards to guide their efforts. Microsoft Windows applications adhere to the Microsoft look and feel, while Apple applications use the Macintosh user interface style. These interfaces are too difficult and complex for our target customer, so we started over. Experienced usability designers worked and reworked the interfaces for the home screen, the content, the browser, the e-mail application, the address book, and the overall navigational paradigms. Early prototypes were taken to locations like airports for target market feedback.

Through an extensive Beta program that included hundreds of users, we obtained more insight from computer-averse users. The Beta program began before System Test and continued up to system release.

Finally, we made sure that the test team saw the product through the customer's eyes. We held frequent discussions about what a computer-inexperienced customer's reasonable expectations would be. The test engineers and test manager sensitized the test technicians to the need to remain critically aware of the user interface as they ran tests, and to verify correct behavior beyond simply the expected results of the test case. To ensure that our previous experience with, and acceptance of, PC quality glitches didn't color our findings, we all adopted an active bias towards reporting problems. Any behavior that might mislead, bewilder, intimidate, or anger a customer, regardless of why it might happen, we reported as a bug.

To implement the tests, we used a combination of manual test suites with automated support tools. Test engineers wrote manual test cases that test technicians keyed in and evaluated on the appliances. We used programs running on load generators to create representative, stress, and peak usage profiles on the server systems. Figure two illustrates the approach, omitting some components to focus on the elements that interfaced with testers or test tools. We had
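For a sense of how such a load generator can be structured, the sketch below replays a mix of appliance sessions against the servers at configurable rates. The session mix and the representative, stress, and peak rates shown are assumed values for illustration, not the measured usage profile.

    import random
    import time

    # Assumed session mix and rates (sessions per minute) for illustration only.
    SESSION_MIX = ["mail_sync", "content_download", "address_book_update"]
    PROFILES = {"representative": 5, "stress": 50, "peak": 200}

    def run_profile(name: str, duration_s: int, start_session) -> None:
        """Fire appliance-like sessions at the servers at the chosen profile's rate.

        `start_session` is a caller-supplied callable that drives one session
        against the server farm (a stand-in for the real client driver).
        """
        interval = 60.0 / PROFILES[name]
        deadline = time.time() + duration_s
        while time.time() < deadline:
            start_session(random.choice(SESSION_MIX))
            time.sleep(interval)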

About the author

Rex Black

Rex Black is President and Principal Consultant of RBCS, Inc., a consultancy that provides testing experts worldwide, serving clients such as Bank One, Cisco, Hitachi, IMG, and Schlumberger in consulting, training, and hands-on implementation. He has written Managing the Testing Process and Critical Testing Processes, along with numerous articles, conference papers, and keynote speeches at international conferences.
