What IoT and Embedded Device Testers Can Learn from the Volkswagen Emissions Scandal

Summary:

In 2015, it was discovered that Volkswagen had equipped millions of its cars with software to cheat on diesel emissions tests. It was a team of independent testers that uncovered the fraud. Jon Hagar tells testers what they can take away from the scandal and offers recommendations for improving the test industry for IoT and embedded systems.

After years of promoting “clean diesel” as an alternative to hybrid and electric vehicles, Volkswagen was discovered in 2015 to have equipped eleven million of its vehicles worldwide with software to cheat on diesel emissions tests. The software could detect when a car was running the emissions drive cycle set by the Environmental Protection Agency. In that test mode, the cars appeared to comply with federal emissions limits, but in normal driving, the computer switched to a mode that produced nitrogen oxide emissions up to forty times the US limit.

Charges and consequences continue in the case. As further details emerge, we will know more about the ethical implications of the scandal.

To me, there is a lack of accountability in this situation. Should Volkswagen engineers and testers have noticed what was going on and spoken up sooner? The simple answer is yes. However, having worked for large companies, I know that it is very easy to hide under the corporate banner and even convince yourself that something is good when, in hindsight, it was obviously bad. We testers need to remember that we are all accountable for our character and our behavior.

Exploratory Testing Uncovered the Fraud

I recommend practicing many levels of and approaches to testing, no matter what system is under test. The lowest tier is developer-level structural testing and analysis. From there, teams should move to integration testing, simulation-analysis validation, requirements verification checking, and on to full system testing in the field, to name just a few of the tests that are possible. Teams should also use different test environments, such as modeling, simulations, emulators, hardware-in-the-loop rigs, test labs, and field testing.
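
To make the lowest tier concrete, here is a minimal sketch, in Python, of the kind of developer-level test a team might write against a hypothetical emissions-control routine. The function name, the limit value, and the “test cycle” flag are invented for this example and are not drawn from any real ECU code. The last test checks a property a defeat device would violate: the same inputs should produce the same output whether or not the software believes it is being tested.

    # Hypothetical example: developer-level tests for an emissions-control
    # decision routine. All names and numbers are illustrative, not taken
    # from any real ECU implementation. Run with pytest or call directly.

    NOX_LIMIT_G_PER_KM = 0.08  # assumed regulatory limit for this sketch

    def exhaust_treatment_level(nox_g_per_km, on_test_cycle):
        """Return the commanded treatment level (0.0-1.0) for a NOx reading."""
        # A compliant implementation ignores on_test_cycle entirely.
        if nox_g_per_km <= NOX_LIMIT_G_PER_KM:
            return 0.2  # maintenance dosing only
        return min(1.0, nox_g_per_km / NOX_LIMIT_G_PER_KM * 0.2)

    def test_low_emissions_gets_maintenance_dosing():
        assert exhaust_treatment_level(0.05, on_test_cycle=False) == 0.2

    def test_high_emissions_increases_treatment():
        assert exhaust_treatment_level(0.4, on_test_cycle=False) > 0.2

    def test_behavior_does_not_depend_on_test_cycle_detection():
        # A defeat device would fail this: identical inputs must give the
        # same output whether or not the software thinks it is on a test.
        for nox in (0.02, 0.08, 0.16, 0.4, 1.0):
            assert exhaust_treatment_level(nox, True) == exhaust_treatment_level(nox, False)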

Mixing and matching the test levels and environments is an essential part of any test job when doing planning, strategy, design, and implementation. In the general context of embedded systems and the Internet of Things, there is no “best” approach to mix and match test concepts. Teams must consider factors such as risk, bias, cost, schedule, functional and nonfunctional tests, and device features.

The basic functions of the Volkswagen diesel cars were met; that much was shown. The problems were in the quality attributes that were not met, such as emissions standards, mileage, and power delivery. When the scandal came to light, lawsuits were filed, VW’s stock price crashed, cars could not be sold, and there was a lot of bad press. Who among us would want this for our company or product?

The problem was found by totally independent testing in the field. Those engineers were truly thinking outside the box: they designed a special “sniffer” test tool that could measure emissions while the vehicle was being driven. The results differed from those of the “nonfield” lab tests used by the government.

The independent team was doing exploratory testing in that they did not know exactly what they were looking for. They likely suspected that bias existed in developer and independent lab testing, so they moved to a field test environment to obtain objective results. This is a good approach to testing.
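
To picture what such a field comparison can look like, here is a small, hypothetical sketch: paired lab-cycle and on-road measurements for the same vehicles, flagged when the on-road figure exceeds the lab figure by a suspiciously large factor. The vehicles, numbers, and factor-of-five threshold are all invented for illustration.

    # Hypothetical sketch: compare lab-cycle emissions with on-road
    # ("sniffer") measurements and flag large discrepancies. All vehicles,
    # numbers, and thresholds are invented for illustration.

    SUSPICION_FACTOR = 5.0  # flag if on-road NOx is 5x the lab result

    measurements = [
        # (vehicle, lab NOx g/km, on-road NOx g/km)
        ("vehicle_A", 0.06, 0.07),   # consistent with the lab result
        ("vehicle_B", 0.05, 1.10),   # wildly higher in the field
        ("vehicle_C", 0.07, 0.40),   # worth a closer look
    ]

    def flag_discrepancies(rows, factor=SUSPICION_FACTOR):
        """Return vehicles whose field emissions far exceed their lab results."""
        flagged = []
        for vehicle, lab, field in rows:
            ratio = field / lab
            if ratio >= factor:
                flagged.append((vehicle, round(ratio, 1)))
        return flagged

    if __name__ == "__main__":
        for vehicle, ratio in flag_discrepancies(measurements):
            print(f"{vehicle}: on-road NOx is {ratio}x the lab-cycle result")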

Practical Takeaways for IoT and Embedded Device Testers

When it comes to improving the test industry for critical systems, such as medical devices, self-driving cars, and control systems, there are many recommendations for testers to consider in addition to more independent testing. These include uniform standards, improved government regulations, more ethics training for employees, and increasing whistle-blower protection and incentives. I will not speak to the effectiveness or efficiency of these ideas, but testers can think about which ones might be reasonable for their contexts.

Embedded and IoT devices can differ from other IT systems in how their software, hardware, and system elements interact with the environment and with people, so they deserve to be thought about differently when it comes to testing. This is particularly true when you consider the dire results that can occur if the software fails or malfunctions. Consequently, here are some ideas testers may wish to consider when working with these important systems:

  • Expand developer-level testing within your teams
  • Advocate for more in-depth verification and validation efforts at many levels of testing
  • Watch for individual and team bias
  • Drive testing with “break it” and “run scared” viewpoints
  • Have better internal test labs using modeling, simulation, and automation
  • Plan multiple levels of software, hardware, and systems testing driven by risk- and math-based approaches
  • Develop more knowledge and skill to become more creative testers
  • Conduct continuous and ongoing field testing and demonstration
  • Use test and field data analytics driven by AI and taxonomies
  • Practice critical thinking and listening with ethical behaviors at all times (and don’t be afraid to speak up)

In the field, testers can test and monitor systems to record data for later analytics. Field testing can expose assumptions that were wrongly made by developers or during lab testing. Independent verification and validation (IV&V) adds a further level of assurance; it comes from the space industry, which continues to create autonomous systems that last many years. When done correctly, IV&V has a proven track record. As we move to self-driving cars with AI and features that are hard to test, maybe IV&V should be more strongly considered.
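
As one way to picture that kind of field monitoring, here is a hedged sketch of a simple recorder that timestamps readings and notes limit violations for later analysis. The readings, the limit, and the log format are assumptions made for this example; a real system would pull values from on-board sensors or a telematics feed.

    # Hypothetical sketch: log field readings with timestamps for later
    # analytics and note any that exceed an assumed limit. The reading
    # source, limit, and log format are invented for this example.

    import csv
    import time

    NOX_LIMIT_G_PER_KM = 0.08  # assumed limit for this sketch

    def record_field_data(readings, path="field_log.csv"):
        """Append (timestamp, reading, over_limit) rows to a CSV log."""
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for value in readings:
                writer.writerow([time.time(), value, value > NOX_LIMIT_G_PER_KM])

    # In a real system these values would stream from an on-board sensor;
    # they are hard-coded here so the sketch runs on its own.
    record_field_data([0.05, 0.09, 0.32, 0.06])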

Finally, pay attention to the Volkswagen story as more information is uncovered. We must learn from history to avoid repeating it.

User Comments

3 comments
Ed Weller

Jon,

Great article. Based on attendees in multiple test classes, my assessment is that we are a long way from management understanding the necessity for what you are recommending. As long as "Test" is viewed as an overhead function, change (the value of test, investment in testers) will be a tough road. It is up to us to show the value of testing as a positive contributor to the company, organization, or project. Your article certainly makes the case.

January 17, 2017 - 6:40pm
Jon Hagar

We keep chipping away at the views people have about test. Enlightened organizations seem to get it. Ethical testers may get it too. We build understanding as time goes on.

January 21, 2017 - 9:01am
Mark Bentsen

Just remember the "S" in IoT stands for security...

January 24, 2017 - 3:53pm
