Experiences Testing Safety-Critical Software

[article]
Member Submitted
Summary:
This paper presents experiences in testing critical software that supports flight systems developed by Lockheed Martin Astronautics in Denver, Colorado. The approach has not been proven in an academic sense, but it has been demonstrated over the years to result in software that successfully performs missions. It is based on teams with the right balance of software and systems engineering skills, combined with a defined process.

This paper presents our experiences in testing critical software that supports flight systems developed by Lockheed Martin Astronautics (LMA) in Denver, Colorado. The approach has not been proven in an academic sense, but it has been demonstrated over the years to result in software that successfully performs missions. It is based on teams with the right balance of software and systems engineering skills, combined with a defined process.

We explore a generalized process that has been applied to several critical software programs at Lockheed Martin, share some experiences in applying that process, and discuss the importance of the engineering team. We concentrate on testing, or Verification and Validation (V&V), of these systems, set in the context of the overall development processes employed at LMA.

Some authors and researchers would have you believe that the process is what matters most: select the right or “latest and greatest” method and a good set of techniques, buy your support software (tools), add the schedule and budget to support these, and presto, you have a reliable and safe critical software system. From our experience, some of this is true; however, we advocate that you cannot overlook the human factor. Good engineers who think are necessary for the success of the process, and there is no substitute for human thoughtfulness. Additionally, no single engineer can understand all aspects of these complex systems, which requires teams with diverse skill sets as well as a good process.

Click on the file attachment below to read the complete paper.

About the author

Jon Hagar

Jon Hagar is a software tester, thinker, and teacher supporting software product integrity, verification, and validation, in which he has worked for more than thirty-five years. Jon is the IEEE project editor for ISO/IEEE 29119, co-chair for the OMG UML Testing Profile, and works on IEEE 1012 V&V plans. He consults, publishes, teaches, and mentors regularly, and is the author of Software Test Attacks to Break Mobile and Embedded Devices (CRC Press, 2013), a book on mobile and embedded software testing.
