Adopting aerospace development and verification standards: Part 2 – Source/object code verification - Embedded.com

Adopting aerospace development and verification standards: Part 2 – Source/object code verification

Editor's note: This is the second in a two-part series on the adoption of the avionics industry’s comprehensive certification and verification process by other embedded sectors.

Source code verification checks that test cases have exercised all applicable requirements and highlights the sections of code that have and have not been executed. The identification of non-executed code pinpoints a number of potential shortcomings:

  • Errors in the test cases
  • Imprecise or inadequate requirements
  • Not all requirements have been tested
  • Dead code, i.e., code which is impossible to execute

Code coverage has several levels of precision; as a minimum, coverage reports need to show whether each line of source code has been executed at least once by the set of test cases. Statement coverage, the lowest level of precision, is sufficient to satisfy the requirements of DO-178C for software components assessed at safety level C. Greater precision is required as the safety level rises toward level A.

DO-178C’s requirement for structural coverage analysis
Section 6.4.4.2a of the DO-178C standard links coverage precision to the software level: “Activities include … analysis of the structural coverage information collected during requirements-based testing to confirm that the degree of structural coverage is appropriate to the software level. Table A-7 formalizes the coverage which must be applied for each level.” An edited summary of that table is shown below:

An edited summary of Table A-7 of DO-178C Section 6.4.4.2a

Statement coverage may be sufficient to identify missing test cases, as illustrated in the following code snippet:

   if (reading > 10.0) {
      led_mode = 4;
      }
   else {
      led_mode = 5;
      }

With covered code highlighted in red in the coverage report, it is clear that a new test case is needed to exercise the situation where “reading” is not greater than 10.

Statement coverage adequately identifies dead code such as the following:

   if (reading > 10.0) {
      led_mode = 4;
      }
   else {
      led_mode = 5;
      }
   if (led_mode == 8) {
      update_panel();
      }

No matter how many additional test cases are created, the call to “update_panel” can never be reached. The root of the problem may be a design error in another part of the code, or the code in question may not trace to a requirement. Either way, the problem has been identified and can now be addressed.

Statement coverage simply indicates that a line of source code has been executed at least once. Typically, due to “if-then” branches and loops, there are several routes through a software component and, above safety level C, each route must be exercised and reported as covered. This is known as decision coverage and may be illustrated by the following code snippet:

   led_mode = 0;
   if (reading > 10.0) {
      led_mode = 4;
      }
   update_panel();

The report shows that we have exercised the code with values of “reading” up to 10.0 but not above. Statement coverage would highlight this too, of course; however, it would not show the converse, i.e., values of “reading” above 10.0 but not below. We also need to be sure that “update_panel” has been called both with “led_mode” set to 4 and with “led_mode” left at its initial value of 0.

The highest level of coverage precision required by DO-178C is modified condition/decision coverage (MC/DC). Reserved for software components assessed at safety level A, it subjects the component to exhaustive testing to prove that each decision tries every possible outcome, that each condition in a decision takes on every possible outcome, and that each condition in a decision is shown to independently affect the outcome of the decision. In simple terms, we are concerned with the permutations of a compound condition, as illustrated in this code snippet:

   if (reading > 10.0 or submode == 3) {
      led_mode = 4;
      }
   else {
      …

The coverage report needs to confirm that we have exercised the code where “reading” is both above 10.0 and below, in combination with “submode” being 3 and some other value, i.e., four permutations. (Strictly, MC/DC can be demonstrated with three of these four cases: for an “or” of two conditions, the true/true permutation is not needed to show the independent effect of each condition.)

Source code verification is vital in gauging the effectiveness of testing, whether in proving that all requirements have been satisfied or in uncovering a problem with the design or requirements. This task cannot practically be undertaken manually; it requires investment in automated tools.

Fortunately, most test tools offer highly automated coverage analysis that is virtually transparent. Additional benefits gained from this include a significant increase in the quality of the code and of the overall test process. These benefits greatly impact the later stages of integration testing and beyond in object code verification.

Object code verification
While a key testing element of many avionics programs, object code verification has been a relatively unused technique outside the avionics industry. However, the increasing sophistication and safety-critical nature of many modern embedded control applications has led non-avionics suppliers to adopt object code verification.

DO-178C’s requirement for object code verification
Section 6.4.4.2b (Structural Coverage Analysis) of the DO-178C standard describes the requirement as follows: “Structural coverage analysis may be performed on the Source Code, Object Code or Executable Object Code. Independent of the code form on which the structural coverage analysis is performed, if the software level is A and a compiler, linker or other means generates additional code that is not directly traceable to Source Code statements, then additional verification should be performed to establish the correctness of such generated code sequences.”

Object code verification focuses on how much the control flow structure of the compiler-generated object (machine) code differs from that of the application source code from which it was derived. Such differences may occur for a number of reasons, but compiler interpretation and optimization are primary causes. Given that traditional structural coverage techniques are applied at the source code level, whereas it is actually the object code that executes on the processor, differences in control flow structure between the two can result in significant gaps in the testing process.

As an illustration, refer to the two flow graphs in Figure 2, which are generated from the same procedure. Note how the object code flow graph on the left shows a branch which doesn’t appear in the source code flow graph on the right.


Figure 2: Flow graphs

Visible, easy-to-use reports like these help engineers to quickly build test cases that achieve 100% unit coverage. Without such reports, the effort required to identify each path through the object code would be much higher, resulting in longer timescales and higher cost.

Safety-critical software components in aerospace systems that are DO-178C Level A must undergo object code verification. This is arguably the toughest testing discipline for non-aerospace projects to adopt, but it must now be considered as more and more safety-critical software components are deployed in modern automobiles, medical equipment, and transport control systems. Similarly, critical components in telecom and financial systems are seeing increased quality requirements due to the high monetary cost of failure.

Fortunately, safety-critical components are typically a subset of the application as a whole. Even so, the effort of testing at the object code level can be significant and requires considerable resources in terms of time and money. Using automated, compiler-independent processes helps reduce overall development costs by considerable margins and ensures delivery of high-quality software components where the chance of failure has been reduced as close to zero as possible.

Conclusion
Working under the constraints of the DO-178B standard was mandatory for companies such as BAE Systems and Lockheed Martin as they developed software for the F-35 Lightning II Joint Strike Fighter, the Orion CEV [7], and other safety-critical projects, and DO-178C is now recognized as the primary standard for similar aerospace development. However, as the processes and directives of DO-178C are adopted as best practices for safety-critical systems outside the aerospace industry, non-avionics industries face the challenge of evolving their development processes and standards.

With the right tools and facilities, the scope of these challenges may be greatly reduced, enabling projects to realize the full potential and benefits that rigorous quality analysis, testing, and verification can bring in terms of increased code quality, improved reliability, and cost savings. Lockheed Martin’s ability to deliver the F-35 Lightning II for its first flight on time and on budget sends a message to other industries that software development and verification against rigorous and exacting standards is a discipline which may be confidently attempted and conquered.

Part 1 – A coding standards survey

References

1. RTCA Inc. (originally the Radio Technical Commission for Aeronautics) is a private, not-for-profit corporation that develops consensus-based recommendations regarding communications, navigation, surveillance, and air traffic management (CNS/ATM) system issues.

2. EUROCAE, the European Organization for Civil Aviation Equipment, is a nonprofit organization which provides a European forum for resolving technical problems with electronic equipment for air transport.

3. The Motor Industry Software Reliability Association (MISRA) is a collaboration between vehicle manufacturers, component suppliers and engineering consultants which seeks to promote best practice in developing safety-related electronic systems in road vehicles.

4. “Guidelines for the use of the C language in critical systems”, first published by MISRA Limited in October 2004 and again in March 2013 after comprehensive revision. These standards are complete reworks of the original set published in 1998.

5. “Joint Strike Fighter (JSF) Air Vehicle (AV) C++ Coding Standards for the System Development and Demonstration Program”, document number 2RDU00001 Rev D, June 2007. These standards build on relevant portions of the MISRA-C standards with an additional set of rules specific to the appropriate use of C++ language features (e.g., inheritance, templates, namespaces) in safety-critical environments.

6. The Chaos Report from the Standish Group has been published regularly since 1994. The 2006 report revealed that 35% of software projects could be categorized as successful, meaning they were completed on time, on budget and met user requirements. This is a marked improvement over 1994, when only 16.2% of projects were labeled as successful.

7. The Orion Crew Exploration Vehicle (CEV) is a spacecraft currently under development by NASA; the contract for its design and construction was awarded to Lockheed Martin in August 2006.

Mark Pitchford has over 25 years’ experience in software development for engineering applications. He has worked on many significant industrial and commercial projects in development and management, both in the UK and internationally, including extended periods in Canada and Australia. Since 2001, he has specialised in software test, and works throughout Europe and beyond as a Field Applications Engineer with LDRA Ltd.

Bill St. Clair is currently Director, US Operations for LDRA Technology and LDRA Certification Services and has more than 25 years in embedded software development and management. He has worked in the avionics, defense, space, communications, industrial controls, and commercial industries as a developer, verification engineer, manager, and company founder. He holds a U.S. patent for a portable storage system and is inventor of a patent-pending embedded requirements verification system. Bill’s leadership was instrumental in adapting requirements traceability into LDRA’s verification process.
