Automated tools streamline software test and certification
The LDRA tool suite also performs control flow analysis, both on the program calling hierarchy and on the individual procedures. The rules of structured programming are applied and any violations are reported. The output of the control flow analysis is a call graph, showing exactly which functions are invoked by which others (Figure 6, below).
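By way of a simple illustration (a contrived fragment, not taken from the LDRA documentation), the procedures below would yield a call graph with main() at the root, calling read_sensor() and log_value(), and log_value() in turn calling format_value():

// Hypothetical C++ fragment whose calling hierarchy a call graph would capture.
#include <cstdio>

static int read_sensor() { return 42; }                // leaf node: calls nothing

static void format_value(char *buf, int size, int v)   // called only by log_value()
{
    std::snprintf(buf, size, "value=%d", v);
}

static void log_value(int v)                           // calls format_value()
{
    char buf[16];
    format_value(buf, sizeof buf, v);
    std::puts(buf);
}

int main()                                             // root: calls read_sensor() and log_value()
{
    log_value(read_sensor());
    return 0;
}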
Figure 6: Colorized call graph reveals the degree to which statement calls are executed for a given procedure
Static Data Flow Analysis follows variables through the source code and reports any anomalous use. This is performed at the procedure level and also as part of the system-wide analysis. This powerful technique can detect a number of serious problems, such as a variable that is read before it has been initialized or an array that is accessed outside of its bounds.
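The kinds of anomalies involved are easy to picture with a contrived fragment (hypothetical, for illustration only): here the variable total is read before it is ever written, and the loop condition allows the index to run one element past the end of the array.

// Hypothetical fragment containing two data flow defects of the kind
// static data flow analysis is designed to catch.
int average(const int samples[], int count)
{
    int total;                        // declared but never initialized
    for (int i = 0; i <= count; ++i)  // off-by-one: i == count indexes past the end
        total += samples[i];          // 'total' is read before it is first written
    return total / count;
}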
Static analysis by itself is useful but not sufficient. Developers need to be able to perform black-box and functional testing, as well as dynamic analysis. Of course, this testing needs to be performed against the requirements with test outcomes being fed back through the requirements traceability techniques discussed earlier.
Using the LDRA tool suite, developers can perform dynamic analysis, system testing, and even unit testing. The term unit can refer to a single function, a number of functions, a whole file, or even several files. Unit tests can be executed on the host, but preferably also on the target to ensure that each “unit” functions as expected. During unit testing, any missing functions need to be stubbed and a harness created in order to run the tests. Manually creating stubs and harnesses, as well as downloading and executing tests on the target, can be very tedious, but with the right unit testing tool all of these tasks can be seamlessly automated.
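As a rough sketch of what a stub and harness look like (hand-written here for illustration; a unit testing tool would generate the equivalent automatically), the unit under test below calls a hardware access routine that does not exist on the host, so a stub stands in for it and a small main() drives the test cases:

// Hypothetical stub and test harness for a unit under test.
#include <cassert>

int read_adc_channel(int channel);   // hardware-dependent, implemented elsewhere

// Unit under test: converts a raw 10-bit ADC reading into a 0-100 scale.
int read_level_percent(int channel)
{
    int raw = read_adc_channel(channel);
    return (raw * 100) / 1023;
}

// Stub: replaces the missing hardware function with a controllable value.
static int stubbed_adc_value = 0;
int read_adc_channel(int /*channel*/) { return stubbed_adc_value; }

// Harness: exercises the unit with known inputs and checks the outcomes.
int main()
{
    stubbed_adc_value = 0;
    assert(read_level_percent(0) == 0);

    stubbed_adc_value = 1023;
    assert(read_level_percent(0) == 100);

    return 0;
}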
Structural Coverage Analysis (SCA)
The problem with testing is finding a way to ensure that it is sufficient. The solution is to measure the effectiveness of the testing using Structural Coverage Analysis (SCA). SCA, which uses code coverage metrics, measures the degree to which the source code of a system has been executed during requirements-based testing (Figure 7, below). Through these practices, developers can ensure that code has been implemented to address every system requirement and that the implemented code has been completely tested.
Figure 7: Structural Coverage Analysis (SCA) reporting reveals which parts of the source code have been executed and which have not. For the file cashregister.cpp, for example, analysis shows 100% statement coverage but only 51% branch/decision coverage.
Clicking on a given line in the report allows the user to drill down to the source code itself, which is colorized to show code coverage. The tool suite utilizes the static analysis information to find the branching points and monitor them to determine when a particular block of code has been executed. This can be run on the host, a simulator, or the target itself. This code coverage information can then be mapped to requirements, demonstrating that testing has completely covered a given requirement.
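The gap between the statement and branch/decision figures reported in Figure 7 arises from constructs like the one below (a contrived example, not taken from cashregister.cpp): a single test that supplies a negative amount executes every statement, giving 100% statement coverage, yet the false outcome of the if decision is never taken, so branch/decision coverage remains incomplete until a second test with a non-negative amount is added.

// Hypothetical function illustrating statement versus branch/decision coverage.
int apply_refund(int balance, int amount)
{
    if (amount < 0)      // decision with two outcomes: true and false
        amount = 0;      // executed only when the caller supplies a negative amount
    return balance + amount;
}

// A single call such as apply_refund(100, -5) executes every statement
// (100% statement coverage) but exercises only the 'true' branch.
// Adding apply_refund(100, 5) covers the 'false' branch as well.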
Selling into safety-critical market sectors increasingly requires certifying software to the appropriate standards. Automated tools like the LDRA tool suite streamline the process, simplifying requirements traceability, structural code coverage, and adherence to coding standards while mitigating risk. With the LDRA tool suite's support for analysis of C, C++, Java, Ada, and assembler languages, design teams can get their products to market faster and more economically while ensuring that their customers will be satisfied and that the product will deliver reliable performance over the long haul.
Jared Fry is a Field Application Engineer for LDRA Ltd. He graduated from Western New Mexico University with degrees in Mathematics and Computer Science. His career began in the defense industry at Lockheed Martin, where he served as a software engineer on projects ranging from missile and radar systems to training simulations and software testing. With LDRA he leverages that experience as a consultant, assisting clients throughout the development process to produce quality, certifiable products.
Shan Bhattacharya is a Field Application Engineer for LDRA Ltd. He graduated from Cameron University and began his career in factory automation and robotics. He continued his career with various defense contractors, including Lockheed Martin, where he served as a Lead Engineer and finished his time as a Deputy IPT Lead. Shan has been with LDRA since 2007 and provides consultation for clients in various industries, focusing on requirements management, software certifications, and development best practices.

