Software bugs are notoriously difficult to eradicate. Traditional quality assurance techniques like testing and software inspections (sometimes called code reviews) find serious bugs, but too many bugs slip through.
This is of particular concern for embedded software developers because embedded software works behind the scenes in ways that users take for granted; in some cases a user's safety may depend on the correct functioning of the software. Corrective actions—like rebooting or updating the software—are disruptive or even impossible.
The Zune bug, where an infinite loop occurred because the device was unprepared for a leap year, is a recent example of a simple embedded software bug that rendered a device useless. In this case, users were deprived of their music for 24 hours. In safety-critical devices like medical devices and spacecraft, similar bugs have led to extensive financial losses, injuries, and even deaths.
Many tools and techniques have been applied to cut the number of bugs in embedded software. Some reduce the odds of a bug being created, while others detect bugs before software is shipped.
These tools and techniques are often complementary, and therefore suited to a defense-in-depth approach where they are applied together. Traditional techniques are increasingly supplemented with the use of static analysis tools and the adoption of coding rules.
The role of static analysis
A static analysis tool is like an automatic reviewer for your code. It reads the source code (without executing it) and looks for cases where it will behave in an undesirable manner—for example, dereferencing a null pointer, dividing a number by zero, or overflowing a memory buffer.
Static analysis tools do not depend on sample input—they can infer the software's behavior based on just the source code. When a bug is found, the tool reports its location to a software engineer, along with the information needed to diagnose the problem.
Since they do not depend on sample input, static analysis tools can investigate program behavior in corner cases that are not anticipated by testers and human inspectors.
While no tool can find all bugs, modern static analysis tools generate valuable results with minimal false positives, even for projects with millions of lines of code.
Coding rules and the Power of Ten
Software engineers have long argued about issues like code layout and naming conventions. Coding rule efforts like Gerard Holzmann's “Power of Ten” and the Motor Industry Software Reliability Association (MISRA) guidelines for C and C++ are more principled, representing a codification of “best practices” for embedded software development.
They focus on improving software by making it harder to create (and easier to spot) bugs. These rules forbid constructs that are confusing, complex, or subject to varying interpretations by human readers and compilers.
The resulting code is more portable and predictable. Bugs are forced to sit in plain sight instead of hiding behind intricate constructs, and are therefore more likely to be spotted by software writers, inspectors, and analysis tools.
Some projects may be put off by the effort required to learn and apply a large number of rules (the MISRA C guidelines include 141 rules). To assuage such concerns, Gerard Holzmann, of JPL's Laboratory for Reliable Software, developed the Power of Ten rules.
These ten rules were chosen to deliver the most bang for the buck: the list is short enough to memorize, but covers many sources of bugs in embedded software. Here's a synopsis of the Power of Ten rules (full details are available at spinroot.com):
1. Restrict to simple control flow constructs. Do not use goto statements, setjmp, longjmp, or recursion.
2. Give all loops a fixed upper bound.
3. Do not use dynamic memory allocation after initialization.
4. Limit functions to no more than 60 lines of text.
5. Use a minimum of two assertions in every function of more than 10 lines.
6. Declare data objects at the smallest possible level of scope.
7. Check the return value of all non-void functions, and check the validity of all function parameters.
8. Limit the use of the preprocessor to file inclusion and simple macros.
9. Limit the use of pointers. Use no more than 1 level of dereferencing per expression.
10. Compile with all warnings enabled, in pedantic mode, and use one or more modern static source code analyzers.
Reinforcing each other
Coding rules and static analysis tools are most effective when they work together. In addition to reporting bugs like null pointer dereferences, static analysis tools can flag violations of coding rules.
This automates much of the work of checking compliance with rules, freeing inspection teams to focus on higher-level concerns such as algorithm design or meeting project requirements.
Our tool, for example, comes with built-in support for the Power of Ten rules. It is one of several tools used at JPL to check coding rule compliance and to search for bugs in software being developed for upcoming missions.
Coding rules can also make analysis tools more effective. The Power of Ten rules discourage the use of features—such as recursion and indirect function calls—that make it more difficult for static analysis tools to accurately infer a program's behavior. (Human software inspectors will appreciate the clarity and simplicity of the code too.)
To reduce bugs in embedded software, complement your existing code quality practices by adopting appropriate coding rules and using a modern static analysis tool. The Power of Ten rules are a short but effective set of rules that are easy to adopt. Select a static analysis tool that works with your coding rules so that engineers can see rule violations alongside other problems detected by the tool.
Michael McDougall is a GrammaTech Senior Scientist. Dr. McDougall received a B.Sc. in Mathematics and Computer Science from McGill University in 1997 and a PhD in Computer Science from the University of Pennsylvania in 2005. His graduate work focused on applying formal reasoning to problems in software engineering and security. Since joining GrammaTech, he has led research projects on the use of software analysis and visualization techniques to achieve high-quality software.