Verifying embedded software functionality: Why it’s necessary
Editor’s Note: In this four-part series Abhik Roychoudhury, author of Embedded Systems and Software Validation, explains why it is important for embedded developers to learn about new techniques such as dynamic slicing, metric-based fault localization, and directed testing for assessing software functionality. Part 1 covers what must be done and how to achieve it.
Embedded software and systems have come to dominate the way we interact with computers and computation in our everyday lives. Computers are no longer isolated entities sitting on our desks. Instead, they are nicely woven and integrated into our everyday lives via the gadgets we directly or indirectly use—mobile phones, washing machines, microwaves, automotive control, and flight control.
Indeed, embedded systems are so pervasive that they perform the bulk of the computation today—putting forward “embedded computing” as a new paradigm to study. In this series, we focus on validation of embedded software and systems, with the goal of developing embedded systems with reliable functionality and timing behavior.
Not all embedded systems are safety-critical. On the one hand, there are safety-critical embedded systems such as automobiles, transportation (train) control, flight control, nuclear power plants, and medical devices. On the other hand, there are the more vanilla, or less safety-critical, embedded systems such as mobile phones, HDTV, controllers for household devices (such as washing machines, microwaves, and air conditioners), smart shirts, and so on.
Irrespective of whether an embedded system is safety-critical or not, the need for integrating validation into every stage of the design flow is clearly paramount. Of course, for safety-critical embedded systems, there is a need for more stringent validation—so much so that formal analysis methods, which give mathematical guarantees about functionality/timing properties of the system, may be called for at least in certain stages of the design.
Our focus in this series is on what has been learned about software validation methods, and how they can be woven into the embedded system design process. Before proceeding further, let us intuitively explain some common terminologies that arise in validation—testing, simulation, verification, and performance analysis.
Testing refers to checking that a system behaves as expected for a given input. Here the system being checked can be the actual system that will be executed. However, note that it is only being checked for a given input, and not all inputs. Simulation refers to running a system for a given input. However, simulation differs from actual system execution in one (or both) of the following ways.
• The system being simulated might only be a model of the actual system to be executed. This is useful for functionality simulation—check out the functionality of a system model for selected inputs before constructing the actual system.
• The execution platform on which the system is being simulated is different from the actual execution platform. This situation is very common for performance simulations. The execution platform on which the actual system will be executed may not be available, or it might be getting decided through the process of performance simulations. Typically, a software model of the execution platform might be used for performance simulations.
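The distinction drawn above can be made concrete with a small sketch. The code below is a hypothetical example (the thermostat model and all names in it are illustrative assumptions, not from this article): a functionality simulation runs a *model* of the intended control logic for a few selected inputs before the actual system is built, which is also why it carries the same limitation as testing—only the chosen inputs are checked.

```python
# Hypothetical functionality-simulation sketch: a *model* of a thermostat
# controller is exercised on selected inputs before the real system exists.
# (The controller, its names, and its thresholds are illustrative only.)

def thermostat_model(temp_c, setpoint_c=22.0, hysteresis=1.0):
    """Model of the intended control logic: heat below the band,
    switch off above it, and hold inside the hysteresis band."""
    if temp_c < setpoint_c - hysteresis:
        return "heat_on"
    if temp_c > setpoint_c + hysteresis:
        return "heat_off"
    return "hold"

# Simulate the model for a few selected inputs. Like testing, this checks
# the behavior only for these inputs, not for all possible inputs.
for temp in (18.0, 22.0, 26.0):
    print(temp, thermostat_model(temp))
```

A performance simulation would differ mainly in what the model captures: instead of (or in addition to) the control decision, it would account for execution costs on a software model of the target platform.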
Formal verification refers to checking that a system behaves as expected for all possible inputs. Because exhaustive testing is inefficient or even infeasible, verification may be achieved by statically analyzing a system model (which may be represented by a structure such as a finite-state machine).
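To illustrate how a finite-state model makes checking *all* behaviors possible, here is a minimal sketch of explicit-state verification (the two-light intersection model is an illustrative assumption, not from this article): every reachable state of the machine is enumerated, and an invariant is checked on each one, so the result is a guarantee over all inputs/behaviors of the model rather than over selected test inputs.

```python
# Minimal explicit-state verification sketch (hypothetical traffic-light
# model): exhaustively explore all reachable states of a finite-state
# machine and check that an invariant holds in every one of them.

from collections import deque

# States are (north_south, east_west) light colors:
# "g" = green, "y" = yellow, "r" = red.
transitions = {
    ("g", "r"): [("y", "r")],
    ("y", "r"): [("r", "g")],
    ("r", "g"): [("r", "y")],
    ("r", "y"): [("g", "r")],
}

def verify(initial, invariant):
    """Breadth-first search over the whole reachable state space.
    Returns (True, None) if the invariant holds everywhere, otherwise
    (False, counterexample_state)."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        if not invariant(state):
            return False, state
        for nxt in transitions.get(state, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

# Safety property: the two directions are never green simultaneously.
ok, counterexample = verify(("g", "r"), lambda s: s != ("g", "g"))
print(ok, counterexample)
```

Real model checkers handle vastly larger state spaces with symbolic representations and abstraction, but the principle is the same: static, exhaustive analysis of a model in place of exhaustive testing of the system.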
Finally, we note that formal verification methods have conventionally been used for giving strict mathematical guarantees about the functionality of a system. However, to give strict guarantees about performance (for example, an upper bound on the execution time of a given piece of software), one needs to employ mathematical analysis techniques for estimating performance. Such techniques often go by the name of performance analysis.
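The flavor of such an analysis can be sketched with a toy example (the control-flow graph and its per-block cycle costs below are illustrative assumptions, not from this article): rather than timing test runs, a static worst-case execution time (WCET) analysis bounds execution time by taking the costliest path through the program's control-flow graph.

```python
# Toy static WCET sketch: bound the execution time of a loop-free
# program by the costliest path through its control-flow graph.
# (Block names and cycle costs are hypothetical.)

from functools import lru_cache

# Basic block -> (cost in cycles, successor blocks). Loop-free (a DAG);
# real analyses must also bound loop iteration counts.
cfg = {
    "entry": (2, ["cond"]),
    "cond":  (1, ["then", "else"]),
    "then":  (8, ["exit"]),
    "else":  (3, ["exit"]),
    "exit":  (1, []),
}

@lru_cache(maxsize=None)
def wcet(block):
    """Worst-case cycles from this block to program exit: the block's own
    cost plus the maximum over its successors."""
    cost, succs = cfg[block]
    return cost + max((wcet(s) for s in succs), default=0)

print(wcet("entry"))  # upper bound over all paths through the DAG
```

Note that this bounds *every* path, including ones no test input may exercise, which is what makes the result a guarantee rather than a measurement; industrial WCET tools additionally model caches, pipelines, and infeasible paths to tighten the bound.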