The education of embedded systems software engineers: failures and fixes

Robert Dewar, New York University and AdaCore

March 18, 2012

The dilemma of code reuse
In the real world, programmers try to invent as little as possible; the idea is to use existing code and available libraries. The ability to investigate, evaluate, and make use of such software components is a critical skill, but again one that is simply not taught.

Instead, one of two extremes is taken. At one extreme, students are told they must not use code they did not write themselves, or else it will be considered plagiarism and cheating.

So programmers come out of school with a severe predisposition to the NIH (“Not Invented Here”) syndrome that is often the root of unnecessary work and cost.

The other extreme may be termed “reuse without thought”. A student is given the job of adding a new widget to a Graphical User Interface and is able, with minimal code, to produce a fancy window-based application in minutes.

The result may look impressive, and the exercise may be considered an effective example of reuse, but it does not indicate how well the student understands the technical principles behind the pretty effects: threading issues, graphics technology, algorithm efficiency, etc.

Is it safe?
Critical systems in transportation, nuclear reactors, medical devices, and other domains depend on correctly functioning software. For those who think that software intrinsically has bugs, such systems seem to be ticking time bombs destined to bring about inevitable catastrophes. However, the track record in safety-critical industries such as aircraft avionics has been notably successful.

Despite some close calls (e.g., [5]), not one life has been lost on a commercial aircraft because of software failure. How is this achieved? The short answer is that very competent developers employ well-established practices, and safety certification standards (DO-178B for commercial avionics [6]) do an effective job in checking that such practices have been followed.

Interestingly, DO-178B has recently been updated based on experience, and the new DO-178C standard [7] recognizes the growing role of technologies such as model-based design and formal methods. Although Computer Science departments overseas seem to be evolving to encompass such developments in their curricula (for example, the new program at Newcastle University in the UK), the same cannot be said of the academic community in the US.

Beyond safety, the growing relevance of security in software (worrying about malicious attacks, including cyber-terrorism) introduces new concerns [8].

Building secure software requires new techniques and attention to security at the earliest stages of development: you cannot effectively add security as an afterthought.

In particular, formal methods based on mathematical analysis play a critical role in systems at the higher Evaluation Assurance Levels of the Common Criteria [9], since they allow developers to formally demonstrate relevant security properties of their software. Unfortunately, to follow up on a point made earlier, Computer Science students all too often lack the mathematical background to address these requirements.

Some may feel that safety- and security-related issues are too specialized to address in undergraduate courses, but that position is becoming less and less tenable.

In today’s interconnected world, more and more systems will come to be recognized as raising safety and/or security issues. To paraphrase one of the famous lines from President John Kennedy’s inaugural address from fifty years ago: “Understand both what the network can do to your system, and what your system can do to the network.”

Moving forward
The concerns about the education of embedded systems software engineers are not simply theoretical. As one industry spokesman pessimistically states [10]: “In industry surveys, over 80% of embedded software developers report using C or C++ as their primary programming language. Yet as a group, these programmers earned a failing grade on a multiple-choice quiz testing firmware-related C programming skills.”

This is a scary result, considering that embedded software inside medical devices, industrial controls, anti-lock brakes, and cockpits places human lives at risk every day.

One might hope that such issues would be addressed in current curricula revision efforts, for example the ACM/IEEE Computer Science Curricula 2013 [11].

This work is an attempt to keep academia in sync with technological evolution in the computer field. But it basically proposes a framework that individual institutions are expected to adapt, and in any event it still does not adequately address the issue of mathematical prerequisites or many of the other “real world” issues described above.

Progress will require some serious rethinking of both the goals and the tactics of Computer Science education. Partnerships between industry and academia will help, so that “lessons learned” from practice can become material for case studies.

The choice of programming languages, especially in introductory courses, needs to be based on pedagogical merit rather than popularity: the language du jour can quickly become passé. Mathematics courses need to be better integrated into the Computer Science syllabus.

Real-world issues – team projects and software development processes for large systems – need more attention. And given the domains in which embedded systems operate, students need to thoroughly understand the techniques for producing demonstrably correct, safe, and secure systems.

With these kinds of adaptations to Computer Science curricula, perhaps the phrase “software engineering” will no longer seem to be a contradiction in terms.

Dr. Robert Dewar is co-founder, President and CEO of AdaCore and Emeritus Professor of Computer Science at New York University. With a focus on programming language design and implementation, Dr. Dewar has been a major contributor to Ada throughout its evolution and is a principal architect of AdaCore’s GNAT Ada technology.

He has co-authored compilers for SPITBOL (SNOBOL), Realia COBOL for the PC (now marketed by Computer Associates), and Alsys Ada, and has also written several real-time operating systems for Honeywell Inc.

References
[1] Dewar, Robert and Edmond Schonberg. Computer Science Education: Where Are the Software Engineers of Tomorrow? January 2008. http://www.crosstalkonline.org/storage/issue-archives/2008/200801/200801-Dewar.pdf
[2] ISO/IEC JTC 1/SC 22/WG 23. Programming Language Vulnerabilities. grouper.ieee.org/groups/plv/
[3] Brinch Hansen, Per. Java’s Insecure Parallelism. 1999. brinch-hansen.net/papers/1999b.pdf
[4] Brooks, Frederick. The Mythical Man-Month. Addison-Wesley, 1982.
[5] Evans, David. “Safety: Safety-Proofing Software Certification.” AvionicsToday, January 2006. www.aviationtoday.com/av/commercial/Safety-Safety-Proofing-Software-Certification_703.html
[6] RTCA SC-167 / EUROCAE WG-12. DO-178B – Software Considerations in Airborne Systems and Equipment Certification. December 1992.
[7] RTCA. DO-178C – Software Considerations in Airborne Systems and Equipment Certification. December 2011.
[8] U.S. Department of Homeland Security. Cybersecurity. www.dhs.gov/files/programs/cybersecurity.shtm
[9] Common Criteria for Information Technology Security Evaluation. Part 3: Security Assurance Components. Version 3.1, September 2007. www.commoncriteriaportal.org/thecc.html
[10] Barr, Michael. Embedded Programmers Worldwide Earn Failing Grades in C and C++. www.embeddedgurus.net/barr-code/2009/11/
[11] The Joint Task Force on Computing Curricula, Association for Computing Machinery and IEEE Computer Society. Computer Science Curricula 2013 (Strawman Draft). February 2012. ai.stanford.edu/users/sahami/CS2013/strawman-draft/cs2013-strawman.pdf
