Software MUST be safe and secure

While 2008-9 seems like a fast-receding bad dream, there is the occasional flashback. Some events and shows are still reporting lower attendance figures, but most seem to be back to 2007 levels, and the industry seems to be moving again, with projects starting or at least picking up.

This turnaround must pose an interesting dilemma for some. I understand that some silicon companies have order books full to the end of the year unless they open (or re-open) their mothballed fabs, which is an expensive decision either way.

It's gotten so bad that people are actually counterfeiting MCU chips! SparkFun in the US managed to buy a whole reel of fake Atmel AVRs from a Far East supplier, and it appears from other comments that this is not an isolated incident. As I've said so many times before, use trusted suppliers, and if it seems too good to be true, it probably is. Nothing is free in this life.

Another story highlights some of the problems of enforcing software licences. In this case, the author of the software has said, “All current development of AKRip has been discontinued due to increasing amounts of license violations.” This was LGPL code, so it seems that whilst many are using (L)GPL code, they are not adhering to the licence terms. For some, it's simply because they “know” that open source means “free” and, like everyone else, they don't bother to read the licence. (How many of you have read the EULA for the software on the machine you are using now?) However, I think many cases are intentional rather than a simple misunderstanding. Enough has been written about the GPL V2/V3 licences in the technical press for no one to claim ignorance, even without the licence that comes with the software. If you want to use software, you must abide by its licence. How much stricter enforcement of the GPL licences will affect the use of open source in the market remains to be seen. For many commercial users, the GPL can be more restrictive than most commercial licences.

Looking for other dubious activities, I came across a story about hacking automotive systems. While this has been going on for a while (e.g. “chipping”), the concern now is that, with so many networks in a car, it's possible to change most of the car's parameters. And with RF links, this can be accomplished from outside the vehicle.

At one time, viruses and hacking were confined to large mainframes, and any networking was handled by telephone lines and some leased lines. Then things moved to PCs which, whilst time-consuming to deal with, usually didn't have any physical effect in the real world. In those days, embedded systems were immune: they were black boxes, running bespoke systems that were not usually networked to anything else.

Now, large numbers of embedded systems are networked, or at least communicate with the outside world, over standard interfaces such as TCP/IP, USB, and CAN bus. This makes it easier to get a connection and start hacking into the equipment, often from somewhere remote. So embedded engineers need to work not only on reliability and safety, but on security too.

When the ISO WG23 Vulnerabilities panel started some years ago, it looked into language vulnerabilities; the US was focusing on security, largely because post-9/11 funding was easier to get for anything with the word “security” in it. The rest of the world was looking at it more from the safety and reliability angle. Whilst there is a huge overlap, there are some differences. One is more a case of: anything that can go wrong will go wrong. The other is: if someone can intentionally exploit a weakness, they will. We are not just talking about buffer overflows.
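
To make the distinction concrete, here is a minimal C sketch (the frame handler, field names, and sizes are made up purely for illustration). The same unchecked copy is a reliability bug if an oversized frame ever arrives by accident, and a security hole if someone crafts one deliberately; the checked version simply refuses to trust its input.

#include <stdio.h>
#include <string.h>

#define ID_LEN 16   /* illustrative field size */

/* Unsafe: any frame longer than ID_LEN-1 bytes overruns 'id'.
   By accident that is a crash; on purpose it is an exploit. */
static void handle_frame_unsafe(const char *frame)
{
    char id[ID_LEN];
    strcpy(id, frame);                 /* no length check */
    printf("frame from %s\n", id);
}

/* Safer: treat the input as hostile and reject oversized frames. */
static int handle_frame_checked(const char *frame)
{
    char id[ID_LEN];
    if (strlen(frame) >= sizeof id)
        return -1;                     /* refuse rather than overflow */
    strcpy(id, frame);
    printf("frame from %s\n", id);
    return 0;
}

int main(void)
{
    (void)handle_frame_unsafe;         /* shown only for contrast, never called */
    handle_frame_checked("node42");
    if (handle_frame_checked("a-deliberately-oversized-identifier") != 0)
        puts("oversized frame rejected");
    return 0;
}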

There was a bit of a scare along these lines recently when it became public that Google, whilst taking the photos for its street-level mapping, had also been picking up WiFi traffic. They said that they were only collecting SSIDs and MAC addresses, not data. Now it transpires that they had been picking up data from unprotected networks.

I can't understand why this was even a story. Surely everyone by now uses strong encryption, doesn't broadcast an SSID if they don't need to, and turns off their WiFi when not in use. Last year, I suggested that offices set their WiFi to switch off outside office hours, that home users turn theirs off when everyone is out at work, and that everyone uses manual addressing with strict MAC binding rather than DHCP.

Going back to the point about doing it right, take note of Jack Ganssle's column on this site, which looks at people's favourite tools. It's worth musing that after the compiler come the debugger, then the oscilloscope, IDE, JTAG, logic analyser, and ICE. At the bottom of the list are software testing tools, UML/graphic design tools, and static analysers. This is frightening!! The favourite tools are the “bug hunters”, not the tools that stop problems before they become bugs.

UML tools usually have rule checkers in them to prove the design. Static analysers remove many problems before compiling. Software testing tools and system testers can also prove the system; used properly, they can quickly isolate any remaining problems, especially if most have already been removed by “running” the UML (or other) models and by good static analysis. I must admit that I have a soft spot for ICEs and emulators (less so for the restricted JTAG/BDM debuggers), as in many cases you can use them with unit and system test tools, and often standalone with their own (usually C-like) control language, to test in real time on the hardware.
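
To give a feel for what good static analysis buys you, here is a small, purely illustrative C fragment. An analyser reports the two defects in the “before” function at analysis time, without running a line of code, whereas the favourite “bug hunters” only meet them after the code has already misbehaved on the target.

#include <stdio.h>

#define N_SENSORS 8   /* illustrative */

/* Deliberately defective "before" version, never called below.
   A static analyser flags both problems without executing anything:
   'sum' is used uninitialised, and the loop reads one past the array. */
static int average_reading_defective(const int readings[N_SENSORS])
{
    int sum;
    for (int i = 0; i <= N_SENSORS; i++)
        sum += readings[i];
    return sum / N_SENSORS;
}

/* Corrected version: initialised accumulator, correct loop bound. */
static int average_reading(const int readings[N_SENSORS])
{
    int sum = 0;
    for (int i = 0; i < N_SENSORS; i++)
        sum += readings[i];
    return sum / N_SENSORS;
}

int main(void)
{
    const int readings[N_SENSORS] = { 10, 12, 11, 9, 10, 13, 12, 11 };
    (void)average_reading_defective;   /* kept only to show what gets flagged */
    printf("average = %d\n", average_reading(readings));
    return 0;
}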

If we are going to build software with one eye on security and another on reliability and safety, then we need to build it correctly, not just throw it together, bug hunt, and hope we are better at it than lady luck and “the enemy.” This points me back to the many times I have written about licensing and standards, in “Standards: Licensed Engineers” and “Standards: Who cares”.

Note that in 2006, the US FDA's Center for Devices and Radiological Health (CDRH) reported that 21% of all medical device recalls were for software defects. It is also estimated that one in three software-based products is recalled.

There is now a bill before the US Senate that would require senior officers or directors of drug and medical device companies to certify, under penalty of perjury, that all information submitted for a product's approval is accurate and in compliance with federal regulations. Product applications later found to have contained false or misleading information would attract stiff fines (up to $5,000,000), levied on both the companies and their senior officers, who could also face jail sentences of up to 20 years.

That looks very similar to the UK Corporate Manslaughter Act, which came into force two years ago; it requires a duty of care by a company to make sure things are done properly, and imposes fines and gaol sentences on directors and senior managers. It covers manslaughter as opposed to murder, and unlike the US bill, which is limited to drugs and medical devices, it applies to any involuntary death. So if, for example, a death were caused by a car accelerating or failing to stop because of a software bug, the manufacturer and individual directors could be taken to court and imprisoned for manslaughter, rather than the company simply paying a fine.

It's high time that software engineering grew up and started engineering properly instead of playing about like a bunch of undergraduates. Software directly affects far too many lives in significant and important ways not to get it right.
