When will we get serious about writing great code? Crummy development techniques are the norm, resulting in far too many products that just don't work right.
I've collected embedded disaster stories for years, hoping that sharing them will scare developers into changing their practices. But each disaster is quickly forgotten. Sure, people die when an X-ray machine goes bonkers or a plane crashes. Within two or three days the story is relegated to page 5; a week later it fades from memory.
So it seems nothing will change till a truly horrific software-induced failure occurs. What would that be? Maybe a firmware bug that stalls ten million vehicles on the highways. Collapse of the banking system. A month-long nationwide power outage.
Or accidental sobriety.
According to a story from Jackson, Mississippi, entitled “Glitch Forces Liquor Distribution Center To Shut Down” and published on thejacksonchannel.com, computer problems in Mississippi have shut down the entire state's booze distribution system. Liquor stores reportedly cannot replenish their supplies until the computer glitch gets resolved, which might take weeks. (The article is no longer available on that site, but if you Google the title, you can access the cached version.)
Surely most Mississippians have at least a few extra bottles stashed away for emergencies, but as this tragedy continues to unfold those won't last long. Mississippians will sober up.
That can't be good news for the government. Suddenly citizens will realize their kids aren't being educated, taxes are too high (especially on liquor), and nattering politicians pepper the TV with promises of the impossible while pandering to anyone with lots of cash.
Count on a revolution. Locals will take to the streets in demonstrations while entrepreneurs set up stills to fulfill demand. Liquor tax revenues will plummet, endangering the government's stability.
The boys in Washington are no fools. They'll mobilize the National Guard to airlift in emergency supplies. Watch software consultants' fees skyrocket after the President declares a state of emergency; no price will be too high to get the state's computers and booze supplies back to normal.
It'll be the start of a software revolution as well. The anarchy engendered by the buggy software will find both Republicans and Democrats vying for more restrictive development techniques. That won't be enough to satisfy angry (and thirsty) voters. This catastrophe will be a catalyst for new political parties. The Agiles versus Big Up Fronts. XPers against Feature Driven Designers. A code inspection in every product and a six-pack in each garage will be the new promise made to an electorate suddenly concerned about software engineering.
I'm looking forward to televised debates. Forget tax and spend arguments. It'll be use-cases or design by contract, C# vs. Java, and managing legacy code instead of managed health care for seniors.
You can count on one thing: with all of these techies running, the vote count will be right.
Fallout from my disparaging remarks about C
On another topic, I recently wrote about my misgivings with the C language. That article swamped my inbox with email, some of which is posted with the piece itself.
I note three patterns in the replies. The first group wrote: “right on!”
A second consists of people who have used Ada. They were unanimous in praising the strengths of that language. Without exception, all who wrote felt Ada's onerous constraints gave them better code. Generally I expect a range of opinions on anything to do with developing embedded systems (put two programmers in a room and you'll get three strong opinions). But the Ada contingent all sang the same song. In all my years in this industry I've never seen so many people united about anything.
I have never used Ada on a real project, but this unified front should make us all think.
The third group disagrees, some vehemently, mostly making the point that the language itself is not at fault. The fact that C gives us flexibility that programmers abuse condemns the developers, not the language. Hey, I totally agree! There's no question that lousy code stems from shoddy work. But the fact is, few embedded developers are at all religious about employing any practice that results in safer C. Most don't even use Lint; an astonishing number avoid version control systems and practices. So to those who feel C is the über-language: tell us what you do to ensure your firmware is correct. I'll post the interesting replies.
Jack G. Ganssle is a lecturer and consultant on embedded development issues. He is conducting a seminar about building better embedded systems on December 5. Contact him at . His website is .
Followup: a friend just sent this web site: The Top Ten Ways to Get Sc***** by the C Programming Language, by Dave Dyer. Interesting reading!
Count me in on the second group – I have used Ada, and I agree wholeheartedly with your comment to the effect that “If you can get the (expletive deleted) thing to compile, it'll probably run.” The compiler is far beyond persnickety – by design – and must be to pass the AJPO validation suite. Actually, Ada was developed to solve the military's programming problem – weapons programs, and consequently the computer programs that drive them, last for 20 to 30 years, and programmers turn over every three. Consequently, most programming in that environment consists of reworking someone else's code, which is not my idea of recreational activity in any language. With a language like C, it becomes an angina gluteus maximus in a very short time.
Ada provides a structure that will read almost like English if the style book is followed. No obfuscation allowed. It's easy to see what an Ada program does if it has been written using anything close to the style book rules. I consider it the only proper programming language for embedded systems that are safety related. It's the way to go when you are writing code for your successor to modify.
I wrote an embedded application in an Ada-83 subset language that has been in service since 1988 with no problems and no updates required. The development system has long since failed (anybody have any 8 inch disk drives?), but the target machines are still running 24/7.
– Phil Spray
Your latest piece on embedded.com about writing crummy code struck home for me. I have been writing software for a living since 1976, working primarily on real-time and embedded systems since 1981. I have had 9 different employers in that time period and have spent I don't know how many man-years reverse engineering badly written, undocumented software just so that I could get a set of requirements in preparation for replacing it. I had to do it with 25,000 lines of buggy Z80 assembler language in 1985 so that I could make a medical instrument work properly and then move the application to a different computing platform. I had to do it with 10,000 lines of C in 1990 so that I could re-design and rewrite part of a network file system to make it work properly. I have done it again and again since then and am currently reverse engineering 100,000 lines of error-ridden, undocumented, byzantine C code in preparation for moving the application onto a new hardware platform.
What I see here is that we have many people in the software business who have no training in software development processes or software verification. They delight in the clever hack and in the instant gratification that software development can provide, but they have no training in writing testable requirements, designing test cases, conducting code inspections or design reviews, or implementing software configuration management. Many developers are self-taught with no formal training in computer science or software engineering. Project managers are promoted out of the developer ranks with no training in project planning, management, metrics, or quality engineering. Even when we have advanced degrees, the degree programs usually don't include coursework in system verification and validation. The fellow who wrote that awful Z80 code all those years ago had a Ph.D. in physics and an M.S. in computer science. He could expound on how to design a compiler but couldn't tell you how to build and verify a simple application program.
We write crummy code because we don't know any better. Most of the time we don't even know we have a problem and accept the poor quality as the normal state of affairs. After all, if Microsoft with all of its resources still can't produce a secure version of Windows, then buggy systems must be inevitable.
A large part of the crummy code problem is one of training, perception, and misconceptions. I don't know if there is a solution in the short term. I have been harping about software quality at my various employers for years, but my entreaties played to indifferent ears most of that time.
Keep on complaining about crummy code. It can't hurt and it may wake some of us up.
– Steve Hanka
My first contact with C was in 1987, through a magazine supplement written by a colleague of mine. That booklet started with words that still shine in my memory: “C is a very dirty language that allows negligent programmers to create a mess that is very difficult to resolve. Think for a moment about triple or quadruple pointers.”
My second contact was the (in)famous book by Kernighan and Ritchie. First edition – the one that starts with:
“1.1 Getting Started: The only way to learn a new programming language is by writing programs in it.”
(Funny, I have never seen a book that says: “The only way to learn how to build a bridge is by making one.”)
That was followed by the “Hello world” program, which in the first edition did not have “#include <stdio.h>” as its first line.
At that moment, I started to be frustrated with C, and anything else related to C, and that frustration grew so much that I became an expert in C-code archeology and in debugging other people's legacy code. Is that fun? Would you ask a coroner if he has a fun job?
– Vladimir Pavlovic
I'm with the second group, kind of. I'm definitely a believer in Ada, but I wouldn't say that “Ada's onerous constraints gave them better code”. You can do essentially anything you want in Ada (including unchecked type conversions), but you have to make a conscious effort to do something potentially dangerous (by using specific keywords), whereas in C that happens all by itself. Like carpenters say, you need the right tool for the right job. A hatchet can cut most anything with enough whacks, but a saw will give a cleaner cut. Is that more onerous? (Just to be clear on the analogy, C is the hatchet job, so to speak :- )
– Rob Neff