My love-hate relationship with C

Jack Ganssle - October 16, 2003

C, the most popular of all embedded languages, is an utter disaster, a bizarre hodgepodge meant to give the programmer far too much control over the computer. C++ isn't much better. The languages are designed to provide infinite flexibility, to let the developer do anything that can be done on the computer.

Don't get me wrong -- I do like programming in C. Assembly is even more fun, which proves I'm some sort of computer gearhead, more fascinated with the system's internals than with actually delivering products.

But no language should allow stupid mistakes like buffer overruns or undetected array overflows.

Geodesic claims 99% of all PC programs (most written in C and C++ of course) have memory leaks, all caused by poor use of malloc() and free(). Though these constructs are less common in the embedded world, an awful lot of firmware does use dynamic memory allocation. The language should simply not permit leaks; checks to guarantee memory integrity are essential. The cost is minimal. (Check out mem.txt at snippets.org, a simple bit of code you can link into your embedded app to detect all sorts of malloc()-related problems.)
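As a flavor of what such a check can look like -- a minimal sketch of the idea only, not the snippets.org mem.txt code -- a thin wrapper around malloc() and free() is enough to report allocations that are never released:

    /* Minimal leak-counting sketch (illustrative, not production code):
     * route all allocations through these wrappers and call
     * dbg_report() at shutdown or from a debug command. */
    #include <stdio.h>
    #include <stdlib.h>

    static long allocs_outstanding = 0;

    void *dbg_malloc(size_t n)
    {
        void *p = malloc(n);
        if (p != NULL)
            allocs_outstanding++;      /* count successful allocations */
        return p;
    }

    void dbg_free(void *p)
    {
        if (p == NULL)
            return;                    /* free(NULL) is legal; don't count it */
        allocs_outstanding--;
        free(p);
    }

    void dbg_report(void)
    {
        if (allocs_outstanding != 0)
            printf("leak check: %ld allocation(s) never freed\n",
                   allocs_outstanding);
    }

A real checker goes further -- guard bands around each block, detection of double frees and writes past the end of a buffer -- but even this much catches the garden-variety leak.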

Pointers are utterly unbounded in C. Want to dereference a null pointer? Go ahead! The language cares not a whit. Feel free to do any sort of math to any pointer. It's fun!

Here's a C hint that will improve your job security: embrace double indirection. Even better, try triple. Real programmers use quadruple. The only limit to the number of asterisks placed in front of a pointer is the size of one's cojones or how adventurous you feel.
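To make the point concrete, here's a purely illustrative function (not from any real project). Every conforming compiler accepts it without a murmur:

    /* Illustrative only: every line compiles cleanly.  The language
     * raises no objection; the run-time behavior is undefined (and
     * the readability gets worse with every asterisk). */
    int pointer_playground(void)
    {
        int *p = 0;            /* a null pointer */
        int ***deep = 0;       /* triple indirection, perfectly legal */
        int x;

        x = *p;                /* dereference NULL: C doesn't care */
        p = (int *)0x1234;     /* aim the pointer at an arbitrary address */
        p += 10000;            /* wander further with pointer arithmetic */

        return x + ***deep;    /* job security */
    }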

Exception handlers are totally optional in C. Sure, they're nice to have, but the language itself does nothing to force us either to write such handlers or to write them in a way that's likely to work.

Even something as simple as integer math produces unexpected results: 20,000 + 20,000 is (ta-da) a negative number. Is this cool or what!
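Here's the arithmetic in question, assuming a 16-bit int as on many small embedded targets (strictly speaking, signed overflow is undefined behavior; the wrap shown is simply what typical two's-complement hardware does):

    /* Sketch assuming a 16-bit int.  40,000 won't fit in a signed
     * 16-bit value, so on a typical two's-complement part the sum
     * wraps around to -25,536. */
    #include <stdio.h>

    int main(void)
    {
        int a = 20000;
        int b = 20000;
        int sum = a + b;       /* overflows: prints -25536 with a 16-bit int */

        printf("%d\n", sum);
        return 0;
    }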

C has no formatting rules. It's easy and usual to write source in astonishingly cryptic ways. Any language that allows utterly random use of the ENTER key (it's perfectly legit to hit ENTER after almost every character in C) is more an encryption tool than an aid to creating reliable and maintainable code.
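A small, perfectly legal demonstration -- the compiler treats every one of these newlines as plain whitespace:

    /* Legal C: newlines between tokens are just whitespace. */
    int
    sum
    (
    int
    a
    ,
    int
    b
    )
    {
    return
    a
    +
    b
    ;
    }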

No other language has an obfuscated code contest. Win by writing code that works but that's so convoluted no C expert can understand why. Most of the entries look like a two year old hit a few thousand random keys. And no, I'm not putting the URL of the contest here; these people are code terrorists who should be hunted down and shot.

A great programming language should encourage users to create perfect code. It must bound our options, limit our freedom, remove the degrees of freedom that lead to stupid bugs. Ada did this. It was so syntactically strict that the rule of thumb was "if you can make the damn thing compile it'll probably work." The language itself served as a straitjacket that led to reliable firmware. So of course Ada more or less failed, now barely appearing as a blip on language usage surveys.

Other options exist. The MISRA folks have a set of rules that limits use of dangerous C constructs. Cyclone is a sort of C dialect that leads to more correct code. Neither has much market presence.

C sucks. Sure it's fun and efficient. Debugging is much more of a kick than taming an angry syntax checker that insists on correctness. But it's time for the entire firmware community to either embrace a restrictive dialect of the language, or to adopt an Ada-like lingo that implicitly leads to correct code.

What do you think?

Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. He founded two companies specializing in embedded systems. Contact him at jack@ganssle.com. His website is www.ganssle.com.

Reader Feedback


I believe that it's more of a balancing act - that is, safe and correct coding. Where I work there is a great deal of thought and effort going into "getting it right," and that is great. However, making the tools "fool proof" or "safe" is rather difficult, to say the least. Ever ride in an aircraft? Consider the training of the pilots and crews in that "very dangerous" environment. Airbus set out to "make it safe" for the pilots. The tool killed a few of them. It's a tough call in general.

I personally like freedom when designing and coding. But issues of safety are of the most important nature, and preventing stupidity is important. The problem with preventing stupidity can come about when stupidity is trying to prevent stupidity. This happens quite a bit in organizations, and we all think we're smart (right?). My best bet on keeping things upside right and safe is education and training. It has to be right or you're a goner, by the way. Pilot experience speaking here. Wrong training produces incompetence and disaster.

As for the "tools" of the trade: sure would be nice to be able to feed experience into the language/compiler so that the whole thing doesn't just become some sort of legalistic bandaid. And those sorts of fixes tend to have REALLY heavy and immovable anchors. In aviation there is a rule that is quite interesting (especially coming from the FAA): in an emergency the pilot in command may break any or all of the rules in the interest of the best outcome for the pilot, passengers, and crew. The pilot may be asked to provide some documentation regarding that decision, by the way. The point is, the pilot is trained to deal with the tools and the processes (including querying ATC when they ask you to do something that is unsafe - happens). So, for now, I would rather be hard on the programmers/engineers and their training. Adjust the tools or tailor them. No straitjackets, PLEASE...

- Scott Miller


I am totally blown away! For the first time someone has spoken the truth about C for what it is: A TOY FOR TINKERING WITH THE COMPUTER!

I have only programmed in (Embedded Visual) C(++) when it was necessary. I always viewed C as a toolbox with lots of tools, many of which had the same purpose but with a slight nuance. (NIHS). I spent too much time figuring out the nuances instead of programming. What really soured me was a time-sensitive project where I increased the size of a string without increasing the size of the buffer. The system just crashed with no indication of what was wrong. It took me all day to find it. I made a resolution right there to stick with Pascal. At least it would have truncated the string and I could have spotted the trouble right away. This is especially true when you are making changes under the gun.
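The bug described above has roughly this shape (names invented for illustration; C compiles it without a word of protest):

    /* Illustrative sketch of the bug described above (made-up names).
     * The string grew but the buffer didn't, and C never says a word. */
    #include <string.h>

    char status_msg[16];       /* sized for the original, shorter text */

    void set_status(void)
    {
        /* 33 characters plus the NUL into a 16-byte buffer: silently
         * tramples whatever lives next to status_msg, and the system
         * falls over some time later with no hint why. */
        strcpy(status_msg, "Connection established to gateway");
    }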

Thanks, Jack, for vindicating me in all those discussions with C GURUs.

Let the boys have their C toys. But as for me, a professional programmer, give me Ada or Pascal so I can get the job done in a timely manner and maintain the code for years to come.

- Frank Putnam, Jr.


Disliking C because it doesn't do enough to prevent errors is like disliking English because it allows you to say stupid things at parties or hurt your spouse's feelings. And by the way, having your Java program try to limp along after yet another "Null pointer exception" has trashed some key data structures isn't really doing your system any favors. Don't blame the language. The problem is a software development culture that isn't obsessed with objective risk assessment and mitigation. I've seen way too many coding standards docs for C, C++, Java, assembler, whatever, that focus on things that have nothing to do with real risks that need to be addressed in the software's development, testing and operation.

- Presley Barker


At last! Someone who sings from the same hymn sheet as me. I only ever use C if there is no viable alternative language available for the processor in question. My preference is for Pascal/Delphi, alas not available on most microprocessors. Even so, when I use C, I try to keep to Pascal-style formatting and language constructs. C should have been strangled at birth, and certainly never released into the wild.

- Andy Syms


It gets lonely sometimes, programming in Ada that is. It's refreshing to see a 'C' freak like yourself give Ada some credit. Thank you.

- William J. Thomas


What is going on with Ada 95? I talk to people who use it and they seem to like it. The GNAT freeware compiler supposedly works. I haven't seen an article on Ada for years! Have you come across projects/companies that use Ada outside the aerospace industry? Maybe Ada-based projects don't require calling in outside help to save the day.

- James Munn


Thank you so much for that wonderfully accurate article "My Love-Hate relationship with C". I was weaned on computers through the use of Pascal and its beautiful set of structured programming requirements, bounds and limits. Throughout my succeeding years I've been forced to use C occasionally and hated every minute of it. In the early 90's I worked for Raytheon and wrote programs in Ada. That was a pleasant joy. I wish those who think that C is the 'only' programming language would take off their blinders. I'm tired of trying to debug other people's C code.

- Russel Buckley


Well, I think this is the first time I've read one of your columns and didn't totally agree with your point! While no one can deny that C allows you to shoot yourself in the foot, does that mean it sucks? No way!

I built a deck in my backyard and used several very essential tools, each of which could have done me a lot of damage had I used it improperly. Does that mean those tools suck?

A police officer carries a gun, which he could potentially use to shoot his partner. Does that mean the gun sucks?

My wife goes to the grocery store and buys a few cans of soup. The soup can doesn't prevent her from throwing it at another shopper does it? Does that mean the soup can sucks?

Millions of folk drive to work everyday, in vehicles that could be used as weapons. Does this mean that cars suck?

Just because something is designed in such a way that it implies that the user will take on some of the responsibility doesn't mean that the original design is bad. I don't disagree with any of your points regarding some of the improvements that could be made on the language, but that doesn't mean that the language, which has essentially created our industry, sucks!

I especially hate browsing source code and seeing a variety of different "styles", or in many cases a total lack of style. This makes me crazy, but I admit that I probably do it sometimes myself, especially with older (pre-ANSI) source code that I've migrated into newer projects. I take responsibility for that needed cleanup; it's not C's fault!

Regarding other runtime checking, yeah, it's nice to say that we need it now, but back in the days when CPU frequencies were under 10MHz, it was good to know that the only runtime checking that was done was what I put in the code.

Anyway, I just find it to be a bit harsh to say that "C sucks".

As always... Fun stuff!

- Ed Sutter


I just finished reading with great interest your article "My love-hate relationship with C." I share your amazement at the flexibility without bounds that this language offers. My first experience with C came when I changed jobs in 1991, moving to an early networking company from a defense contractor where my last project had been coded in Ada.

The differences between Ada and C were staggering. With Ada, there were rules, lots of rules. And most of the rules were good rules. Let's face it, we're all human, and we make mistakes. It's a lot cheaper to let the compiler catch the mistakes early on rather than catching them during code review, unit testing, integration, or beta testing. And Ada made you modularize your code, and made you publish clean APIs. And it allowed you to publish your APIs so that others could write code to those APIs. And others could write their own stub implementations of the APIs to test their code. In the article you commented "if you can make the damn thing compile it'll probably work." That was my experience. It took a lot longer to code in Ada, but the backend time for testing and integration was significantly reduced. And it's not like Ada didn't allow you to do real embedded code. After all, I was writing device drivers and interrupt service routines that used both I/O ports and memory-mapped I/O, so I wasn't handcuffed from that point of view. In fact, I'd say the only negative was that there were only a handful of compiler manufacturers, and the code that was produced was "bloated," and as a result, ran slower than expected (compared against an earlier, similar project coded in PL/M). But that wasn't a fault of the language per se, just a need for the support tools to mature.

When I started to code in C, I was definitely surprised at the lack of limits on what one could do. Lack of array limits, pointer dereferencing and casting, assignments and arithmetic using mixed types, and functions without prototypes were just a few of the features/bugs that were part of the proverbial rope that was available for me to hang myself with. I remember looking at my first piece of code which I had to modify, code which contained a function call via an array of structs which had a function pointer field, and thinking C, like APL, sure looks like a write-only language because it sure was hard to understand what the original coder was trying to do. And that's from someone who prior to coding in Ada was writing assembly language programs using the Fairchild 9445 extended-Nova instruction set (ah, memories; nothing like an instruction set which allows you to shift, perform arithmetic, and branch all in one instruction -- but I digress).
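For readers who haven't met it, the construct described -- a call made through a function-pointer field in an array of structs -- looks roughly like this (hypothetical names, for illustration only):

    /* Illustrative command-dispatch table: hypothetical names, but the
     * shape is the classic one -- an array of structs, each holding a
     * function pointer, with the call made through the table. */
    #include <stdio.h>

    struct command {
        const char *name;
        int (*handler)(int arg);               /* the function-pointer field */
    };

    static int do_start(int arg) { printf("start %d\n", arg); return 0; }
    static int do_stop(int arg)  { printf("stop %d\n", arg);  return 0; }

    static struct command cmd_table[] = {
        { "start", do_start },
        { "stop",  do_stop  },
    };

    int dispatch(int index, int arg)
    {
        return cmd_table[index].handler(arg);  /* the call in question */
    }

Perfectly legal, quite useful, and famously hard to read the first time you meet it.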

I'm simply amazed at how difficult it is to write bug-free C, and how easily the most sinister defects can creep into the code. But I guess that for those of us who have a track record of delivering high quality code, an employment opportunity is always out there!

Thanks for your insightful articles!

- Jeff Johnson


I agree with all your comments, but you need something powerful (as powerful as assembly) that still allows you to create maintainable code. C has both qualities; Linux demonstrates this, since an entire OS runs even though it was developed by different programmers across the world. It is simple to use - or to shoot yourself with - so just take care of yourself! You are the best person to do that, better than anything else!!!!!

- Anne Ajaya


Stop whining! If you like C, learn to write good code, and/or build around the primitives that it provides. If you don't want to do that, then you might as well have someone else do your programming for you, 'cause you're never gonna be satisfied.

If you don't like C, use something else.

- Rohit Patil


If I could get Ada code to run on a Z8 or AVR, I'd love to use it.

Cyclone is hampered by the bizarre licensing of AT&T (whoever they might be today) [might have changed licensing by now, I hope].

I'm going for Esterel myself; it will run on the AVR, and I'll know if it runs on the Z8 shortly. If you can do an Airbus plane with it, you should be able to do most embedded projects with it.

An open source version of Esterel is being worked on.

Check out the mess they made of the 'safe' language R++ under Software Patents at http://www.softwaresafety.net/ .

- Bob Paddock


Jack, I don't often agree with what you say, but on this point I agree completely. I love 'C', but complex systems (systems over 100K lines) should not be written in this language (except by the absolute creme of implementors, led by the creme of designers). Ada would be best, but as you state not practical due to the lack of popularity.

Java is the only commercially viable choice for something with at least 'C's more glaring deficiencies eliminated.

Describing a compiler with strong checking as a straitjacket is inappropriate (as it implies that it reduces your ability to function); an engineered programming language is more like a Volvo: it doesn't impinge on your effectiveness - it gets you anywhere you want to go - but Mario Andretti would probably choose to drive something a little less safe (and he could pull it off without any problem).

Most programmers know this, the only problem is everyone thinks they are Mario Andretti (just like 80% of Americans think they have an above average income).

- Rennie Allen


Although I agree with your disposition on C, Ada (83) never promised or delivered implicitly correct code. There were fun ways to abuse it, such as UNCHECKED_CONVERSION. And the standard's lack of built-in string handling functions and a decent math library were simply unacceptable. The strong points of Ada were the strict language syntax, the implicit "make" that was necessitated by this strictness, and its few object-oriented features. The rendezvous concept was incomplete and most often a 3rd-party RTOS had to be used.

The language certainly was a step in the right direction, though. For a while, I preferred coding in no other language. I used Ada from the get-go. My first Ada application in 1986 ran on an Intel 80186 and utilized what should have been Softech's ALS (Ada Language System) for the U.S. Army. What ultimately killed Ada was not necessarily its lack of acceptance by the programming community at-large, but rather its lateness to market. I mean, it seemed like a good idea. That is, the Govvie driving the standard and making it so strict that you couldn't even call it an "Ada compiler" unless it passed a rigorous battery of validation tests. The language had flaws, and compiler vendors found them all. The Army ultimately abandoned the ALS contract with Softech because it just kept dragging on and on, always late and over budget. What we used was an unfinished product that produced very buggy code.

Sure, in time there were decent Ada compilers. I can recall a few: VAX Ada 3.0 (my fave), Meridian, and the one I used the most, Tartan. The VAX compiler was my favorite because of its "smart recompile" feature. It actually determined if what had changed in a package was actually referenced by others dependent on it and did not recompile the dependent packages if there was no need. The Tartan compilers, which I used for MIL-STD-1750A and TI DSP development, were okay but we opted for VRTX or some other 3rd-party RTOS over the rendezvous. The arrival of good compilers and development systems simply came too late, and the defense programs committed to using it pressed on and dealt with the pain.

Ultimately, the Govt. attempted to both cleanse the language and move it fully into the Object-Oriented world with Ada95. Besides being a tremendous improvement to the language, this meant an even more complex underbelly (run-time system) and pointed out some of the flaws in Ada83. In essence, the introduction of Ada95 was an admission that Ada83 contained flaws, which seemed to overshadow its new features. It wasn't enough that the target was so hard to hit the first time around, now there were even more rules. The reaction to the release of the Ada95 standard was underwhelming.

The final blow to Ada was the lifting of the Govt. mandate for the DoD and other Govt. agencies to use the language. I am sure what caused the mass exodus off the bandwagon was the relief that defense programs would no longer have to fund the development of the language or suffer delays because of problems with development tools. In a nutshell, time and money did it in. And although the trend of producing somewhat bug-laden software products prevailed, at least they were being delivered on time and within budget again.

I think if the professional standards community had authored MIL-STD-1815A (not by that title, of course) and the commercial industry had been the push behind it, taking away the strict requirement for validation before use on a defense program, Ada may very well have been alive and prevalent in the U.S. software market today. And I don't mean just the defense industry. I'd love to see it come back into vogue. Certainly, the Europeans have embraced it, particularly in their space industry (http://www.estec.esa.nl/wmwww/EME/compilers.htm).

It is possible and quite easy to write good C and C++ code. I do it all the time. But we have to rely on company policies, standards, and processes to ensure it. Discipline has always been a facet of engineering, and C demands it. Companies that let their programmers and software engineers write lousy code are likely spending too much money on software development, particularly rework. But that's their choice (or ignorance).

Ada was the greatest embedded systems language ever conceived. I miss it.

- Bruce Scherzinger


Just read your article and I couldn't agree more. I'm a hardware designer who also happens to really enjoy programming what I design. I like C too, but I'm toying with the idea of using Ada next time I have to write embedded diagnostic code for hardware bring-up and test.

Ada was designed from the ground up as a language for software **engineering** rather than hacking; maybe that's why Ada is far less popular. Ada certainly has a steeper learning curve and development tools are a little harder to find.

By the way, I like VHDL versus Verilog for a similar reason: VHDL is a hardware engineering language. VHDL is to Ada as Verilog is to C.

- Keith Outwater



For those wanting to find out about the current state of play in the Ada world, here are a couple of sites:

www.adapower.com & www.adaworld.com - general interest sites;
libre.act-europe.fr - you can download the public version of the GNAT compiler from here;
www.usafa.af.mil/dfcs/bios/mcc_html/adagide.html - a simple, free IDE for the GNAT compiler;
www.adaic.org - the 'official' Ada home :-);
comp.lang.ada - still up and chatting!

To the person going to use a Z8 or AVR, check if there is a 'gcc' compiler for them. If there is then GNAT is a potential Ada compiler.

For the person wondering who is using Ada outside the military here are a couple of the 'cooler' examples:

1. Philips remotely control, from Holland, their Far-Eastern chip fabrication plants using Ada;
2. Check out the Beagle 2 Mars probe - it's pretty much all Ada;
3. If you have digital TV in Europe, there's a fair chance it is transmitted using Ada.

- Martin Dowie


Was this article written because this was an assignment - "to write an article trashing 'C' Programming Language"?

Almost all of the issues raised in this article are the issues of the lack of discipline on part of the programmer, lack of coding guidelines, lack of coding / programming standards etc.

Any programming language is just like any other tool. It enables us engineers to implement the design. It may not be appropriate to blame the tool for errors that are human in nature. Any programming language, like all tools, is only as good or as bad as the people using it. We may make a case for tool improvement but should not "hate" the tool.

We should appreciate the fact that 'C' allows the Programmer to be free and be creative. This does not give the programmer a license to make mistakes and then blame the tool.

The response by Ed Sutter said it best; I fully agree with his views. Good response, Ed.

- Vivek Jain


I'm presently working on two projects: one is firmware for a low-end 16-bit microcontroller in C, the other is for a Windows 2000 target using Visual Studio .NET. This is my first .NET project, and I find the environment with its strict syntax checking and automatic formatting to be, as you say: "restrictive", but I also find it to be almost Ada-like in that by the time I get it to compile, it usually also works! (.NET takes this one step further in that the syntax checking is going on while you type. If you can get rid of all the little squiggly underlines while you type, it'll usually compile!)

However, I also find the .NET environment to be very helpful in offering context-sensitive help and popping up lists of syntactically correct choices at any given point in one's typing. It's almost like having that second programmer looking over your shoulder in XP's "pair" programming.

.NET has this thing called managed code with automatic garbage collection which watches out for your array accesses and freed objects and so forth. I can be a lot lazier because the environment does so much of the work for me! The only problem is most embedded systems can't afford the burden of this overhead.

Nevertheless, I feel there is a lot more that could be done at the level of the development environment where the power *is* available, without imposing much if any burden on the target. What if the manufacturers of these lint tools and memory-leak detectors and compilers for small target MCUs got together and integrated these tools into a .NET-like IDE that watched over you as you typed and helped "restrict" you to good programming practice? Might as well put that 2GHz CPU with 512MB RAM and 80GB hard drive to good use rather than idle for a few hundred million cycles waiting for your next keystroke!

- Brad Peeters


C++ is the light-sabre of programming languages. If you don't know what you are doing, you will cut off your own arms and legs. This inherent danger isn't sufficient reason to keep the powerful tools out of the hands of the masters.

- Bob Wise


As a professional software engineer, I think that Mr. Ganssle's rant is a bit immature and I pity the folks that take his seminars on embedded code. If he doesn't want to take responsibility for anything he does in code then he should go work for the Microsoft virus-writers. C's much-maligned features--such as no bounds checking on arrays, and being able to dereference any pointer--are useful mechanisms, and limiting them would be a loss. His whiny complaints are akin to wanting to take away all cars because some moronic drunk might attempt to drive one. Not all tools are made for the same job: want to slap together a simple GUI? Use VB. Need an application to run across many platforms? Use Java. Need to get down to the silicone and actually make a computer dance? Use assembly. That can eat many man-hours though; the next best thing is C (or a mix of C and assembly). The right tool for the right job. I have used some of the features he complains about in firmware I have written. Sure, pass a function a pointer to a chunk of memory and the function can overlay any struct it wants. It's called information hiding. People rave about C++, but the truth is, anything that can be done in C++ can be done just as well in C.
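The idiom described -- handing a function a raw pointer and letting it overlay whatever struct it needs -- looks roughly like this (hypothetical types; real code must also worry about alignment and byte order):

    /* Illustrative sketch with made-up types: the caller hands over a
     * raw pointer, and only this module knows the layout behind it. */
    #include <stdint.h>

    struct packet_header {
        uint8_t  type;
        uint8_t  flags;
        uint16_t length;
    };

    uint16_t packet_length(const void *raw)
    {
        const struct packet_header *hdr = (const struct packet_header *)raw;
        return hdr->length;
    }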

- Troy McVay

Jack's response: ICs are made of silicon; silicone is used for breast implants...


Jack, if you think C is bad, you obviously have never seen Perl. Perl does indeed have an obfuscation contest, and it is much worse than the C obfuscation contests ever were!

As the saying goes, perl gives you enough rope to shoot yourself in the foot....

- Buddy Smith

Jack's response: Buddy, good point. I do use Perl on occasion. And it is indeed worse than C. I usually use Perl when I have only a vague idea how to get something done, and I'm fiddling with regex's and all. Hacking.


Oh boy, I hope your mailbox is big enough. I am sure you will get tons of mail on this subject. The only topic that might bring in even more mail would be "C vs. Assembly Language", which this topic could easily turn into after some discussions :-)

I also have a love-hate relationship with C: I hate the way it "lets" me make mistakes, but I love its efficiency - it writes code in assembly language just about like I would do it. I really cannot fault a compiler for "letting" me make mistakes. If I am in good control of my code, there should not be any mistakes. I should design it properly such that it doesn't do all the horrible things we hear about with C. Having said that, it would be nice to have a more strict C to keep me in check, as long as it did not sacrifice efficiency, and still promoted readability.

To help with code development, I like to use Lint. It cannot catch some run-time problems like array bounds or wild pointers, but it does help with the more mundane things like "if" constructs. It REALLY bugs me when C lets me write "if (a = 5)" when I really meant to write "if (a == 5)". It bugs me because logically the statement is always true, and any good compiler should at least issue a warning. I always say Lint is a super compiler that doesn't produce any code. It can be the most annoying thing you have ever seen, or can be your best tool, depending on how it is allowed to run.
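For the record, the slip in question looks like this (both forms are legal C; a compiler warning or Lint is the usual safety net):

    /* The classic slip: assignment where a comparison was meant. */
    #include <stdio.h>

    int main(void)
    {
        int a = 3;

        if (a = 5)             /* assigns 5 to a, then tests 5: always true */
            printf("always taken\n");

        if (a == 5)            /* the comparison that was intended */
            printf("taken only when a is 5\n");

        return 0;
    }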

Maybe the problem with C is in its origin. The original writers of Unix may have liked it because they could more quickly code an operating system. Who cared about some strange things it let you get away with - it was a heck of a lot faster than assembly language. And I would add (see, here goes the C vs. assembly language thing again) that C definitely makes it easier to code, with fewer things to worry about and get into trouble with, than assembly language. Whatever horrible things you can get away with in C are even easier to get away with in assembly language. But maybe for all the applications C is used for today, it really should be more strict, not letting the programmer get away with so many terrible practices.

So I would have to conclude that I like C, but would like to see it refined, as long as the improvements were not at the expense of code readability, code size, or execution speed. Well, you asked for it. Thanks for the article Jack.

- Lou Calkins


It may interest your readers to note that while some checks that Ada includes have to be done at runtime, loads of them can be made at compile time, and the intent is that for Ada0Y, more will move from runtime to compile time as compiler techniques improve.

The result? Fast, small code.

There aren't the same Ada83-style bloatware compilers out there anymore. Ada95 compilers produce excellent code quality, comparable to any other efficient language.

The people at ACT (who sell/maintain GNAT) have examples of how Ada programs can produce *identical* object code to the equivalent C program.

- Martin Dowie

