C is a very expressive language. We can make working code that's ugly, pretty, silly, or just cute. That's not always a good idea, though. For example, take the following construct:
while (42)
	{
	...
	ptr = 0;
	...
	}
It's part of a putchar() function provided by the compiler vendor that I found this week while playing with a cool little 16-bit microcontroller. There were no comments, of course.
The initial while statement threw me for a minute. Why did it take an argument of 42? Obviously 42 is non-zero, so the code is nothing more than an infinite loop. But why 42? Was it a magic number? Was there some meaning to this that eluded me? Perhaps a bifocal-eluding small “i” prefixed the number, turning this into a simple variable.
Eventually the significance sank in. The mice in Douglas Adams' Hitchhiker's Guide to the Galaxy built a grand computer that pondered the question “what is the answer to life, the universe, and everything” for seven and a half million years before answering: “42.” That answer meant as little to the mice as it does to us, but it kept Zaphod Beeblebrox, Arthur Dent, and Ford Prefect tumbling into one misadventure after another, while Marvin the depressed robot whined about everything.
So, apparently the person who coded this while loop selected 42 to be cute. It's non-zero and will keep the loop running forever. But it tripped me up, so it was a bit of cuteness that effectively reduced the code's readability.
The deluded programmer could have used a symbolic notation; it is, after all, generally a Bad Idea to embed numeric constants in the code. Maybe something like:
#define meaning_of_life_etc 42
buried in a header file somewhere. That's a lot more disciplined. If the meaning of life, the universe, and everything were to change, a quick edit of the header file would fix the program everywhere.
But this is even foggier than the original version. Now we don't know anything about meaning_of_life_etc without searching through dozens of .h files. Is it a constant? Global? Does an ISR change it invisibly in the background?
This reminds me of the time my college assembly-language instructor just about blew a gasket when I submitted a card deck (this was a long time ago) that looked like:
Read input data1
Read input data2
Compute data1 times data2 to result
The assignment was to read two inputs, multiply them and print the result. In Univac 1108 assembly language, which of course this code doesn't resemble at all. But that machine had a very powerful macro preprocessor, so, just to be ornery I defined the above constructs as macros. Was it valid assembly? Maybe in some technical sense, but by getting very cute I submitted code that wasn't standard, wasn't maintainable, and that no other programmer could work on.
Some years ago a friend who owned a compiler company told me about an odd experience he had had. One day three burly guys draped in suitcoats showed up. In a Kafka-esque scene they demanded that the company remove a word used as a variable in a demo program supplied with the compiler, but they refused to identify the word. It was classified, it seems: a code word denoting something really important to them but meaningless to my pal. In an odd dance the suits pointed vaguely at listings while looking away because Jeff wasn't allowed to know this secret. Finally, he figured it out and made the change.
It seems the secret had been revealed in a book about the NSA. “For fun” a young programmer used the word as a variable name. The agency didn't share in the developer's guffaws.
Software has to do two things: it's gotta work, of course. But just as importantly, the code must clearly convey its intentions to the next developer.
Those are basic coding rules. An extra bit of wisdom might be to avoid dispensing national secrets in the source.
And don't be cute.
Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. Contact him at . His website is .
I once worked with a nut who thought fewer source code lines meant better code. His source was so bad (compacted) it was about as useful as looking at a hex memory dump! Whacked. Takes all kinds I guess…
– Jack Crack
Actually 42 was the _answer_ to life, the universe, and everything.
From my faint recollections of the book (which itself is more or less a faint recollection of about 100 LSD-induced stories)….
Upon being given the answer, the mice realised that they should have been computing the question as well as the answer to life, the universe, and everything. So they built another, bigger computer (actually it was the planet Earth) to compute this.
Of course, just before the question was to come out, the Earth was destroyed to make way for a new inter-planetary expressway.
IMHO, these books (a trilogy in five parts) are absolutely necessary reading for any geek.
– James Morrison
I actually knew an assembly language coder who used only lower case “l” (ell) and “0” (zero) for statement and variable labels, and kept his own scratch sheet with explanations. Since no-one else had access to his notes, only he could maintain his code, which was, of course, his objective.
Somewhat similar is the experience of decoding object code, producing assembly source. I did this, infrequently, for the IBM 1401, where the object code was sort of mnemonic, anyway. The level of concentration and effort was intense, probably one of the most difficult programming tasks I ever did. Later, the DEC PDP-8 was decoded in a similar fashion.
None recently, but your examples of “cute” in “C” brought back the memory.
– Tom Woods
I agree with your premise but sometimes we have to find outlets for the mirth. OK, obscure code references can be trouble. My personal outlet is pre-release product names. I am responsible for designing the Programmable Highspeed Ramp Output Generator (PHROG) and for silkscreening a memory board (RAMBOARD) so that it read as RAMBO. When no one had any ideas for a product name, I came up with an acronym HERPES, which I threatened to stick on the project unless someone came up with a better one. Not surprisingly, people were motivated rather than risk being “one of the HERPES team”.
As a manager (a sometimes hat), I find semi-appropriate outlets for humor good for a team.
If someone in my group wanted to do a while (42), I would tell them to document it to be sure time wasn't wasted in the future, but would let it stand.
– Steve Nordhauser
I once reviewed some C code written by a co-worker. One routine was a function to update the watchdog, where two values were written to a register to keep the watchdog from resetting the CPU. The function was named “Feed_The_Puppy” and the two constants were defined as “WATER” and “CHOW.” While I would clearly call that cute, I don't think that it changed the readability of the code, for better or worse. It did make me chuckle, though.
– Tom Bachmann
I had a similar experience. Some years ago I had to do some fixes on an old project. There were some assembler sources written by some “imaginative” programmer – he used fruit names as labels… like “call Pears”, “djnz r0, apples”, etc. Although it looked funny, it was not funny to figure out what the hell Pears does… And of course, there were no comments in the source…
– Dejan Durdenic
Years ago, we bought some PCL5 code for a laser printer we were developing. One of the programmers who had written this code seemed to have taken perverse pride in being cute. Not only was the code obscure, but the comments consisted of Monty Python references which, if – and only if – you were an MP fanatic, actually made sense in a warped way. This programmer (I use this descriptor quite loosely) had the initials GAD. Collectively, we called this code E-GAD code. Eventually, he was fired, but not before he did tremendous damage to his company's reputation.
– James Thayer
I once worked with a guy who created 2 macros in every module he wrote…
#define RepeatFrom goto
#define ExitTo goto
…which he employed liberally, depending on whether he wanted a backward or forward branch. His stated reason was that in a previous position it was strictly illegal to use goto statements.
– Kevin Kilzer
Sometimes it's the legacy environment that forces “cute” names. I was on a program once where due to legacy tools and policies that forced certain naming conventions, the name of the system had to be reflected in the variable and routine name(s). My system was “UFC”, which caused no end of fun in trying to figure out *printable* names…
– Angelo Keene
At the first place I worked, 25 years ago, we hired a software/hardware engineer. He had started coding when there were no assemblers and no debug tools, so he had developed techniques that let him code to the point that he didn't do a lot of debugging. The code pretty much just worked when he wrote it.
He didn't use labels for jumps that were within the near range of the part – he would calculate the offset and write something like $+10, or $-25, the $ sign representing the current program counter. Anything that was a long jump or subroutine call was done with a label.
Now this looked impressive, and in truth this guy did understand what he was doing, but by the time we were doing this, the assemblers could handle as many labels as you cared to use. So we had about half the code written with labels and half done his way.
Once I coded up a routine to handle a DMA system for accepting data from the outside world with his system – basically writing it on paper and then playing microprocessor on the desk. It took me about a week to write and debug it on my desk – when I entered it, it didn't work – because I left a line of code out. Once I found the error, the routine worked correctly. Now I figured that if I had done it “normally” (write and debug using the emulator) I probably would have used about the same amount of time, so I've stuck with that method to date. However it did impress upon me the difficulty of working with little or no tools for debug.
When I look at designs today, I always get the best tools available for debug, because they really reduce time to market on a project. One of the bad things with new micros is the lack of good emulators. There are things an emulator can do that these new BDM micros can't.
Certainly today there is little reason to see examples of wacky coding, yet we still do. There are a lot of guys out there who refuse to move beyond a certain point – the effort required to learn new methods can be exhausting, and the older you get the less juice you have to do it.
Most likely people will still be decrying this in another 25 years. It was a problem then… and it still is today.
– Tom Mazowiesky