Contracts put the "I do" in code

March 01, 2007

Jack Ganssle

Languages should include explicit constructs that make it easy to validate the hidden and obvious assumptions that form the background to our code.

It's hard to remember life before the web. Standing in lines to buy stuff and ordering from out-of-date printed catalogs were frustrations everyone accepted. But today many of those problems are long gone, replaced by sophisticated software that accepts orders online. For instance, check out the result of an order I tried to place with Defender last year in Figure 1. I was tempted to complete the transaction to see my credit card burst into flames.


At least they didn't charge tax, though it would be nearly enough to pay off the national debt.

Thirty-five years ago, when I was a student in a college Fortran class, the instructors admonished us to check our results. Test the goesintas and goesoutas in every subroutine. Clearly, Defender's clever programmers never got this lesson. And judging from most of the embedded code I read today, few of us get it either. We assume the results will be perfect.

But they're often not. Ariane 5's maiden launch in 1996 failed spectacularly, at the cost of half a billion dollars, due to a minor software error: an overflow when converting a 64-bit float to a 16-bit integer. The code inexplicably didn't do any sort of range checking on a few critical variables.

Every function and object is saddled with baggage, assumptions we make about arguments and the state of the system. Those assumptions range from the ontological nature of computers (the power is on) to legal variable ranges (total for consumer transaction < $1,000,000.00) to subtle dependency issues (interrupts off). Violate one of these assumptions and Bad Things will happen.

Why should anything be right? Isn't it possible that through some subtle error propagated through layers of inscrutable computation a negative number gets passed to the square-root approximation? If one reuses a function written long ago by some dearly departed developer it's all-too-easy to mistakenly issue the call with an incorrect argument or to make some equally erroneous assumption about the range of values that can be returned. Consider:

float solve_pos(float a, float b, float c)
{
  float result;
  result = (-b + sqrt(b*b - 4*a*c)) / (2*a);
  return result;
}

which computes the positive solution of a quadratic equation. I compiled it and a driver under Visual Studio, with the results shown in Figure 2.

The code runs happily even as it computes an impossible result, as is C's wont. The division by zero that results when a is zero creates neither an exception nor a signal to the user that something is seriously, maybe even dangerously, amiss.

I can think of three ways this tiny snippet of code can fail.

Consider the square root: we know that a negative argument isn't allowed. But that's not really true; that's an implicit assumption, which may not be correct. In the complex domain different assumptions apply.

Bertrand Meyer thinks languages should include explicit constructs that make it easy to validate the hidden and obvious assumptions that form a background fabric to our software. His must-read Object-Oriented Software Construction (Prentice-Hall PTR, 2nd Ed. 2000) describes Eiffel, which supports Design By Contract (DBC, and a trademark of Eiffel Software). This work, a massive tome at 1,200 pages that's brilliantly written but sometimes pedantic, is the best extant book about DBC.

In important human affairs, from marriage to divorce to business relationships, we use contracts to clearly specify each party's obligations. The response "I do" to "Do you promise to love and to hold" is a contract, an agreement between the parties about mutual expectations. Break a provision of that or any other contract and lawyers will take action to mediate the dispute or to at least enrich themselves. Meyer believes software components need the same sort of formal and binding specification of relationships.

DBC asks us to explicitly specify these contractual relationships. The compiler will generate additional code to check them at run time. That immediately rules the technique out for small, resource-constrained systems. But today, since a lot of embedded systems run with quite a bit of processor and memory headroom, it's not unreasonable to pay a small penalty to get significant benefits. If one is squeezed for resources, DBC is one of the last things to remove. What's the point of getting the wrong answer quickly?

And those benefits are substantial:
  • The code is more reliable because arguments and assumptions are clearly specified in writing.
  • The contracts are checked every time the function is called so errors can't creep in.
  • It's easier to reuse code since the contracts clearly specify behaviors; one doesn't have to reverse engineer the code.
  • The contracts are a form of documentation that always stays in-sync with the code itself. Any sort of documentation drift between the contracts and the code will immediately cause an error.
  • It's easier to write tests since the contracts specify boundary conditions.
  • Function designs are clearer, since the contracts state, very clearly, the obligations of the relationship between caller and callee.
  • Debugging is easier since any contractual violation immediately tosses an exception, rather than propagating an error through many nesting levels.
  • Maintenance is easier since the developers, who may not be intimately familiar with all of the routines, get a clear view of the limitations of the code.
  • Peace of mind. If anyone misuses a routine, he'll know about it immediately.

Software engineering is topsy-turvy. In all other engineering disciplines it's easy to add something to increase operating margins. Beef up a strut to take unanticipated loads or add a weak link to control and safely manage failure. Use a larger resistor to handle more watts or a fuse to keep a surge from destroying the system. But in software a single bit error can cause the system to completely fail. DBC is like adding fuses to the code. The system is going to crash; it's better to do so early and seed debugging breadcrumbs than fail in a manner that no one can subsequently understand.

Let's look at a common process that occurs in many households. How does one get a teenager up for school? There are three main steps, though some may be repeated N times, where N may be a very large number:

  • Wake up the kid.
  • Wake up the kid again.
  • Scream at the kid.

If we were implementing these in software using DBC, the code might look something like that in Listing 1.


Each of the three routines has dependencies called "preconditions." A precondition must be satisfied before the function can execute. In Wake_kid if the precondition Kid_is_in_bed is not true, an error is thrown, the cops get called, and dad's ulcer acts up.

Scream_at_kid has a "postcondition," something that must be true when the routine exits. As long as the kid isn't conscious, the actor continues to scream.

All of the routines have "invariants," conditions that must be true both before the function starts and when it exits. Invariants are supra-functional aspects of system behavior. In this case if the coffee machine isn't on, life is barely worth living so an exception is trapped. An invariant could be included as part of the pre- and postcondition expressions, but repetition is fraught with errors. And, by specifically indicating that the condition is invariant, one makes a strong statement about that condition. Invariants are tested before the precondition check and after the postcondition.
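Since Eiffel isn't an option for most of us, here's a hedged sketch of how the kid-waking routines might look in C, using assert-based macros; the names and details are mine, not a reproduction of Listing 1:

```c
#include <assert.h>
#include <stdbool.h>

/* Illustrative state; names are invented for this sketch. */
bool Kid_is_in_bed     = true;
bool Kid_is_conscious  = false;
bool Coffee_machine_on = true;

/* If the coffee machine isn't on, life is barely worth living */
#define INVARIANT()       assert(Coffee_machine_on)
#define PRECONDITION(c)   assert(c)
#define POSTCONDITION(c)  assert(c)

void Wake_kid(void)
{
  INVARIANT();
  PRECONDITION(Kid_is_in_bed);   /* else: error, cops, dad's ulcer */
  /* gentle nudge ... */
  INVARIANT();
}

void Scream_at_kid(void)
{
  INVARIANT();
  while (!Kid_is_conscious) {
    /* scream ... */
    Kid_is_conscious = true;     /* eventually it works */
  }
  POSTCONDITION(Kid_is_conscious);
  INVARIANT();
}
```

Note the ordering: the invariant is checked on entry before the precondition, and again on exit after the postcondition, matching the rule above.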

A precondition fault signals a problem in the caller; a postcondition fault means the function itself failed.

A function (or class) is correct only if it satisfies the invariants and pre- and postconditions. These are formal contracts that the client and supplier make. In defensive programming one learns to check input arguments, for instance. Contracts are the meta-pattern of argument checking, as they ensure the inputs, outputs, and stable, unchanging aspects are all correct.

Here's another example. When I do arithmetic by hand I usually end with a postcondition. To subtract 1.973 from 2.054 I'd do this:

    2.054
  - 1.973
  -------
    0.081    (check: 0.081 + 1.973 = 2.054)

Long experience has taught me that my manual calculations are sometimes wrong, and a quick check costs little. That extra step of adding the result to the subtrahend is nothing more than a postcondition. In C:
float subtract(float minuend, float subtrahend)
{
  float temp;
  temp = minuend - subtrahend;
  postcondition: minuend == temp + subtrahend;
  return temp;
}

(Assuming there were some construct like "postcondition:" in C.) Clearly, this test guarantees the code is correct. Yup, it doubles the function's workload. But this is a trivial example; we'll see more realistic ones later.
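Until such a construct exists, one plausible stand-in is a thin macro over assert(); this is my sketch, not the article's syntax:

```c
#include <assert.h>

/* A "postcondition" approximated with assert(); hedged sketch only. */
#define POSTCONDITION(cond) assert(cond)

float subtract_checked(float minuend, float subtrahend)
{
  float temp = minuend - subtrahend;
  /* the by-hand check: adding the result back to the subtrahend
     must recover the minuend (exact for nearby operands; general
     floating-point use may need a tolerance) */
  POSTCONDITION(minuend == temp + subtrahend);
  return temp;
}
```

A macro like this also makes it easy to compile the checks out of a shipping build via NDEBUG, though leaving them in is often the wiser choice.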

How is this different from a C-style assertion? An assertion expresses the developer's belief about correctness at one point in time, at the beginning, end, or somewhere in the middle of the routine. DBC's constructs, instead, state things that are Truth. They're basic truisms about the structure of the system that stay carved in stone for longer than the scope of a single function. Though this may seem a minor philosophical point, I hope you'll see the distinction as this discussion unfolds.

Yes, it's possible to implement these ideas using assertions. But the concept transcends the limited single-node debugging zeitgeist of an assertion. Next month we'll look at syntax that goes beyond anything possible with a C-style assert.

It's worth knowing a little about Eiffel even though the language has essentially zero market share in the embedded space, as Eiffel's built-in DBC support has influenced the language people use when discussing Design By Contract. The keyword "require" means a precondition. "Ensure" indicates a postcondition, and, unsurprisingly, "invariant" means just what you think it means.

Good contracts
Contracts are not for detecting erroneous inputs. For instance, you wouldn't use a contract to flag a typo from a keyboard, but you may use one if the keyboard bursts into flames. Good contracts simply supervise relationships between the function and its caller. "I, the function, promise to fulfill these conditions." "And I, the caller, promise to fulfill these." Then the entire program can erupt in happy mazel tovs.

Software can fail due to outright bugs--programming mistakes--and exceptional conditions (events that are unwanted but not unexpected). Since there's usually no recovery from the former it's wise to trap those with contracts. The latter covers events like missed data packets, bad input, and noisy analog, which should all be handled by code that takes some remedial action.

Good contracts supervise software-to-software relationships, not software to users.

Good contracts check things that seem obvious. Though any idiot knows that you can't pull from an empty queue, that idiot may be you months down the road when, in a panic to ship, that last minute change indeed does assume a "get()" is always OK.
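The empty-queue case might look like this; the queue layout and names are hypothetical, invented for illustration:

```c
#include <assert.h>

#define QSIZE 16

/* A minimal fixed-size ring buffer, for illustration only. */
typedef struct {
  int buf[QSIZE];
  int head, tail, count;
} queue_t;

int get(queue_t *q)
{
  assert(q->count > 0);           /* precondition: never pull from empty */
  int v = q->buf[q->head];
  q->head = (q->head + 1) % QSIZE;
  q->count--;
  return v;
}

void put(queue_t *q, int v)
{
  assert(q->count < QSIZE);       /* precondition: never push when full  */
  q->buf[q->tail] = v;
  q->tail = (q->tail + 1) % QSIZE;
  q->count++;
}
```

Months later, when that panicked last-minute change calls get() on an empty queue, the assert fires at the point of the mistake instead of handing back stale data.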

Good contracts have no side effects. In C, if the contract contains a lone "=" rather than "==", something is wrong: the state of the system then changes each time the code tests the contract.
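The classic C trap is one keystroke away; this small (invented) example shows the difference:

```c
#include <assert.h>

/* A contract must observe state, never change it. */
int check_mode(int mode)
{
  assert(mode == 3);    /* observes: mode is left untouched           */
  /* assert(mode = 3);     also compiles, but assigns 3 to mode --    */
  /*                       a contract with a side effect              */
  return mode;
}
```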

If you don't know what a function should do in an abnormal situation, explicitly exclude that condition using a precondition.

Good contracts have invariants for anything that's structurally important to the function's behavior that may not change during the execution of the function. For instance, if a function relies on a global variable remaining unchanged, write an invariant for that.
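One way to express that in C is to snapshot the global on entry and compare on exit; the global and function here are hypothetical:

```c
#include <assert.h>

/* Hypothetical global the function depends on staying fixed. */
int config_baud = 9600;

int frame_time_ms(int bytes)
{
  int baud_on_entry = config_baud;        /* snapshot for the invariant  */
  int ms = (bytes * 10 * 1000) / config_baud;
  assert(config_baud == baud_on_entry);   /* invariant: global unchanged */
  return ms;
}
```

If an interrupt handler or a careless teammate ever rewrites config_baud mid-computation, the invariant catches it at the scene of the crime.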

Put contracts on things that seem obvious, as months or years later a change in the code or an unexpected hardware modification may create error conditions. If the scale's hardware simply can't weigh anything over 100 grams, define a contract to ensure that assumption never changes.

Stay tuned
Alas, I've run out of room and there's so much more to say. More next month. If you're not using Eiffel, don't despair. I'll show how to use DBC's powerful ideas even in C.

Thanks much to Geoff Patch in Australia for his insight and correspondence on this subject.

Jack Ganssle is a lecturer and consultant specializing in embedded systems' development issues.

