
Return to Karnaugh’s isle

Jack tackles Karnaugh maps again before meandering through flip-flops and winding up at ripple counters.

A few months ago, we were talking about symbolic logic, George Boole, and Karnaugh maps.1,2,3 Because it's been a while, we should probably review the bidding.

As you know, all programming languages (I think) support Boolean algebra. They have Boolean operators, which you can combine to write Boolean expressions. In C and C++, the operators are “&&”, “||”, and “!” (they also support “&”, “|”, and “^”, but let's not go there).

Pascal uses the wordier, but certainly more easily remembered, AND, OR, and NOT.

Most programmers use those logical operators routinely but not necessarily to best effect. My whole purpose in discussing Boolean algebra was to encourage you to use the tools of symbolic logic to help you solve knotty programming problems. Sometimes, when the decision to do one action or another seems too confusing and convoluted to get your arms around, try writing down the Boolean algebra or applying a Karnaugh map. You might find that the logic simplifies right down.
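To make that concrete, here's a minimal sketch in C (the flag names ready, error, and retry are hypothetical, chosen only for illustration). The brute-force loop checks that a convoluted condition and its algebraically simplified form agree over the whole truth table:

/* A minimal sketch, not from the column: a convoluted condition and its
 * simplified form, checked against each other over the whole truth table. */
#include <stdio.h>
#include <stdbool.h>

int main(void)
{
    for (int i = 0; i < 8; i++) {
        bool ready = i & 1, error = i & 2, retry = i & 4;

        /* The original, convoluted test. */
        bool messy  = (ready && !error) || (ready && error && retry);

        /* After the distributive and absorption laws:
         * ready && (!error || (error && retry))  ==  ready && (!error || retry) */
        bool simple = ready && (!error || retry);

        printf("ready=%d error=%d retry=%d  messy=%d simple=%d\n",
               ready, error, retry, messy, simple);
        if (messy != simple)
            printf("  mismatch!\n");   /* never happens */
    }
    return 0;
}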

That was the message—the whole message. Just an encouragement to add yet one more tool to your toolbox.

Of course, you can't apply a Karnaugh map if you don't know what it is or how to use it. And you can't reduce a logic equation if you don't know the rules of symbolic logic. I chose to teach you these things by describing some electronic circuits that implement logic functions. By then, we were hooked. I had a tiger by the tail. I'll spare you the whole of a thoroughly disgusting (but not dirty) joke, and offer just the punch line: “It was all in one string.”

I launched into a discussion of logic circuits, NAND and NOR gates, and flip-flops. I waxed nostalgic about Don Lancaster and RTL logic and seven-segment displays.1,2,3,4

I was taken aback by the overwhelming response to my references to logic gates and flip-flops. I seem to have struck a nerve. This magazine may be called Embedded Systems Programming, but it's obvious you folks dabble in hardware as well.

Incidentally, I got tons of e-mail concerning Don Lancaster, whom I praised in my most recent column.4 Most of you wrote something like “I still have his CMOS Cookbook and use it frequently.” The name of the book changes—Don is a prolific writer—but the sentiment seems universal. I guess even I had never considered what a national—no, international—treasure Don is, or how much influence he has had on so many of us. Nice going, Don.


Figure 1: Seven-segment decoding—Segment d

Before we get too far into this column, I need to give a couple of corrections. In Figure 6 of my January column, I gave the Karnaugh maps of the logic for driving a seven-segment display.3 I got the Karnaugh map for Segment d right, but the logical equation wrong. The map is shown here in Figure 1, and the equation is:


(Equation 1)

Sharp-eyed readers (and boy, are your eyes sharp!) might notice a slight change in the Karnaugh map. This time, I used the exclusive-OR operator, ⊕, to capture the two 1's in the second column.

There's a limit to how much time we can spend on hardware in this column. But if you're eager for more of the same, here are two excellent references for you. The first is a great web site, www.play-hookey.com/digital/basic_gates.

Webmaster Ken Bigelow has done a phenomenal job of explaining digital circuitry. He starts with the notion of simple logic gates, and goes on to explain the functioning of many more complex circuits, including counters, computing elements, and so on. Best of all, his site has live, Java-based examples of circuits for you to play with. Forget Electronics Workbench, or any of the other circuit simulators. Just go to Ken's site, tickle the inputs of his models, and watch them change state. It's outstanding stuff.

The other reference is Charles Petzold's incredibly good book, Code: The Hidden Language of Computer Hardware and Software (Microsoft Press, 2000). Petzold starts with even more primitive circuits than I did. Figure 2 shows his initial circuit for an AND gate. I don't think it can get much simpler than this.


Figure 2: The ultimate AND gate

Having thus whetted our appetites, Petzold goes on to relay logic and beyond, giving a delightful tour through the innards of computers. I highly recommend both sources. Petzold even gives the Radio Shack part number for the relays to build your own computer. I still haven't decided whether he's serious or not, but if you like to tinker with hardware, following his instructions would be a great way to do it.

More on flip-flops
In my column “A Primer on Karnaugh Maps,” in which I first introduced Karnaugh maps, we worked out the decoding needed to implement a decade counter. Drawing the maps and minimizing the logic was a fun exercise, but not a very practical one. In the real world, no one is likely to implement that logic, because in the real world, counters get built from flip-flops, not gates.


Figure 3: A simple flip-flop

That statement requires a little explanation, because all flip-flops are made from gates. Figure 3 shows a copy of the simple flip-flop I showed you a few months ago. I've changed the notation just a bit to match Ken Bigelow's model.

Here's how it works. The default state of the two inputs is high (logic 1). Note the bars over the symbols for the inputs. This tells us that they're active low.

When one of its inputs is held high, a NAND gate serves as an inverter: whatever logic level the other input has, the output will be the opposite level. Suppose that the output of G1, which is Q, is currently low. That low output feeds back into G2, forcing G2's output high. G2's high output returns to G1, where it only serves to hold G1's output at logic level 0. The circuit has memory, and it will stay in its current state until something else happens to change it.

Now suppose we drive G1's input to its active state, which is low. Because the other input is currently high, G1 inverts the input, switching its output to high. That output feeds back into G2, whose output goes low. The electrons go round and round, and in the end, the flip-flop's output, Q, has been set to logic level 1 (high). Its complementary output will be, of course, low.
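If you'd like to watch that feedback settle without wiring anything up, here's a rough software model of the cross-coupled NAND pair in Figure 3. It's only a sketch; the names follow the figure, with s_n and r_n standing for the active-low inputs:

/* A rough software model of the cross-coupled NAND latch in Figure 3. */
#include <stdio.h>

static int q = 0, qb = 1;              /* assume the latch starts reset      */

/* Iterate the two NAND equations until the outputs stop changing.
 * A real latch settles in nanoseconds; a few passes suffice here. */
static void settle(int s_n, int r_n)
{
    for (int i = 0; i < 4; i++) {
        int new_q  = !(s_n && qb);     /* G1: NAND of /S and Q-bar */
        int new_qb = !(r_n && q);      /* G2: NAND of /R and Q     */
        q = new_q;
        qb = new_qb;
    }
    printf("/S=%d /R=%d  ->  Q=%d /Q=%d\n", s_n, r_n, q, qb);
}

int main(void)
{
    settle(1, 1);   /* both inputs high: latch just holds its state */
    settle(0, 1);   /* pull /S low: Q is set to 1                   */
    settle(1, 1);   /* release /S: the latch remembers Q = 1        */
    settle(1, 0);   /* pull /R low: Q is reset to 0                 */
    settle(1, 1);   /* release /R: the latch remembers Q = 0        */
    return 0;
}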

This simple circuit is the basis for every bit of your computer's gigabyte or so of memory. It's also the basis for counters and any other circuit that must remember what happened last.

If we're going to use the flip-flop a lot, it makes sense to assign a symbol to it, rather than redrawing Figure 3 every time. The symbol is shown in Figure 4a. It's called a set-reset flip-flop, an R-S flip-flop, or sometimes simply a latch. The latch will remember what state it was last put into, but it does everything asynchronously. You're not likely to find this particular latch in any chip catalog, because it's simply too limited in its behavior.

Recall that the inputs to the latch are active low. We must hold both inputs high except when we're trying to change the state of the flip-flop. Then we must take one and only one input low.

What happens if both inputs are low at the same time? A disaster. The NAND condition is not satisfied by either gate, so both outputs of the circuit go high. This is no good, because there's a sort of understanding—a gentleman's agreement, if you will—that Q and Q-bar must always be complementary. If both inputs are low at the same time, the last input to go high wins. The final state of the flip-flop depends on how shaky our fingers were on the set-reset buttons.

Worse yet, the inputs can go through intermediate states that make no sense. In a perfect world, all gates work exactly as they're supposed to, and their transitions are very predictable. But we don't live in a perfect world. The speed of light is finite, and each gate's propagation delay will be slightly different from the others', so it's easy to see very brief transients on the output lines. This is the source of the dreaded “glitch.”

Clocked logic
We can civilize the R-S flip-flop in a couple of ways. One way is to add an enable pin. The idea is that circuitry on the enable pin blocks the inputs from getting to the flip-flop proper. You can change the inputs all you like, you can have them twitching and glitching, but the state of the flip-flop stays serenely unperturbed until the enable is asserted.

An even better solution is to make the extra pin a clock instead of an enable. The difference is, where the enabled flip-flop will respond to input changes as long as the enable pin is asserted, the clocked variant only responds at a specific instant of time, which is when the clock makes a transition—usually from high to low. This means that the inputs can change anytime in the sequence, except during that brief time when the clock is in transition.
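The distinction is easy to see in a few lines of C. This sketch (the stimulus is made up, just for illustration) drives a level-enabled latch and a falling-edge-triggered flip-flop from the same data line. The latch follows every wiggle while its enable is high; the flip-flop samples only at the falling edge:

/* A sketch contrasting a level-enabled latch with a falling-edge-triggered
 * flip-flop. The stimulus arrays are invented for illustration. */
#include <stdio.h>

int main(void)
{
    int latch_q = 0, ff_q = 0;
    int prev_clk = 0;

    /* (clock/enable, data) pairs over time; the data line "glitches"
     * while the clock is high. */
    int clk[]  = { 1, 1, 1, 0, 0, 1, 1, 0 };
    int data[] = { 1, 0, 1, 0, 0, 0, 1, 1 };

    for (int t = 0; t < 8; t++) {
        if (clk[t])                       /* level-enabled: tracks every wiggle */
            latch_q = data[t];
        if (prev_clk == 1 && clk[t] == 0) /* edge-triggered: falling edge only  */
            ff_q = data[t];
        prev_clk = clk[t];
        printf("t=%d clk=%d d=%d  latch=%d  flip-flop=%d\n",
               t, clk[t], data[t], latch_q, ff_q);
    }
    return 0;
}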


Figure 4: Flip-flops

Figure 4b shows a clocked R-S flip-flop. Note the difference in logic level on the inputs.

Clocked or unclocked, the R-S flip-flops both suffer from the weakness noted earlier, which is that the R and S inputs can never both be active at the same time. Clearly, this looks like a Murphy's law sort of problem. We can promise to be good, and never allow this condition to happen, but Murphy's law guarantees that it will, despite our best intentions.

The Type D flip-flop in Figure 4c represents one solution, in which we simply take away—or, rather, make invisible—one of the inputs. The R and S inputs aren't really gone; they've simply moved inside the chip, and are forced by the circuitry to always have different logic levels. The D input is active at both levels: whatever value it has, the flip-flop will track it on the next clock cycle. If you have a string of type D flip-flops connected in a chain, a pulse that's either logic level high or low will propagate down the chain, one step per clock cycle. That's a digital delay line, and it has all kinds of neat uses.
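Here's a quick model of that delay line, again just a sketch: a chain of four type D flip-flops, each one's Q feeding the next one's D, with a single 1 presented at the input for one clock:

/* A quick model of a digital delay line: on every clock tick the pulse
 * moves one stage down the chain. Stage count is arbitrary. */
#include <stdio.h>

#define STAGES 4

int main(void)
{
    int q[STAGES] = { 0 };             /* the Q output of each flip-flop    */
    int d_in = 1;                      /* a single 1 presented at the input */

    for (int tick = 0; tick < 6; tick++) {
        /* On the clock edge, every flip-flop latches the output of the one
         * before it; update from the far end so nothing is overwritten. */
        for (int i = STAGES - 1; i > 0; i--)
            q[i] = q[i - 1];
        q[0] = d_in;
        d_in = 0;                      /* the input pulse lasts one clock   */

        printf("tick %d:", tick);
        for (int i = 0; i < STAGES; i++)
            printf(" %d", q[i]);
        printf("\n");
    }
    return 0;
}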

I should mention that, despite the basic clocked nature of most flip-flops, they still retain the asynchronous set and clear inputs. These are necessary to initialize the flip-flops into known states, such as during power-on reset. I've seen people use the asynchronous pins as part of the logic, but I strongly advise against it.

The king of flip-flops, because of its versatility, is the J-K flip-flop shown in Figure 4d. In addition to the usual features—synchronous and asynchronous set and clear—the J-K flip-flop has one more delightful feature, which we'll get to a bit later. For now, let's just revisit the decade counter.

Table 1: A synchronous decade counter

ABCD abcd Action
0000 0001 Set D
0001 0010 Reset D, set C
0010 0011 Set D
0011 0100 Reset D, reset C, set B
0100 0101 Set D
0101 0110 Reset D, set C
0110 0111 Set D
0111 1000 Reset D, reset C, reset B, set A
1000 1001 Set D
1001 0000 Reset D, reset A

In the December issue, we just tacitly assumed that the outputs of the logic would go to some other place, suitable for recording the next count.2 But that's not really how counters work. In reality, the count is kept in the same four flip-flops, which are changed as needed each time the clock ticks. Table 1 shows the actions needed. Here, I've let ABCD represent the current state of the flip-flops (A is the high bit), and abcd the new state of the same flip-flops.
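To see the pattern in Table 1 at a glance, here's a two-minute sketch in C: the new state abcd is nothing more than the old count plus one, wrapping from 9 back to 0. The bit names follow the table, with A as the high bit:

/* A tiny sketch of what Table 1 encodes: next state = (count + 1) mod 10. */
#include <stdio.h>

int main(void)
{
    for (int count = 0; count <= 9; count++) {
        int next = (count + 1) % 10;
        printf("ABCD = %d%d%d%d  ->  abcd = %d%d%d%d\n",
               (count >> 3) & 1, (count >> 2) & 1, (count >> 1) & 1, count & 1,
               (next  >> 3) & 1, (next  >> 2) & 1, (next  >> 1) & 1, next  & 1);
    }
    return 0;
}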


Figure 5: Karnaugh maps

The Karnaugh maps, shown in Figure 5, come next. These are going to look a bit different from others you've seen, because I want to show both set and reset actions on the same maps. Right off the bat, however, you can see in Table 1 that there's a distinct pattern to the actions, broken only by the transition from 1001 to 0000 at count 9. This is hardly surprising, since without this “anomaly,” we'd have a binary rather than decimal counter.

The actions to be taken are:


(Equation 2)

A circuit that implements this decade counter is shown in Figure 6.


Figure 6: Decade counter first cut

J-K's secret
We must admit, the logic of Figure 6 is pretty busy. Even the connections for D, which does nothing but toggle between 0 and 1, are busy, with the cross-connections you see in the figure. The circuit requires five AND gates, one of them a 3-input gate. And this circuit is, indeed, the circuit you'd have to use if you were going to use clocked R-S flip-flops.

Fortunately, there's another kind.

Recall that, in the circuit of Figure 3 (or 4a), we're not allowed to assert both R and S inputs at the same time. They must be maintained at logic 1 except when we are changing the state of the flip-flop. Then, one and only one of the inputs should go low.

To keep people from disobeying the rules, the designers of the Type D flip-flop simply took away one of the inputs. But the designers of the J-K flip-flop had a better idea. They asked themselves, “Instead of having the circuit fail if both inputs are asserted (active high, for the J-K case), why don't we just have it do something useful?” Perhaps looking at Figure 6, they realized that toggling—changing state regardless of its current value—is a useful behavior. So that's how J-K flip-flops behave. If only one of the inputs is asserted (driven high) at a time, the J-K flip-flop works just like the clocked R-S flip-flop. But if both are asserted together, the flip-flop will change state on the next clock cycle.
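Boiled down to code, that behavior is a tiny next-state function. Here's a sketch, with the usual understanding that the update happens at a clock edge:

/* The J-K behavior described above: hold when neither input is asserted,
 * set on J, reset on K, toggle when both are asserted together. */
#include <stdio.h>

/* Returns the new Q, given J, K, and the current Q, at a clock edge. */
static int jk_next(int j, int k, int q)
{
    if (j && k)  return !q;   /* both asserted: toggle */
    if (j)       return 1;    /* set                   */
    if (k)       return 0;    /* reset                 */
    return q;                 /* neither: hold         */
}

int main(void)
{
    int q = 0;
    q = jk_next(1, 0, q);  printf("after J: Q=%d\n", q);   /* set    -> 1 */
    q = jk_next(0, 1, q);  printf("after K: Q=%d\n", q);   /* reset  -> 0 */
    q = jk_next(1, 1, q);  printf("toggle:  Q=%d\n", q);   /* toggle -> 1 */
    q = jk_next(1, 1, q);  printf("toggle:  Q=%d\n", q);   /* toggle -> 0 */
    q = jk_next(0, 0, q);  printf("hold:    Q=%d\n", q);   /* hold   -> 0 */
    return 0;
}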


Figure 7: Logic for J-K flip-flops

Figures 7b through d show the Karnaugh maps for implementing the counter with J-K flip-flops. I've left out Figure 7a because it doesn't change; it's the same as Figure 5a.

Years ago, I adopted a convention that uses the asterisk character to stand for “toggle.” The figures use that convention where it's appropriate.


Figure 8: Decade counter with J-K flip-flops

Obviously, the change in Figure 7d is dramatic. We now need no logic at all to drive flip-flop D; it simply toggles at every clock cycle. The circuit derived from Figure 7 is shown in Figure 8. As you can see, it's quite a bit simpler.
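If you'd like to convince yourself that the J-K version really counts to ten, here's a behavioral model in C. The J-K input terms are one decoding consistent with Table 1 (D toggles every clock, C toggles on D AND NOT A, B toggles on C AND D, and flip-flop A gets J = B AND C AND D, K = D); it may or may not match the exact wiring of Figure 8:

/* A behavioral model of the decade counter, with J-K inputs chosen to be
 * consistent with Table 1. A sketch of one workable decoding. */
#include <stdio.h>

static int jk(int j, int k, int q)           /* J-K next-state rule */
{
    return (j && k) ? !q : (j ? 1 : (k ? 0 : q));
}

int main(void)
{
    int a = 0, b = 0, c = 0, d = 0;          /* A is the high bit */

    for (int tick = 0; tick < 12; tick++) {
        printf("count = %d%d%d%d (%d)\n", a, b, c, d, 8*a + 4*b + 2*c + d);

        /* Evaluate all J-K inputs from the *current* state, then clock
         * every flip-flop at once (this is a synchronous counter). */
        int ja = b && c && d, ka = d;
        int jb = c && d,      kb = c && d;
        int jc = d && !a,     kc = d && !a;
        int jd = 1,           kd = 1;        /* D just toggles */

        int na = jk(ja, ka, a), nb = jk(jb, kb, b);
        int nc = jk(jc, kc, c), nd = jk(jd, kd, d);
        a = na; b = nb; c = nc; d = nd;
    }
    return 0;
}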

Ripple counting
You might think that we've simplified the decade counter as much as possible, but you'd be wrong. The key to further simplification is to recognize that nobody ever told us that all the clock inputs have to come from the same clock. The flip-flop doesn't care who's driving its clock input; all it knows is, it's going to respond each time the clock goes low.

From the action table, and also from Figure 8, it's clear that the value of flip-flop D is key to what happens. Almost every input of every other flip-flop is ANDed with the Q output of flip-flop D. We can eliminate these gates by using D, rather than the external clock, to drive the other flip-flops.

Counters that use this kind of connection are called ripple counters. They are, in fact, the electronic equivalent of the ripple carry in a computer adder. The value of flip-flop C changes only when D is going from 1 to 0. Connecting the output of flip-flop D directly to the clock input of C implements this behavior with no other logic.
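Here's the ripple idea in a few lines of C, using a plain 4-bit binary counter so the principle stands out; the decade version in Figure 9 still needs a little extra logic, as the maps above show. Each stage toggles only when the stage before it falls from 1 to 0:

/* A sketch of the ripple principle: a 4-bit binary counter in which each
 * flip-flop is clocked by the stage before it, with no gates at all. */
#include <stdio.h>

#define BITS 4

int main(void)
{
    int q[BITS] = { 0 };               /* q[0] is the fastest (low) bit      */

    for (int tick = 1; tick <= 16; tick++) {
        int falling = 1;               /* the external clock just fell; it   */
                                       /* drives only the first flip-flop    */
        for (int i = 0; i < BITS && falling; i++) {
            q[i] = !q[i];              /* this stage toggles on that edge... */
            falling = (q[i] == 0);     /* ...and its own 1 -> 0 transition   */
                                       /* becomes the next stage's edge      */
        }

        printf("after tick %2d: %d%d%d%d\n",
               tick, q[3], q[2], q[1], q[0]);
    }
    return 0;
}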

Ripple counters are not always usable. For counters with many bits, we might find that the summed gate delays are not acceptable. Most of the time, though, ripple counters work just fine. If it's possible to use the flip-flops in this manner, that's what we should do.


Figure 9: The ripple counter

The final version of the decade counter is shown in Figure 9. It's about as simple as a decade counter can get.

Jack Crenshaw is a senior software engineer at Spectrum-Astro and the author of Math Toolkit for Real-Time Programming, from CMP Books. He holds a PhD in physics from Auburn University. E-mail him at .

Endnotes

  1. Crenshaw, Jack, “It's Only Logical,” Programmer's Toolbox, Embedded Systems Programming, November 2003, p. 11.
  2. Crenshaw, Jack, “A Primer on Karnaugh Maps,” Programmer's Toolbox, Embedded Systems Programming, December 2003, p. 11.
  3. Crenshaw, Jack, “Shedding Light on Seven-Segment Displays,” Programmer's Toolbox, Embedded Systems Programming, January 2004, p. 9.
  4. Crenshaw, Jack, “A Root-Finding Algorithm,” Programmer's Toolbox, Embedded Systems Programming, May 2004, p. 9.
