Asynchronous Logic Use - Provisional, Cautious, and Limited

Prudence seems to be the watchword for adopting asynchronous design practices. Disregarding the true believers at the pro and con edges of any debate, the majority of your responses to my recent column on asynchronous logic fall into the middle ground: acceptance in a provisional, cautious, and limited way.

Some of you have used it in very targeted, specific ways, keeping its limitations in mind. Others, after looking at it, have found it too much of a hassle to implement, but are keeping an open mind, hoping that some useful implementation will emerge.

In one typical response, Thomas A. Visel at Neuric Technologies, Inc., Chapel Hill, NC, said, “You raise some interesting points in the async versus sync question, and I think you have some good ones. I am open-minded but cautious.”

He was an early advocate of asynchronous logic, having picked up the technique from the push-forward timing design used in the old Illiac II when he was at the University of Illinois. From that introduction, he said, “Much of my logic was asynchronous.”

As he moved into larger and larger FPGA designs using asynchronous logic, he consistently ran into long debug times and ugly design patches. “Patches – especially when a design has more than a few – usually indicate incorrect architecture,” he said. “Moving to a purely synchronous methodology rapidly eliminated the patches and considerably reduced the remaining debug time.”

Based on such experiences, he said, it is difficult for him to consider using an asynchronous design methodology. “Simply put, when you find a better way of design, you are hard put to justify an approach that personal experience has shown to be problematic,” he said. “On the flip side, the millions of simultaneous transitions in synchronous logic beg for a better way, and that may well be asynchronous logic. I suspect that if the final solution is asynchronous, it will be driven by a well-defined design methodology and by CAD tools that enforce the methodology.”

But, he warns, if the tools don't both enforce good practice and bring better capability to the table, forget asynchronous design. “I have been bitten in the past, but I don't know everything and am open,” he said. “A lot of bright lads are being given new ideas, and some will catch hold in time.”

Less conservative, but still relatively cautious, is Ad Peeters, senior scientist and project leader of the Tangram Handshake Technology effort at Philips Research in Eindhoven, the Netherlands. While it is not clear to him whether the methodology will be useful in high-performance CPU design, he said, there is now sufficient evidence that self-timed circuits do have advantages in medium-complexity ICs for connectivity, secure identification, and possibly automotive applications.

“The technology, design method, and tool set that we have developed over the years is now being applied in several product groups at Philips Semiconductors,” he said. “This has led to several successful market introductions of ICs that are completely asynchronous.”

The technology advantages that he and his fellow designers have been able to exploit are in the area of reduced power consumption, reduced current peaks, and reduced electromagnetic emission. “The challenge for us typically is to translate the potential advantages of self-timed (hand-shaking) circuit operation into an advantage for an end product,” he said. “The low-power advantage, as an example, can be exploited to the extreme in situations where exceptions have to be monitored and handled. A clocked solution will have to set the clock-speed to the required reaction speed, whereas a self-timed solution can simply wait until the exception happens and then react immediately.”

Also, where a clocked circuit will then consume power at the required reaction rate, a self-timed circuit, by comparison, does so only at the much lower exception rate (near-zero). Another low-power option is the addition of digital functionality to an analog IC without impacting the power budget, or even without requiring a clock pin.
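Peeters' power argument comes down to simple arithmetic: a clocked monitor dissipates energy at the clock rate dictated by the required reaction time, while a self-timed monitor dissipates energy only at the exception rate. The back-of-the-envelope sketch below illustrates that ratio with purely hypothetical numbers (the energy-per-operation figure and rates are my own assumptions, not from Peeters):

```python
# Illustrative comparison of clocked vs. self-timed exception monitoring.
# All numbers are assumed for illustration only.

ENERGY_PER_OPERATION_NJ = 1.0  # assumed energy per check or reaction, in nanojoules

def clocked_energy(reaction_rate_hz: float, seconds: float) -> float:
    """Clocked design: the clock must tick at the required reaction rate
    at all times, so energy scales with the reaction rate."""
    return reaction_rate_hz * seconds * ENERGY_PER_OPERATION_NJ

def self_timed_energy(exception_rate_hz: float, seconds: float) -> float:
    """Self-timed design: the circuit waits quiescently and only switches
    when an exception arrives, so energy scales with the exception rate."""
    return exception_rate_hz * seconds * ENERGY_PER_OPERATION_NJ

# Suppose a 1-microsecond required reaction time (a 1 MHz clock),
# but exceptions that actually occur only about once per second:
clocked = clocked_energy(1_000_000, seconds=1.0)   # 1,000,000 nJ
self_timed = self_timed_energy(1.0, seconds=1.0)   # 1 nJ
print(f"clocked/self-timed energy ratio: {clocked / self_timed:.0e}")
```

With these assumed figures the clocked monitor burns a million times more energy than the self-timed one over the same second, which is the "near-zero" exception-rate advantage the next paragraph describes.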

While asynchronous logic may be a question mark as far as advanced high-speed, extremely high-density designs are concerned, Peeters is convinced that it has a definite future in the new computing environment we are in, albeit in areas that are not as sexy as multi-billion-transistor, high-performance CPU design.

“Rather, today's playing field may be in areas where power consumption is critical, and where end-users may not even be aware of the silicon used,” he said.

The views of these engineers and others I have talked to more or less reflect my own. Asynchronous logic has much going for it as a way to solve many of the problems facing us as we come up against a wall of inoperability.

Many problems must be solved, and significant investments made, to establish asynchronous logic not as a replacement but as a complementary methodology on an equal footing, with the same degree of effort put into education, research, and tools development.

One argument that is always raised is that the cost of retooling, new tools, and new processes to support asynchronous logic is not a financial burden the semiconductor and electronics industry can support. But how much is it costing the industry to move to all sorts of advanced processes and design methodologies to support and paper over the limitations of current synchronous logic designs?

For example, in order to maintain global clocking signals across system-on-chip designs consisting of many millions of transistors, how much is it costing the industry to move from aluminum to lower-resistance copper interconnect? Or to lower the dielectric constant of the insulating layers between those interconnects? Both were as far out on the edge of acceptability a few years ago as asynchronous logic is now.

And how expensive is it for companies such as AMD, IBM, HP, Intel and others to shift to advanced silicon-on-insulator processes to reduce power dissipation and consumption to reasonable levels on high-performance designs? Not even factored in here is the significant development investment that has gone into incorporating every power-conserving circuit technique design engineers can think of.

Peeters may be right that the appropriate place for asynchronous design methodologies is not in the leading-edge high-performance circuits that still seem to be front and center in most semiconductor vendors' minds. But if asynchronous logic offers its greatest benefits in designs that push neither clock rate nor circuit density to the limit, doesn't that make it ideal for what is emerging as the mainstream of the new netcentric computing environment?

Functionality, reliability, low power and security are what most traditional embedded designers as well as developers of the new breed of net-centric personal computing devices are asking for. True, they want performance, but not if it is more expensive and is done at the cost of the first four requirements.

I don't know how many times I have been told by embedded designers that all the performance they want is what the specifics of the design require and what the application may need in the near future. More performance in small-footprint designs than the application requires is overkill. And the same is true for many of the iAppliance devices in the mainstream of the new netcentric computing environment.

To me, that makes asynchronous logic the right technology at the right time. If not, it will be something that looks like it, tastes like it and feels like it, but does not have the unpleasant smell that some seem to associate with it right now.

Bernard Cole is the managing editor for embedded design and net-centric computing at EE Times and the editor of iApplianceweb. He welcomes contact. You can reach him at or 928-525-9087.
