"Successful Failures" Can Make a Comeback - Embedded.com


The late and not entirely unlamented Apple Newton is an example of what I like to call “successful failures”: companies, architectures and products that failed because of bad timing, bad execution, or bad design, but which had the core of an idea that eventually took over the industry.

Some just disappeared from the market. But many others, after making a grab for the brass ring of mainstream acceptance, settled for particular niches in the market. I can think of a dozen such examples; here are a few.

Although the Apple Newton has disappeared totally from the market as a commercial entity, the idea has not. In fact, a small coterie of enthusiasts still uses that precursor of the personal digital assistant (a term coined by Apple, I think). Many of its features and capabilities are part of the definition of virtually every small footprint, Internet-centric personal computing iappliance: PDAs, handheld computers, Web-enabled cell phones, and half a dozen variations on the theme. Although Apple no longer supports it, the Newton community has kept it current, in both hardware and software. Indeed, the Newton community may be more than just surviving. It may be growing, given the limitations of current offerings.

And who knows, with Apple Computer now focusing attention again on this segment, maybe the Newton will rise again as a fully supported, commercial product.

Another example, one from the realm of circuit and transistor device design, is the experience of bipolar and CMOS and their battle for share of mind and dollars. After a brief time on top of the world of integrated circuit design, bipolar was slowly driven out of the mainstream of high-density digital design by CMOS.

One of the last gasps of bipolar logic as an LSI and VLSI candidate was integrated injection logic. But while isolated in a niche in the market, admittedly a very large one, bipolar concepts and structures have worked their way into the mainstream of high-density digital design as BiCMOS, used by advanced microprocessor builders to get as much performance out of their architectures as possible.

There is also the example of heterojunction transistor devices, such as those built in gallium arsenide, which left silicon homojunction semiconductor devices gasping for air with their ability to achieve multi-gigahertz switching rates. And while particular instantiations of this methodology have not made it into the mainstream, the concept has. It takes the form of heterojunction silicon-germanium transistors, which are becoming a key element in virtually every wireless consumer electronics device.

It was not necessarily the limits of the technology or the product that spelled the end of these attempts to achieve mainstream status. Just as important was the potential they raised and the competitive reaction to them. Seeing their turf threatened, proponents of competing technologies and products worked fiercely to undercut the advantages of the interloper by pulling every technological trick they could think of.

That, I suspect, may be the fate of asynchronous design methodologies. Some of you have begrudgingly admitted to me that, at least conceptually, it is an elegant solution. However, the perceived complexities seem to outweigh the benefits in the minds of engineers I have talked to.

That synchronous design also has serious flaws is evident in the feverish efforts of companies to solve its inherent problems. But they are also doing it, I suspect, out of concern that they may have to give up the existing, well-understood synchronous methodologies for something that, in their eyes, is even more problematic.

Already in company announcements and at technical conferences within recent weeks, you can see some very imaginative techniques being discussed. Indeed, at the recent 15th IEEE International ASIC/SoC Conference, many of the techniques discussed are only slight variations of those used in optimizing for performance.

For example, researchers from Kyushu University (Fukuoka, Japan) suggest starting with the dynamic and static power consumed in data paths. Instead of using accepted standard-width data paths, designers of SoCs for embedded and small footprint computing and consumer iappliances could instead determine the actual range of values associated with each variable in a design. This information could then be used to scale both the data path and memory widths to accommodate power consumption and dissipation concerns.
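The idea can be sketched in a few lines of C. This is my own illustrative reconstruction of range-driven sizing, not the Kyushu researchers' actual method: given the real range a variable takes on, compute the narrowest bit width that holds it, and compare that against a standard 32-bit path under a crude assumption that dynamic power scales roughly with the number of bits toggling.

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical sketch of range-driven data-path sizing. Given the maximum
 * value a variable actually takes on, find the narrowest bit width that
 * can represent it, rather than defaulting to a standard-width path. */
static unsigned bits_needed(uint32_t max_value)
{
    unsigned bits = 1;           /* even the value 0 occupies one bit */
    while (max_value >>= 1)
        bits++;
    return bits;
}

/* Crude first-order assumption (mine, not the researchers'): dynamic
 * power in a data path scales roughly with the bits that toggle. */
static double power_saving_percent(unsigned standard_width, unsigned needed_width)
{
    return 100.0 * (standard_width - needed_width) / standard_width;
}
```

For instance, a variable known to stay in the range 0 to 1023 needs only a 10-bit path, so against a conventional 32-bit data path the model above suggests a savings of roughly two-thirds of the toggling energy for that path.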

An investigator from the University of Manchester Institute of Science and Technology suggests focusing on data cache operations because they consume much of the energy budgeted for a small footprint design. A number of companies have come up with cache modifications to reach the same end. Motorola engineers, in some cases, have built into PowerPC designs the ability to vary the size of the cache to achieve the proper power budget.
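A toy model makes the trade-off concrete. The structure and numbers below are illustrative assumptions of mine, not Motorola's actual PowerPC mechanism: if a set-associative cache can power down some of its ways, each access probes fewer ways and per-access energy drops roughly in proportion, at some cost in hit rate.

```c
#include <assert.h>

/* Hypothetical model of a way-configurable cache, in the spirit of
 * designs that let software shrink the active cache to meet a power
 * budget. Field names and energy figures are illustrative only. */
struct cache_cfg {
    unsigned total_ways;      /* ways physically present */
    unsigned active_ways;     /* ways currently powered */
    double energy_per_way_nj; /* per-access energy per active way */
};

/* Each access probes only the active ways, so per-access energy
 * falls roughly linearly as ways are disabled. */
static double access_energy_nj(const struct cache_cfg *c)
{
    return c->active_ways * c->energy_per_way_nj;
}

static void set_active_ways(struct cache_cfg *c, unsigned ways)
{
    if (ways >= 1 && ways <= c->total_ways)
        c->active_ways = ways;   /* ignore out-of-range requests */
}
```

Shrinking an eight-way cache to two active ways in this model cuts per-access energy by three-quarters; whether that is a net win depends on how many extra misses, and hence main-memory accesses, the smaller cache incurs.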

And earlier this month, researchers at Intel revealed a fundamental shift in the company's microprocessor designs, from instruction-level parallelism to thread-level parallelism, undertaken not only for performance but for power consumption and dissipation. By breaking the pipeline down into blocks, they allow incremental increases and decreases in both performance and power consumption. This allows a processor to be sped up, or slowed down, dynamically, as the application requires.
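The control policy for such incremental throttling can be sketched simply. The thresholds and names below are my own assumptions, not Intel's actual design: step the number of active blocks up when utilization runs hot, step it down when the machine idles, and hold steady in between.

```c
#include <assert.h>

/* Hypothetical demand-driven throttling policy: choose how many
 * pipeline blocks to keep powered based on current utilization. */
enum { MIN_BLOCKS = 1, MAX_BLOCKS = 4 };

static unsigned next_active_blocks(unsigned current, unsigned utilization_pct)
{
    if (utilization_pct > 85 && current < MAX_BLOCKS)
        return current + 1;   /* demand high: spend power for speed */
    if (utilization_pct < 40 && current > MIN_BLOCKS)
        return current - 1;   /* demand low: shed power incrementally */
    return current;           /* within the band: hold steady */
}
```

Stepping one block at a time, rather than toggling the whole pipeline, is what gives the scheme its incremental character: power tracks the application's demand instead of jumping between extremes.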

If asynchronous design does not emerge successful in the war of ideas and is only applied in certain market segments, it will nonetheless be one of the more successful of the “failures.” If fear of the complexities it might impose is even a small spur driving companies to squeeze more out of existing synchronous logic, the credit belongs to the staying power of this venerable technology, which is still fighting to remain a contender in the current war of ideas. Remember that Carver Mead and Lynn Conway devoted fully as much space to it as to synchronous logic in the seminal Introduction to VLSI Systems.

All of this reminds me of a quirky movie cartoon I saw once, titled, I think, “Frankenstein's Chocolate Cake.” In it, Frankenstein, a well-known pastry chef, creates a way to bake a monstrously delicious chocolate cake. He invites his competitors, the other pastry chefs in the village near his castle, to see his creation.

Realizing the threat to their livelihoods and to the traditional techniques for making chocolate cakes, they rise up and burn the castle down.

But as the triumphant bakers march back into town, cheering and chanting that the monstrously delicious chocolate cake was destroyed, a voice far away in the basement of the burned castle mutters, as the cartoon ends: “Oh no, I'm not.”

Bernard Cole is the managing editor for embedded design and net-centric computing at EE Times and the editor of iApplianceweb. He welcomes contact. You can reach him at or 928-525-9087.
