
Slaying the Zombie of Embedded Design

We'd all like to believe we control our own destinies, but let's face it: we are surrounded by embedded systems products that control everything from our cell phones and PDAs to our TVs and refrigerators. It almost sounds like a sinister Hollywood plot, like “Night of the Living Dead” with flesh-eating cell phones and pocket PCs. But these embedded systems serve such useful functions that many of us would be lost without them. It's easy, then, to understand the importance of verifying that their embedded hardware and software will work as intended. And, as we head into 2006, doing so has become a priority for embedded design teams.

A big trend to watch in 2006 is the increasing use of hardware/software co-verification, that is, verifying the hardware and the software simultaneously. That is no small feat: past functional verification tools were too slow to run enough software to verify anything practical. Recent performance breakthroughs in hardware-based verification tools, however, have made co-verification not only practical but affordable.

One of the biggest factors determining the success of a new embedded hardware platform is the amount of software support behind it once it hits the market. The earlier software development can begin, the more likely the embedded hardware will succeed. But developing the software for an embedded design can take much longer than designing the chip itself. With time-to-market pressures at their extreme, design teams can't afford to wait for working silicon before plunging into software development. The goal for 2006, then, is for embedded hardware design teams to have prototypes ready for software development well in advance of silicon.

To accomplish true hardware/software co-verification, a major methodology shift will be necessary. Hardware and software groups have traditionally been managed separately and worked in isolation, using different methodologies. Hardware designers use one language, either Verilog or VHDL, to design at the Register Transfer Level (RTL), and a separate set of tools and languages, such as e, Vera, or the more recent Property Specification Language (PSL), to verify that the chip functions correctly.

Conversely, software designers use different languages and development tools, and often verify their code on an Instruction Set Simulator (ISS) or FPGA prototype. Unfortunately, these prototypes are not cycle accurate with respect to the hardware designers' RTL, so their behavior often diverges from it. In other words, the software developers' prototype often behaves differently than the actual working silicon. As a result, hardware and software bugs go undetected, and software written for the inaccurate prototype may not work on the actual chip after manufacture. What will be needed in 2006 are verification methodologies that provide hardware cycle accuracy during software development and, at the same time, a common debug environment for determining more quickly whether each bug lies in the software or in the hardware.
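To make the cycle-accuracy pitfall concrete, here is a minimal bare-metal C sketch. The UART register addresses, bit layout, and delay constant are invented for illustration; the point is only the contrast between timing-tuned code and status-synchronized code.

```c
#include <stdint.h>

/* Hypothetical memory-mapped UART; addresses and bit layout are
 * invented for this example, not taken from any real chip. */
#define UART_BASE    0x40001000u
#define UART_STATUS  (*(volatile uint32_t *)(UART_BASE + 0x0))
#define UART_DATA    (*(volatile uint32_t *)(UART_BASE + 0x4))
#define TX_READY     (1u << 0)

/* Fragile: the delay constant was "tuned" against a prototype whose
 * CPU-to-peripheral timing is not cycle accurate. On real silicon the
 * transmitter may need more cycles to drain, so bytes get dropped. */
void uart_putc_fragile(char c)
{
    for (volatile int i = 0; i < 100; i++)
        ;                        /* wait that "worked" on the prototype */
    UART_DATA = (uint32_t)c;
}

/* Robust: synchronize on the hardware status bit instead of on timing,
 * so behavior does not depend on the prototype's cycle accuracy. */
void uart_putc(char c)
{
    while ((UART_STATUS & TX_READY) == 0)
        ;                        /* spin until the transmitter is ready */
    UART_DATA = (uint32_t)c;
}
```

A prototype running at a fraction of silicon speed can easily mask the fragile version's bug, which is exactly the class of problem a cycle-accurate co-verification flow is meant to catch before tapeout.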

Yet another trend in 2006 relates to chip size. As chips have grown larger, it has become harder to build software prototypes from FPGAs, increasing demand for off-the-shelf, turnkey solutions. Building an FPGA prototype is a difficult undertaking because the chip design must be partitioned across multiple FPGAs, and multiplexing signals across FPGA boundaries drastically reduces the speed at which the prototype can operate. The difficulty is compounded by inefficient use of the FPGAs' fixed resources, particularly memory and arithmetic logic. Other factors include limited visibility into internal signals and the inability to extrapolate FPGA results back to the original RTL. Other annoyances include modifying ASIC code for FPGA logic synthesis and complex board-level design with high-speed crosstalk considerations. Even worse, a common problem is underestimating gate count to the point where the prototype is too small to hold the entire ASIC, making it useless for any serious software development.
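A rough back-of-the-envelope calculation shows why partition multiplexing hurts speed so badly. The numbers below are illustrative assumptions, not measurements from any real design:

```c
#include <stdio.h>

/* Estimate how inter-FPGA pin multiplexing throttles a prototype's
 * effective clock. All figures here are made up for illustration. */
int main(void)
{
    double fpga_clock_mhz = 60.0;  /* raw speed of the FPGA logic      */
    int    cut_signals    = 2000;  /* nets crossing a partition cut    */
    int    io_pins        = 250;   /* pins available between FPGAs     */

    /* Each pin must carry several time-multiplexed signals per cycle. */
    int mux_ratio = (cut_signals + io_pins - 1) / io_pins;   /* = 8 */

    /* The design clock can tick only once per complete mux transfer. */
    double effective_mhz = fpga_clock_mhz / mux_ratio;

    printf("mux ratio %d:1 -> effective clock %.1f MHz\n",
           mux_ratio, effective_mhz);
    return 0;
}
```

Even this modest partition cut throttles the prototype by nearly an order of magnitude, one reason turnkey hardware-based verification platforms look attractive by comparison.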

2006 holds great promise, but plenty of problems must still be solved before embedded system design is a seamless process, ranging from hardware/software co-verification and interface debugging to separate design methodologies and insufficient tools for the job. Consumer products based upon embedded systems may continue to gain more and more control over our everyday lives, but embedded systems designers, armed with the latest tool improvements, should be able to slay their resurgent verification methodology zombies and regain control over hardware/software verification integration.

About the Author
Dino Caporossi is the Vice President of Corporate Marketing for EVE (Emulation and Verification Engineering) in San Jose, CA.
