Debug: More return for less clicks

The tremendous growth of the semiconductor industry over the past 40 years is attributed in part to advances in the electronic design automation (EDA) industry that serves chip design companies. Although most design steps have been automated, one significant step that remains primitive is register transfer level (RTL) debugging. With verification complexity expected to increase by as much as 675 percent by 2015, according to Dr. Aart de Geus of Synopsys, the manual debugging effort will grow along with it.

Design debugging will continue to add tremendous cost and risk for an electronics design industry facing shrinking time-to-market deadlines. Recent articles and surveys report that RTL debugging consumes more than 50 percent of the verification effort, due in large measure to the increasing size and complexity of designs. Another factor is that engineers may not readily possess all the knowledge needed to debug problems quickly. For instance, some develop expertise in specific blocks, while others become familiar with a broader but less detailed view of the design. Furthermore, engineers must often work with unfamiliar code or with third-party intellectual property (IP), a challenging task indeed.

Today, verification has become the major bottleneck in design closure, a task that is not complete until the bugs are removed and the design is correct. Inevitably, the tedious debugging step has become a core part of this process.

Significant technology advances in verification over the past decade have targeted the discovery of bugs. For example, constrained-random stimulus generation and intelligent testbenches have made bug discovery more efficient, supported by verification methodologies such as the Universal Verification Methodology (UVM) and the use of SystemVerilog. A wide offering of linting, clock domain crossing (CDC), property checking and other advanced verification tools has also improved the efficiency of discovering bugs. And yet, once the existence of a bug is confirmed, the verification engineer has little automation at his or her disposal to help localize the root cause of the failure.
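To make the idea of constrained-random stimulus concrete, the short Python sketch below generates bus transactions that are random yet constrained to stay legal. It is only a conceptual illustration: the field names, address window and burst lengths are hypothetical, and real verification flows express such constraints in SystemVerilog under UVM rather than in Python.

import random

# Conceptual sketch of constrained-random stimulus generation.
# The transaction fields and constraints below are hypothetical.
def random_transaction():
    op = random.choice(["READ", "WRITE"])
    addr = random.randrange(0x0000, 0x1000, 4)    # constraint: 4 KB window, word-aligned
    burst = random.choice([1, 2, 4, 8])           # constraint: power-of-two burst length
    # constraint: only WRITE transactions carry data
    data = [random.getrandbits(32) for _ in range(burst)] if op == "WRITE" else None
    return {"op": op, "addr": addr, "burst": burst, "data": data}

random.seed(1)  # a fixed seed keeps failing runs reproducible
for _ in range(5):
    print(random_transaction())

Reproducibility matters here: a recorded seed lets a failing random test be rerun exactly, which is the starting point for any debug session, manual or automated.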

Most existing debugging methodologies are based on manually driven, primitive GUI solutions that package together waveform viewers, source code navigation tools, breakpoint insertion and basic what-if procedures. All of these require expert knowledge of the design to drive the tools manually, a practice that demands a disproportionate amount of time and effort from the user.

In essence, until recently the state of affairs in debugging resembled that of logic synthesis in the 1970s and early 1980s. In those days, design was performed by a human with a GUI editor, placing gate after gate manually until the specification was finally implemented. Even then, it was evident that automation was the only viable way to maintain the pace of productivity and control costs.

As history has shown, automation is the only exit from today's vast debugging labyrinth, freeing engineers to spend their time doing what they do best. And that is certainly not the repetitive clicking of a mouse button.

Built on more than 20 years of research, debugging solutions that automatically localize errors and present them to the engineer have become a reality. Less clicks, more automation and better use of engineering resources. It is about time!

About the author
Dr. Andreas Veneris, a leading authority on circuit debugging and verification, is president and CEO of Vennsa Technologies (Toronto, Ontario, Canada).

As a professor at the University of Toronto, he has published more than 70 papers on debugging and delivered specialized in-house tools to many semiconductor companies.

Previously, he was a visiting faculty member at the University of Illinois at Urbana-Champaign, where he also obtained his Ph.D.
