This “Product How-To” article focuses on how to use a certain product in an embedded system and is written by a company representative.
FPGA designers today face unprecedented challenges in debugging their designs. FPGAs with four million equivalent gates are commonplace, and their sizes are increasing quickly.
Creating designs this large is difficult enough, but debugging them is an even greater challenge. The opportunities for bugs to arise grow exponentially with size, since there are so many more combinations that could go wrong. At the same time, we are seeing an increased use of FPGAs in end-user products of many kinds.
Debug time is often the gating factor that determines when such products reach the market, and not just how much profit is realized but, in some cases, whether there is any profit at all.
In the past, designers debugged their FPGAs by mounting them on a board and then analyzing them with probes and logic analyzers. FPGA vendors currently offer tools that make it somewhat easier to probe signals inside the FPGA, much like a logic analyzer, but limitations and usage issues make these tools very difficult to use with very large FPGAs.
Not only do these traditional approaches take considerable manual time and effort, but they also suffer from limitations such as pin availability and capture depth dictated by the available FPGA memory.
More problematic, however, is the challenge associated with synthesis, where familiar RTL names are transformed into gates with unfamiliar names that designers must decipher in order to track down bugs.
With rapidly advancing FPGA complexity, growing time-to-market pressures and tighter engineering resource constraints, verification engineers can simply no longer tolerate the limitations of these outdated techniques. FPGA verification engineers must adopt debugging methodologies that support today's multi-million-gate FPGAs.
They require more ASIC-like debugging tools and methodologies. In addition, FPGA designers must take a proactive approach that heads off rising FPGA complexity instead of simply reacting to it. These pressing requirements have given rise to a new generation of FPGA debugging tools and methodologies, which are already having a significant impact on the FPGA design process.
FPGA Debugging in RTL
To elevate productivity, FPGA debuggers need to stop working at the gate level. Just as C programmers use C for debugging their code instead of the assembly language code it produces, FPGA designers should use RTL for debugging their FPGA designs and not the gate-level description generated by synthesis.
Verilog, SystemVerilog and VHDL are the standard, preferred languages for designing FPGAs because of the high-level features and constructs these languages provide for simplifying the design task. The same features greatly simplify FPGA debugging as well and should therefore be used for the debug process.
But RTL simulators by themselves are far too slow for debugging large FPGAs, especially those that require real-world stimulus, as in video and imaging applications. Specialized tools that complement RTL simulators as needed are already filling this gap.
One such tool is the Identify RTL Debugger, part of the Synplify Premier FPGA synthesis product portfolio from Synopsys (for the purposes of this article, the Synplify Premier portfolio will illustrate the technical characteristics of a leading-edge, integrated FPGA synthesis product portfolio).
Debugging on an operating FPGA is orders of magnitude faster than software simulation. The Identify tool allows debugging teams to annotate the signals and conditions they want to monitor directly in their RTL code and then run synthesis and place-and-route to implement the FPGA device.
Once the FPGA has been programmed, the tool then allows users to view actual signal values directly in the RTL code and debug the live FPGA in-system running at full operating speed. Using a method such as this, design teams are far more productive and the end product employing the FPGA reaches the market much sooner.
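As a simple illustration of keeping RTL names observable, a synthesis directive such as Synplify's `syn_keep` can prevent an internal net from being optimized away, so that its RTL name survives into the implemented design and remains available as a probe point. The sketch below is purely illustrative (the module and signal names are hypothetical, not taken from the Identify flow):

```verilog
// Illustrative sketch only: module and signal names are hypothetical.
// The syn_keep directive asks Synplify not to optimize the net away,
// so the RTL name "partial" remains observable in the implemented FPGA.
module debug_example (
    input  wire       clk,
    input  wire [7:0] a, b,
    output reg  [8:0] sum
);
    // Without the directive, synthesis could absorb this intermediate
    // net into surrounding logic, making it impossible to probe by name.
    wire [8:0] partial /* synthesis syn_keep = 1 */;

    assign partial = a + b;

    always @(posedge clk)
        sum <= partial;
endmodule
```

Preserved nets cost a small amount of optimization freedom, so directives like this are typically applied only to the handful of signals a debugging team actually wants to watch.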
When Full Visibility is Needed
Certain FPGA designs require even more in the way of automated tools to aid in debugging, especially those that are plagued by elusive bugs caused by intermittencies or extremely rare combinations of inputs.
In such cases the debugging environment must provide full visibility into the state of the design at the moment the error condition occurs along with a complete record of the stimuli recently applied to the design. The TotalRecall technology which may be added to the Synplify Premier FPGA synthesis tool is one example of how such an advanced debugging environment can work.
As shown in Figure 1, below, TotalRecall operates by creating a replica of a design or module using FPGA resources. This replica is automatically created by Synplify Premier, along with the design itself, during synthesis.
In addition to the replica, a buffer is created to store many cycles worth of the stimuli applied to the design. The original design is instrumented to detect various error conditions and to send trigger signals to the replica when such errors occur.
|Figure 1: The key concept underlying TotalRecall Technology is to replicate logic and insert a large buffer inside the device.|
The FPGA is tested by running both the original design and the replica at full speed, while buffering the inputs in a first-in, first-out (FIFO) queue. The moment an error occurs, the design triggers the replica and stimulus buffer to freeze, while the original design continues to run at full speed.
The replica now contains the exact state of the design at a point in time before the trigger condition occurred in the original design. The user can specify the depth of the buffer, which determines how many clock cycles of history before the error are captured.
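Conceptually, the stimulus capture works like a circular buffer that records the applied inputs every cycle until the trigger fires. The following Verilog sketch is a simplified model for illustration only, under assumed names and parameters; it is not the TotalRecall implementation:

```verilog
// Conceptual sketch only -- not the vendor implementation.
// A shadow buffer records the last DEPTH cycles of stimulus; when the
// instrumented design asserts `trigger`, the capture freezes, preserving
// the input history that led up to the error.
module stimulus_buffer #(
    parameter WIDTH = 32,   // width of the recorded input bus
    parameter DEPTH = 1024  // cycles of history to keep
) (
    input  wire             clk,
    input  wire             trigger,  // error condition detected
    input  wire [WIDTH-1:0] din       // stimulus applied to the design
);
    reg [WIDTH-1:0]         mem [0:DEPTH-1];
    reg [$clog2(DEPTH)-1:0] wr_ptr = 0;
    reg                     frozen = 1'b0;

    always @(posedge clk) begin
        if (trigger)
            frozen <= 1'b1;          // freeze: keep the history intact
        else if (!frozen) begin
            mem[wr_ptr] <= din;      // circular capture of stimulus
            wr_ptr     <= wr_ptr + 1'b1;
        end
    end
endmodule
```

Once frozen, the buffer contents and the replica's state are exactly what a simulator needs to replay the failure.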
This design state of the replica is then automatically sent to a simulator such as VCS, along with the stimulus buffer containing the set of inputs that led up to the error. Users then run the testbench through the copy of the design, utilizing the full analysis and debug features of the simulator.
When the error recurs, they use all the simulator's powerful features to isolate the problem and fix the design. Other features of this technology include block-at-a-time debugging to conserve FPGA space, and access to a high-performance ASIC prototyping system, part of the Synplify Premier portfolio, when even more memory is needed for stimulus buffering.
This comprehensive approach cuts greatly into the time it takes to pinpoint the cause of the particularly vexing bugs caused by intermittencies. It takes advantage of FPGAs' speed for most of the debugging effort and transfers the task to a simulator when full, detailed analysis is required. In essence, TotalRecall technology can be thought of as a “fast-forward button” for simulators.
Fast Design Spin Turnaround
Debugging an FPGA is an iterative process requiring multiple synthesis, placement and routing runs. Overall behavior is usually debugged first, followed by detailed timing and performance analysis.
During the early iterations, users care less about quality of results and are more concerned with turnaround time in order to wring out their RTL code. They may have many “What if?” scenarios to quickly explore before settling on a particular approach to refine. But later, in the final stages of debugging, attention to detailed design optimization is imperative.
Advanced FPGA synthesis environments should have provisions to accommodate both stages. For early-stage debugging, Synplify Premier, for instance, provides a “fast synthesis” mode in which some optimization steps are turned off, speeding up synthesis runtimes by up to a factor of two.
When performance becomes the most important criterion in the later stages, the user turns off fast synthesis to take advantage of every optimization feature of the tool. The end result is top-quality designs that move more quickly to market.
Switching Activity Reporting and Analysis
Minimizing dynamic power consumption is vitally important to many of today's FPGA designs, and a great deal of the debugging effort can be spent trying to keep it to a minimum.
Capable power analysis tools are available from vendors such as Altera and Xilinx to identify and help eliminate hotspots caused by excessive switching. To make use of these tools, however, good-quality switching activity information is required; the de facto standard for communicating switching activity between tools is the Switching Activity Interchange Format (SAIF).
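For reference, a backward-annotation SAIF file is a simple parenthesized text format that records, per net, how long the signal spent at each logic level and how often it toggled. A minimal fragment might look like the following (all values and names here are illustrative; T0/T1 give time spent at logic 0/1 in the stated timescale, and TC gives the toggle count):

```
(SAIFILE
  (SAIFVERSION "2.0")
  (DIRECTION "backward")
  (TIMESCALE 1 ns)
  (DURATION 100000)
  (INSTANCE top
    (NET
      (clk  (T0 50000) (T1 50000) (TC 200000))
      (data (T0 80000) (T1 20000) (TC 1500))
    )
  )
)
```

Power analysis tools read these per-net statistics to estimate dynamic power and locate the most active regions of the design.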
Switching activity can be derived from simulation, but this requires a full testbench, which can be time-consuming to create and may not be available early in the design cycle.
An ideal approach to this dilemma is to obtain good-quality switching activity earlier in the process, without the need for simulation. The Activity Analysis feature in Synplify Premier illustrates a strategy that simplifies and shortens the SAIF file generation process (Figure 2, below), while eliminating the need to create a full testbench.
Activity Analysis operates by sending the post-synthesis netlist and analysis design constraints back into the synthesis environment for creation of the SAIF file, which contains logic state and switching transition data for every net in the design. The user then runs place and route, sends the results into a power analysis tool, and then modifies the design to eliminate hotspots that the tool identifies.
|Figure 2: Activity Analysis Design Flow|
Integration Throughout Simulation and Synthesis
Time-stressed debugging teams can ill afford to spend excessive time setting up simulation and FPGA synthesis runs or manually invoking tool features. They require tight integration among all the applications in the EDA suite they use; a loosely organized, mix-and-match tool approach is a fundamentally flawed strategy.
Not only must designers access multiple leading-edge technologies and tools, but the challenges of debugging today's leading-edge FPGAs necessitate that these tools and product features be tightly integrated within a single comprehensive solution. Such integration can provide added benefits, as in the case of Synplify Premier, where the requirement for RTL simulation can be met by automatically invoking a VCS run.
Debugging FPGAs will only grow more arduous in the coming years, and their prevalence as integral components of products will keep increasing. Million-LUT FPGAs are on the horizon, and consumer demand for packing them with more and more functionality will rise. These trends will render already antiquated gate-level debugging techniques totally obsolete. Only with more advanced debugging tools will we be able to meet next-generation time-to-market demands.
As FPGAs grow more capable, they will increasingly replace ASIC devices for certain applications where bleeding-edge performance or extremely large volumes are not required.
FPGAs have traditionally been, and continue to be, strong in communications and military/aerospace markets, but they are moving quickly into high-end consumer, broadcasting, medical, industrial, security/surveillance and automotive segments.
One of the biggest benefits of using FPGAs in electronics products is that they decrease time to market. FPGA design and debug tools must work to preserve this time-to-market benefit, as FPGA complexity grows to that of most ASIC devices of only a few years ago. Fast synthesis, physical synthesis, power analysis, team design and advanced RT-Level debug tools are already requirements for today's most advanced FPGAs.
Jeff Garrison is Director of Marketing, FPGA Implementation at Synopsys, where his responsibilities include product strategy, definition, and launch for Synopsys' FPGA products, including Synplify, Synplify Pro, Synplify Premier, Identify and HDL Analyst. Mr. Garrison holds a bachelor's degree in computer science from Indiana University.