
Fall ESC08 Boston Preview: Turning hardware and software design upside down

Jim Weyand is a hardware designer and Jason Andrews is a software designer, but they have two things in common. First, both of them are conducting classes at the Fall Embedded Systems Conference in Boston, Mass., this month. Second, the topics they have chosen turn traditional hardware and software design methodologies upside down.

In “Coverage Driven Verification (CDV) for embedded software (ESC-463),” Andrews describes how some of the basic principles of hardware verification for automation, throughput, and scalability can be applied to software verification. He uses a Linux device driver from a USB subsystem, running a Linux kernel in a virtual machine hypervisor, to show how device driver verification can be done using CDV.

In “A Methodology for Successful VHDL-based FPGA Design (ESC-462),” Weyand describes how to take methodologies commonly used in software design for structured code development, documentation, and organization and apply them to VHDL-based FPGA design.

Synching up hardware and software verification
In his course on coverage driven verification for software, Andrews, an architect working in the areas of embedded software and hardware verification for SoC designs at Cadence Design Systems, focuses most of his attention on Linux device drivers, although such techniques can be useful in many software environments, such as GUI testing and library/API verification.

In Linux systems, he said, device drivers are normally tested by running them on the real hardware and observing the results. Some techniques exist to help with driver debugging, but there are none that help much with dynamic verification. Most drivers are tested using some kind of system “stress test” that attempts to strain the system and the drivers to make sure they are stable under heavy load conditions.
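For readers unfamiliar with the practice, a minimal sketch of such a user-space stress test might look like the following. The device node /dev/mydev, the thread count, and the buffer size are hypothetical placeholders, not material from Andrews' class:

```c
/*
 * Minimal sketch of a driver "stress test" of the kind described above:
 * hammer a device node with concurrent reads and writes and check only
 * that the system stays up. /dev/mydev is a hypothetical device.
 * Build with: cc -pthread stress.c
 */
#include <fcntl.h>
#include <pthread.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

#define NUM_THREADS 8
#define ITERATIONS  10000

static void *hammer(void *arg)
{
    char buf[512];
    int fd = open("/dev/mydev", O_RDWR);
    if (fd < 0)
        return NULL;

    for (int i = 0; i < ITERATIONS; i++) {
        memset(buf, i & 0xff, sizeof(buf));
        /* Errors are ignored: a stress test only asks "did it crash?" */
        (void)write(fd, buf, sizeof(buf));
        (void)read(fd, buf, sizeof(buf));
    }
    close(fd);
    return NULL;
}

int main(void)
{
    pthread_t threads[NUM_THREADS];

    for (int t = 0; t < NUM_THREADS; t++)
        pthread_create(&threads[t], NULL, hammer, NULL);
    for (int t = 0; t < NUM_THREADS; t++)
        pthread_join(threads[t], NULL);

    puts("stress test completed without hanging the system");
    return 0;
}
```

A test like this can confirm stability under load, but it says nothing about which driver behaviors were actually exercised, which is the gap Andrews addresses next.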

“Although all of these and other tools and techniques are useful for testing drivers, there are no tools for verification planning, stimulus generation, functional coverage collection, and automated and coordinated stimulus for both the driver and the hardware it is controlling,” said Andrews. This, he said, is an area where the basic principles used in hardware verification, as well as more advanced techniques for automation, throughput, and scalability, can be usefully applied.
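As an illustration only, and not Andrews' actual tooling, the coverage-driven loop he alludes to could be sketched in C roughly as follows: random stimulus is generated against a small verification plan, functional coverage is collected per bin, and the test runs until every bin has been hit. The operations and size classes are hypothetical:

```c
/*
 * Hedged sketch of a coverage-driven test for a driver interface:
 * constrained-random stimulus plus functional coverage bins, looping
 * until the coverage model is closed.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

enum op   { OP_READ, OP_WRITE, OP_RESET, NUM_OPS };
enum size { SZ_SMALL, SZ_PAGE, SZ_LARGE, NUM_SIZES };

/* Functional coverage model: one bin per (operation, size-class) pair. */
static unsigned coverage[NUM_OPS][NUM_SIZES];

static int coverage_complete(void)
{
    for (int o = 0; o < NUM_OPS; o++)
        for (int s = 0; s < NUM_SIZES; s++)
            if (coverage[o][s] == 0)
                return 0;
    return 1;
}

/* Stand-in for exercising the driver under test. In a real environment
 * this would issue the actual read/write/ioctl calls and could also
 * coordinate stimulus on the hardware (or virtual platform) side, as
 * the article describes. */
static void drive_stimulus(enum op o, enum size s)
{
    printf("stimulus: op=%d size-class=%d\n", o, s);
}

int main(void)
{
    srand((unsigned)time(NULL));

    unsigned runs = 0;
    while (!coverage_complete()) {
        enum op   o = (enum op)(rand() % NUM_OPS);
        enum size s = (enum size)(rand() % NUM_SIZES);

        drive_stimulus(o, s);
        coverage[o][s]++;      /* collect functional coverage */
        runs++;
    }

    printf("coverage closed after %u random runs\n", runs);
    return 0;
}
```

The point of the structure, rather than of this toy code, is that the test stops when the plan is satisfied instead of when an arbitrary load target is reached.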

“In the past, many hardware verification users were primarily companies selling chips,” he said. “Today, these same companies cannot compete in the market by providing a data sheet and device samples, but instead must develop chips, create evaluation platforms by putting chips on boards, and then provide all of the software to demonstrate a working system to prospective customers.”

This has led to increased investment in software by chip design companies and, as a result, over the last few years Coverage Driven Verification has been evolving to fill this verification gap by enabling users to better utilize software as part of the hardware verification process in the familiar verification environments of Verilog, VHDL, and SystemC.

“In the future, CDV will continue to focus on the chip customers who are developing software using many execution engines for simulation and emulation,” he said. “There are also new opportunities for the application of Coverage Driven Verification technology in embedded software.”

Structuring your FPGA design process for success
In his class, Weyand, a principal systems engineer at Embedded Systems Design, Inc., describes how to use a well-defined and documented VHDL-based FPGA design process that breaks the hardware development process into seven phases: requirements definition, preliminary design, detailed design, module-level VHDL coding/simulation, top-level VHDL coding/simulation, synthesis, and implementation.

“In many hardware design organizations, the process for VHDL-based FPGA design is poorly defined and often undocumented,” he said. “Individual designers tend to design FPGAs using their own methods, which are often inconsistent with those used by other designers within the same organization.”

More often than not, FPGA designs are produced without the reviews necessary to ensure critical risks are mitigated and without the documentation needed for integration and test. Managers and technical leads, he said, also suffer because they have little visibility into design progress and are unable to apply design resources efficiently.

“The negative effects of a poorly defined and undocumented VHDL-based FPGA design process are significant,” said Weyand. “At the start of an FPGA design project, the design process cannot be efficiently tailored to eliminate unnecessary steps and emphasize others. This often results in time wasted on unnecessary steps and increased risk because more important steps aren't given enough attention.”

After the FPGA design effort starts, serious design issues are often discovered late in the design cycle, when design changes have a substantial cost and schedule impact. FPGA designs may not fit in the targeted package, may not meet the required timing, or may consume too much power.

The resolution of these issues, said Weyand, usually requires the repetition of FPGA design process steps and sometimes even modifications to board designs or system requirements. “When the FPGA design is completed, the resulting documentation available for integration and test is often limited and inconsistent,” he said. “In many cases the only useful FPGA documentation is the comments provided within the VHDL code. Other designers using the design are often forced to 'reverse-engineer' the FPGA functionality from the VHDL code.”

In addition to the negative impacts on the current FPGA design, Weyand said the lack of a structured design process prevents the efficient reuse of common VHDL modules by other FPGA designers. Common, reusable modules are not identified and documented properly, and so cannot be ported to other designs. Also, the lack of well-defined process phases leaves engineers and managers with no useful basis of estimation for future FPGA design efforts.

He said there are several benefits to the well-defined and documented VHDL-based FPGA design process described in his class. “First, the process may be tailored to emphasize some phases and deemphasize others, resulting in increased efficiency and reduced risk,” he said. “Second, it is well structured and includes design reviews that force FPGA design issues to be discovered and addressed early in the design cycle, when the impact of design changes is minimized. Third, it provides complete, accurate design documentation for FPGA integration and test.”

This VHDL-based FPGA design process also benefits future FPGA design efforts within an organization. “Common, reusable VHDL modules are identified and documented so they may be easily ported to other FPGA designs,” said Weyand. “Also, the well-defined and clearly delineated process phases allow current design efforts to provide a useful basis of estimation for future FPGA design efforts. Finally, less experienced FPGA designers are given a proven and well-documented roadmap to follow.”

To sign up to attend the Embedded Systems Conference in Boston, go to the ESC Registration Page.
