HW/SW co-verification basics: Part 1 - Determining what & how to verify

Jason Andrews

May 23, 2011

Human Interaction
Embedded system design is more than a robotic process of executing steps in an algorithm to define requirements, implement hardware, implement software, and verify that it works. There are numerous human aspects to a project that play an important role in its success or failure.

The first place to look is the organizational structure of the project teams. There are two commonly used structures. Figure 6.2 below shows a structure with separate hardware and software teams, whereas Figure 6.3 below shows a structure with a single group of combined hardware and software engineers sharing a common management team.

Figure 6.2: Management Structure with Separate Engineering Teams
Separate project teams make sense in markets where time-to-market is less critical. Staggering the teams so that the software team is always one project behind the hardware team increases efficiency: the software team always has working hardware available before it starts a software integration phase.

Once the hardware is passed to the software engineers, the hardware engineers can go on to the next project. This structure avoids having the software engineers sitting around waiting for hardware.

A combined project team is most efficient for addressing time-to-market constraints. The best situation to work under is a common management structure that is responsible for the success of the whole project, not just one area such as hardware or software. Companies that run most efficiently have removed structural barriers and work together to get the project done. In the end, the success of the project is based on the entire product working well, not just the hardware or the software.

Figure 6.3: Management Structure with Combined Engineering Teams
I once worked in a company that totally separated hardware and software engineers. There was no shared management. When the prototypes were delivered and brought up in the lab, the manager of each group would pace back and forth trying to determine what worked and what was broken.

What usually ended up happening was that the hardware engineer would tell his manager that there was something wrong with the software just to get the manager to go away. Most engineers prefer to be left alone during these critical project phases.

There is nothing worse than a status meeting to report that your design is not working when you could be fixing the problems instead of explaining them. I do not know what the software team was communicating to its management, but I imagine it involved something about the hardware not working or the inability to get time on the hardware. At the end of the day, the two managers probably went to the CEO to report that the other group was still working to fix its bugs.

Everybody has a role to play on the project team. Understanding each person's role, skills, and personality makes for a successful project and an enjoyable work environment. Engineers like challenging technical work.

I have no data to confirm it, but I think more engineers seek new employment because of difficulties with the people they work with or the morale of the group than because they are seeking new technical challenges.

A survey of embedded systems projects found that more than 50% of designs are not completed on time. Late designs typically run 3 to 4 months behind schedule, project cancellations average 11-12%, and the average time to cancellation is 4.5 months (Jerry Krasner, Electronics Market Forecasters, June 2001).

Hardware/software co-verification aims to verify that embedded system software executes correctly on a representation of the hardware design. It performs early integration of software with hardware, before any chips or boards are available.
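To make this concrete, consider the kind of hardware-dependent software a co-verification run typically executes against the simulated hardware: low-level code that reads and writes memory-mapped registers. The following C sketch polls a transmit-ready bit and writes characters to a UART-style peripheral. The base address, register offsets, and status bit here are illustrative assumptions for the sake of the example, not any real device's memory map.

/* Minimal bare-metal sketch of the hardware-dependent software a
 * co-verification run executes against the hardware design before
 * silicon exists. The peripheral, its base address, and its register
 * layout are assumptions made for illustration. */

#include <stdint.h>

#define UART_BASE   0x1000A000u  /* assumed address from the SoC memory map */
#define UART_STATUS (*(volatile uint32_t *)(UART_BASE + 0x0))
#define UART_TXDATA (*(volatile uint32_t *)(UART_BASE + 0x4))
#define TX_READY    (1u << 0)    /* assumed transmitter-ready status bit */

static void uart_putc(char c)
{
    /* Poll the status register until the transmitter is ready.
     * In co-verification, each of these reads becomes a bus
     * transaction driven into the simulated hardware design. */
    while ((UART_STATUS & TX_READY) == 0)
        ;
    UART_TXDATA = (uint32_t)c;
}

int main(void)
{
    const char *msg = "hello from co-verification\n";
    for (const char *p = msg; *p != '\0'; ++p)
        uart_putc(*p);
    for (;;)  /* bare-metal program: never return */
        ;
}

When software like this runs on an instruction set simulator or processor model coupled to a logic simulator, each register read and write becomes a bus transaction into the hardware design, so bugs in the bus interface, address decoding, or peripheral logic surface long before any chips or boards are available.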

The primary focus here is on system-on-a-chip (SoC) verification techniques. Although all embedded systems with custom hardware can benefit from co-verification, the area of SoC verification is most important because it involves the most risk and is positioned to reap the most benefit. The ARM architecture is the most common microprocessor used in SoC design and serves as a reference to teach many of the concepts discussed here.

The basics of co-verification
Although hardware/software co-verification has been around for many years, over the last few years it has taken on increased importance and has become a verification technique used by more and more engineers. The trend toward greater system integration, driven by the demand for low-cost, high-volume consumer products, has led to the development of the system-on-a-chip (SoC).

The SoC is defined as a single chip that includes one or more microprocessors, application-specific custom logic functions, and embedded system software. Including microprocessors and DSPs inside a chip has forced engineers to consider software as part of the chip's verification process in order to ensure correct operation.

The techniques and methodologies of hardware/software co-verification allow projects to be completed in less time and with greater confidence in the hardware and software. In surveys such as those published by EE Times, a good number of engineers have reported spending more than one-third of their day on software tasks, especially integrating software with new hardware.

This statistic reveals that the days of throwing the hardware over the cubicle wall to the software engineers are gone. In the future, hardware engineers will continue to spend more and more time on software-related issues. This chapter presents an introduction to commonly used co-verification techniques.

Some co-verification history
Co-verification addresses one of the most critical steps in the embedded system design process: the integration of hardware and software. The alternative to co-verification has always been to simply build the hardware and software independently, try them out in the lab, and see what happens. When the PCI bus began supporting automatic configuration of peripherals without the need for hardware jumpers, the term plug-and-play became popular.

At about the same time, I was working on projects that simply built hardware and software independently and resolved the differences in the lab. This technique became known as plug-and-debug. It is an expensive and very time-consuming effort.

For hardware designs that put off-the-shelf components on a board, it may be possible to do some rework on the board or change some programmable logic if problems with the interaction of hardware and software are found. Of course, there is always the "software workaround" to avoid an aggravating hardware problem, for example, having the driver disable a broken hardware feature and fall back to a slower but functional path.

As integration continued to increase, something more was needed to perform integration earlier in the design process. The solution is co-verification. Co-verification has its roots in logic simulation.

The HDL logic simulator has been used since the early 1990s as the standard way to execute the representation of the hardware before any chips or boards are fabricated. As design sizes have increased and logic simulation has not provided the necessary performance, other methods have evolved that involve some form of hardware to execute the hardware design description. Examples of hardware-based methods include simulation acceleration, emulation, and prototyping. Here we will examine each of these basic execution engines as a method for co-verification.
