Commentary: EDA is dead, software lives

EDA is dead. Yes, it's still needed: we're not going to go back to designing chips with rubylith. But the reality is that fewer and fewer chips are being designed, and most of the differentiation of a system is moving into the software. Even Gary Smith, the longtime Gartner analyst for EDA, proclaimed at the 2006 DAC that “It's the software, stupid.” Gartner then laid off Gary's EDA group, another symptom of the shift.

Historically, semiconductor economics were the single driver behind EDA, especially during the 1980s and 1990s. A company that avoided moving to the next process node faced being trampled by competitors offering parts at half the cost. This is no longer the case. The only reason to consider a 45nm design is that you need something you can't get any other way, primarily transistor density, since even speed and power are no longer clearly moving in the right direction with each process node.

Most design teams operating at 0.13 microns are never going to do a 45nm design and don't require the EDA tools. It's too difficult, too expensive and, besides, 0.13 microns is just fine. The one exception is if your production volumes are extremely high. Then, and only then, do semiconductor economics work in your favor. But most markets just don't need that many chips.

Additionally, increasing performance at reasonable power is a problem the hardware side has not solved, so the challenge has been passed to software engineers in the form of multicore chips. Getting software to run on such chips automatically, through automatic parallelization, has been a research area for 40 years, so a general solution is unlikely to be imminent.

We are moving toward the EDA nightmare: a small number of chip designs manufactured in enormous volume and customized by software. Already, many chip companies have several times as many engineers working on the software as on the chip itself. Since most of the differentiation of a system is moving into the software, the more significant barrier to production is creating software that is neither late nor full of bugs.

The embedded industry needs to invest more in innovative software development methods, just as semiconductor productivity was dramatically improved by new design technology. Today, a number of tools and techniques are already available that can dramatically increase productivity and quality, whether the software runs on a chip, a board or an entire rack of boards.

Virtualized software development, in which the software is divorced from the hardware, is one approach that offers enormous benefits. Because development can start before chips or boards are available, problems are found early, when they are cheapest to fix. Entire multiprocessor systems can be stopped and restarted, and code can be run backward, eliminating the concept of a six-week bug.
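
To make "running code backward" concrete, here is a minimal sketch of the approach commonly used in deterministic simulators: take periodic snapshots of the target state, then reverse by restoring the nearest earlier snapshot and deterministically replaying forward. Everything in it, including the TinySim class, is an illustrative assumption, not Virtutech's product or API.

```python
# Minimal sketch of reverse execution built on a deterministic simulator:
# take periodic snapshots, then go "backward" by restoring the nearest
# earlier snapshot and deterministically re-executing forward.
# All names here (TinySim and friends) are illustrative, not a real product API.


class TinySim:
    """A toy deterministic 'machine': four registers and a cycle counter."""

    CHECKPOINT_INTERVAL = 100

    def __init__(self):
        self.cycle = 0
        self.regs = [0, 0, 0, 0]
        self._snapshots = {0: self._capture()}

    def _capture(self):
        return (self.cycle, list(self.regs))

    def _restore(self, snap):
        self.cycle, self.regs = snap[0], list(snap[1])

    def step(self):
        # A deterministic update: the same state in always yields the same state out.
        self.regs[self.cycle % 4] += self.cycle
        self.cycle += 1
        if self.cycle % self.CHECKPOINT_INTERVAL == 0:   # periodic checkpoint
            self._snapshots[self.cycle] = self._capture()

    def run_to(self, target_cycle):
        while self.cycle < target_cycle:
            self.step()

    def reverse_to(self, target_cycle):
        # "Backward" = restore the nearest earlier snapshot, then replay forward.
        base = max(c for c in self._snapshots if c <= target_cycle)
        self._restore(self._snapshots[base])
        self.run_to(target_cycle)


if __name__ == "__main__":
    sim = TinySim()
    sim.run_to(1000)
    sim.reverse_to(250)          # step the whole system "backward" to cycle 250
    print(sim.cycle, sim.regs)   # identical to the state seen at cycle 250 on the way up
```

Because every step is deterministic, the replayed state at cycle 250 is bit-for-bit the state the first run passed through, which is what turns an elusive bug into a reproducible one.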

Testing can be more automated, especially for the most complex systems, which otherwise require technicians to configure hardware for each test run. Multicore systems can be stressed by varying the clock rates of different cores, or varying the number of cores, in a way that is impossible with real chips.
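
As a rough illustration of what such a stress sweep can look like, the sketch below treats core count and per-core clock ratios as ordinary test parameters; the harness and function names are hypothetical placeholders, not any particular vendor's interface.

```python
# Illustrative sketch only: in a simulated platform, core count and per-core
# clock ratios are just configuration parameters, so a test harness can sweep
# combinations that no single physical board could provide.

import itertools


def boot_and_run_tests(num_cores, clock_ratios):
    """Hypothetical placeholder: configure the virtual platform, boot the
    target software, run the test suite, and report the result."""
    # Ratios are applied cyclically across cores in this toy model.
    per_core = [clock_ratios[i % len(clock_ratios)] for i in range(num_cores)]
    return {"cores": num_cores, "clock_ratios": per_core, "passed": True}


def stress_sweep():
    core_counts = [1, 2, 4, 8, 16]
    # Skewed clocks tend to expose races that lockstep clocks hide.
    ratio_sets = [(1.0,), (1.0, 0.5), (1.0, 0.9, 1.1), (2.0, 1.0, 0.25)]
    return [boot_and_run_tests(cores, ratios)
            for cores, ratios in itertools.product(core_counts, ratio_sets)]


if __name__ == "__main__":
    for result in stress_sweep():
        print(result)
```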

Companies must focus on improving software development in order to meet deadlines and ensure the quality of the end product. With the added challenge of parallel processing, developers would benefit from a virtual environment where everything is deterministic, everything can be seen, everything can be controlled and unusual stresses can be imposed. That's why there's virtualized software development.

Paul McLellan is vice president of marketing at Virtutech, Inc. and has spent over 20 years working in the area of tools for system and integrated circuit design.
