
What’s important?

“…[E]very human benefit and enjoyment, every virtue and every prudent act…is founded on compromise and barter,” said 18th-century political philosopher Edmund Burke. He could have been describing embedded systems design.

Embedded engineering is all about tradeoffs. You usually can't have as much memory as you want, or the fastest chip you want, or exactly the tools you want. You never get the schedule or the budget you want. A developer's expertise lies, in part, in making the right tradeoffs. Your boss(es) trust you to balance hardware, software, time, money, space, power, caffeine intake, and who knows what else. You're expected to know what's important.

Lately I've been thinking that hardware isn't important. Now before you round up the lynch mob, hear me out. I'm not saying that processors, memory, printed-circuit boards, power supplies, and all the stuff we design and build aren't important. I'm saying that silicon isn't as important as our own sweat and toil. I think a lot of companies are optimizing the wrong resource. Our time is precious; silicon isn't. We should be throwing more hardware at the problem.

Take Jack Ganssle's article from the May issue (“Subtract Software Costs by Adding CPUs,” ESP, May 2005, p. 16). He advocates subdividing large tasks into small tasks and giving each one its own microprocessor. Crazy talk, you say? Sure, a few years ago that would have smacked of heresy. But seriously, it wasn't so long ago that the whole concept of embedded systems with multiple 32-bit chips, megabytes of RAM, and active network interfaces would have seemed crazy too.
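If one CPU per task sounds abstract, picture how small the firmware on each of those dedicated micros becomes. Here's a minimal sketch in C of the idea, assuming a hypothetical small part whose only job is to debounce a button and report presses to the main processor; the hardware hooks (read_button, uart_putc) are stand-ins, not any vendor's real API.

```c
/*
 * Sketch of the "one CPU per task" idea: the entire firmware of a
 * dedicated button-scanner micro. Hardware hooks are hypothetical
 * stand-ins -- replace with real register access on a real part.
 */
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical hardware hooks. */
static bool read_button(void)        { return false; }   /* sample the input pin   */
static void uart_putc(uint8_t byte)  { (void)byte;   }   /* send a byte to the host */

#define DEBOUNCE_COUNT 8u   /* consecutive identical samples required */

int main(void)
{
    uint8_t stable = 0;
    bool reported = false;

    for (;;) {                        /* the whole job is this one loop */
        if (read_button()) {
            if (stable < DEBOUNCE_COUNT) {
                stable++;             /* still settling */
            } else if (!reported) {
                uart_putc('P');       /* tell the main CPU: button pressed */
                reported = true;
            }
        } else {
            stable = 0;               /* released: re-arm */
            reported = false;
        }
    }
}
```

Each micro's entire job fits in one loop, which is exactly the point: the scheduling and integration headaches of folding that task into one big program mostly disappear.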

The whole RISC-versus-CISC debate from the 1990s took us down the wrong path. RISC dogma advocated simplifying processors at the expense of software. Use less hardware, write more code. But that's exactly the wrong approach. When Moore's Law throws transistors at us faster than we can use them, conserving silicon is precisely the wrong thing to do. Multi-million transistor chips are cheap; the talent to develop multi-megabyte programs isn't. When it comes down to using more hardware or agonizing over how to do without it, I'll throw hardware at the problem every time.

This month you'll also find an article on “assertive debugging,” the notion of having your code monitor itself. Sure, it requires more code—much of which might never be executed—but it pays off in less debugging time, says the author. Sounds like a good tradeoff to me. Throw code at the problem to save your own hours. At some level, the whole purpose of processors and programs was to lift the burden of repetitive tasks from us frail humans. I say we do a little more of that.
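The article spells out its own technique; as a rough illustration only, here is the general shape of self-monitoring code in C: an assertion macro that stays in the build and reports where it fired, plus a routine that checks its own invariants on every call. The SELF_CHECK macro and the ring buffer are hypothetical examples, not taken from the article.

```c
/* A rough sketch of code that monitors itself, not the article's exact method. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Assertion that stays in the shipping build and names the failure site. */
#define SELF_CHECK(cond)                                                 \
    do {                                                                 \
        if (!(cond)) {                                                   \
            fprintf(stderr, "self-check failed: %s (%s:%d)\n",           \
                    #cond, __FILE__, __LINE__);                          \
            abort();   /* in a real system: log, reset, or go to a safe state */ \
        }                                                                \
    } while (0)

/* Hypothetical ring buffer whose invariants the code keeps checking. */
typedef struct {
    uint8_t  data[64];
    uint32_t head;
    uint32_t tail;
    uint32_t count;
} ring_t;

static void ring_put(ring_t *r, uint8_t byte)
{
    SELF_CHECK(r != NULL);
    SELF_CHECK(r->count < sizeof r->data);      /* precondition: not full */

    r->data[r->head] = byte;
    r->head = (r->head + 1u) % sizeof r->data;
    r->count++;

    SELF_CHECK(r->count <= sizeof r->data);     /* postcondition: still sane */
}

int main(void)
{
    ring_t r = { .head = 0, .tail = 0, .count = 0 };
    ring_put(&r, 0x42u);
    printf("count after put: %u\n", (unsigned)r.count);
    return 0;
}
```

The extra checks cost a few instructions per call; the payoff is that a corrupted buffer announces itself at the point of corruption instead of three subsystems later.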

Reader Response


So, you are trying to bring back all the least frequently used instructions of CISC processors? No, the orthogonality of RISC is meant to simplify execution and save die area for performance reasons, not necessarily to conserve silicon; it takes twice the memory (the most-used silicon resource) to get results! Perhaps the simplicity of RISC instructions could make them more useful and understandable.

Heck with compromise; let's cut through risk-versus-reward analysis and actually innovate in ways never thought of.

– Geoff Wattles
Software Engr
Haas Automation
