Do we need hardware standards?

Hardware standards enable programmers to work with a known quantity instead of a moving target. But freezing any system stifles creativity and slows advancements in technology. Standardizing on hardware is a mixed blessing, and the benefits depend on whom you ask.

Remember the old joke: the great thing about standards is there are so many to choose from. Some standards are indispensable (resistor color codes come to mind) while others seem like thinly veiled marketing ploys. Standards can either foster or stifle creativity, depending on which side of the creative equation you're on.

PCs have become a hardware standard. The Macintosh, PlayStation 2, and Palm Tungsten T are all standardized systems as well. Their hardware and software can be developed separately, by separate teams, with some assurance that the combination will work.

In contrast, embedded systems usually have their hardware and software developed simultaneously. It's a more symbiotic relationship: the programmers need early hardware to develop code, and the engineers need early code to bring up the hardware.

In an episode that highlights the difference between embedded systems and computer systems, a colleague working for a microprocessor company recounted how his firm encouraged Sony to use its chip in the upcoming PlayStation 2. The salesman proudly told Sony's executives that future microprocessors would be much faster than today's. The Sony representatives were horrified. The last thing they wanted was for PlayStations to get faster over time. Their entire business model depends on the hardware remaining exactly the same over years of manufacturing. The thought of upgrades or “speed bumps” was antithetical to the system's success. The processor company didn't get the business.

What if embedded systems hardware were standardized? Would standardized hardware compromise efficiency or increase productivity? It's been said that most of the perceived value in embedded systems is in the software, not the hardware. So, although standardizing hardware may make programmers more productive, it should make little difference to end users. Granted, this belief is held mostly by those who don't make hardware.

Standardized hardware makes software development quicker and easier and the applications themselves more reliable. It's the only thing that enables software portability. Any system that's going to have a decent third-party software market needs standardized hardware. Holding the hardware steady for a few years might also give programmers time to extract the last bit of performance from the system, rather than always chasing new hardware releases while scrambling to meet ship dates. Static hardware removes one of two dynamic variables from the equation.

That's all well and good—if you're a programmer. I suspect many hardware engineers would make the opposite argument: that operating systems and applications should hold still for a few years and let the hardware evolve under them. In this, the chip geeks have history on their side. Everyone's heard of Moore's Law, the observation that chips double in complexity every 18 months or so. It would seem ludicrous to put the brakes on that kind of advancement. As if we even could.
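To put the pace of that advancement in perspective, the doubling rate implies growth that's easy to underestimate. A small sketch (the 18-month doubling period is the figure cited above; the function name is my own):

```python
def moores_law_factor(years, doubling_period_years=1.5):
    """Growth factor after `years`, assuming complexity doubles
    every `doubling_period_years` (18 months, per Moore's Law)."""
    return 2 ** (years / doubling_period_years)

# After 3 years: 2^(3/1.5) = 4x the complexity.
print(moores_law_factor(3))    # 4.0
# After 15 years: 2^10, roughly a thousandfold increase.
print(moores_law_factor(15))   # 1024.0
```

Freezing hardware for even a product generation means forgoing that kind of compounding, which is why the chip designers balk.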

How valuable would standard hardware systems be to you? And who should set these standards? Industry groups or trade organizations could appoint committees to set their own norms, or we could let industrial competitors battle it out. Write and let me know what you think; we may publish some of the responses.
