
2008 to 2028: twenty years in embedded systems

Jack Ganssle, October 31, 2008

Welcome to the 40th anniversary issue of Embedded Systems Design "magazine." That last word is an anachronism only oldsters recognize. For three decades ESD produced a magazine, which, in its early years, was printed on "paper" (organic material formed into flat sheets that could be marked with symbols) and was actually moved physically from manufacturing facilities to engineers' desks. At one point the "postal system" used over a quarter million vehicles to move materials, mostly junk mail, to people's homes and offices.

Magazines evolved from this printed form to, briefly, an all-electronic version that exactly mimicked the printed version, to a streaming feed of random thoughts (see "blog" and "RSS") created by anyone with an ax to grind. That incessant babble eventually approached a zero-information state and was replaced by today's holographic VR Embedded Systems Design Space in which most users immerse themselves while their automover drives them home.

In ESD's founding year of 1988, embedded technology was dominated by slow 8- and 16-bit CPUs; the highest-performance processors available used 1,500-nm technology and could not top 33 MHz. Those numbers were laughable by ESD's 20th anniversary and are just the stuff of dim history today. Also history: even in 2008, most embedded systems were designed as a collection of separate integrated circuits connected by tracks on a printed-circuit board.

How much has changed!
Perhaps the biggest difference in embedded system developers' work over the last two decades has been the dissolution of the notion of hardware and software as separate entities. When was the last time you actually wrote code or designed a circuit? Graphical modeling tools and FPGAs signaled the inevitable, although few realized it at the time. Bigger, cheaper programmable logic, coupled with problems of unimaginable complexity, meant engineers eventually started designing systems as a whole: blocks that had certain functions and interacted in well-defined ways. Tools translated these concepts into an optimum mix of transistors and code, balancing performance, size, and cost concerns against the hardware/software mix. No one looks at code or transistors anymore; we work entirely at the level of a 3D graphical problem-domain model.

Once, bulky power supplies converted power delivered by a huge continent-wide grid of power-generating stations to levels appropriate for the particular application. Some portable devices did run from "batteries" for a short time, but had to be frequently reconnected to the grid to be recharged.

When in 2014 Électricité de France complained that the Large Hadron Collider had stopped both making contractually required payments for electricity and consuming any of the giant company's power, LHC scientists were forced to admit they had accidentally created both a micro black hole and an antihole, and had learned to harness the resulting matter/antimatter reaction to power the lab. Fear of global destruction quickly changed to an explosion of venture funding for MBH (micro black hole) power startups. The worldwide depression brought on by housing's collapse six years earlier gave way to a new economic bubble, one we still ride today.
