So this is progress

Jim Turley--May 02, 2012

Embedded systems are evolving in exciting ways, but our design methods are out of date.

Editor's note: Jim Turley started writing about semiconductors for Embedded Systems Programming magazine in 1997. He wrote the column Significant Bits (archived at www.eetimes.com/4210710) and was editor in chief of ESP/ESD magazine from 2004 to 2006.

A "computer" used to be a person. It was a job description, sort of like accountant or actuary. You hired computers to compute things, like lists of numbers or probabilities or statistics. Then these computers were replaced by, well, computers.

Now our computers outnumber us. The average middle-class American home has more than a hundred different microprocessors and microcontrollers scattered around. There are a half-dozen processor chips in every PC (not just the one big one that most people think of), plus at least a dozen MCUs in the family car. A high-end car like an S-class Mercedes has more than a hundred different microcontrollers in it, complete with their own fiber-optic network. A $2 musical greeting card has about as much computing power as the Apollo 11 lunar lander.

We've taken computer technology from the sci-fi laboratory to the bathroom, from the extraordinary to the ridiculous, all in the span of one lifetime. One career, even. Within living memory, people were telling us that a dozen or so computers would satisfy the entire world's demand. After all, how many weather-prediction machines do you really need? How many ICBM simulators? It took less than 20 years from the time room-sized computers were predicting missile trajectories to the time we started playing Missile Command at the local arcade. The only things we don't seem to have are the 21st-century flying cars and the jetpacks we were promised.

Now we're just getting silly. Last week I was handed a shaving razor--not an electric shaver, mind you, but a traditional razor with blades--with a microcontroller chip and a battery in its handle. The label on the box proudly proclaims it's the "world's first custom power wet shave razor," and I'm not surprised. The surprising part is that they had to qualify the statement so much. It's not the first powered razor; it's the first custom power wet shave razor, which means other companies beat them to it. I'm almost afraid to try it out. What is this thing going to do to my face? And most of all… why?

Because we can, that's why. Because computers are cheaper than mechanical devices. Consider the old room thermostat: two pieces of bent metal that curl a bit as the temperature changes. What could be simpler or more reliable? An 8-bit microcontroller and 4K of code, apparently. You'd have a hard time finding a traditional bimetallic thermostat these days, and if you did it would probably be for ironic period-correctness, as if you were restoring a 1970s-era house. Would anyone in the 1970s have predicted that you'd use a computer--a freakin' computer!--to control the thermostat?
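
To make the comparison concrete, here's a minimal sketch of what those 4K of thermostat code boil down to: a bang-bang control loop with hysteresis, doing in software exactly what the bimetallic strip did mechanically. The read_temp_c() and heater_set() functions are placeholders for whatever ADC and relay-driver code a real 8-bit part would use.

```c
/* Minimal thermostat sketch: on/off control with a hysteresis band,
 * the same job the bimetallic strip did mechanically. */
#include <stdint.h>
#include <stdbool.h>

#define SETPOINT_C   21   /* desired room temperature */
#define HYSTERESIS_C 1    /* deadband that keeps the relay from chattering */

extern int16_t read_temp_c(void);   /* placeholder: sample the sensor  */
extern void heater_set(bool on);    /* placeholder: drive the relay    */

void thermostat_task(void)          /* call periodically, e.g. once/sec */
{
    static bool heating = false;
    int16_t t = read_temp_c();

    if (heating && t >= SETPOINT_C + HYSTERESIS_C)
        heating = false;            /* warm enough: shut off  */
    else if (!heating && t <= SETPOINT_C - HYSTERESIS_C)
        heating = true;             /* too cold: turn on      */

    heater_set(heating);
}
```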

Computers are cheap because sand is cheap (okay, silicon is cheap) and because of the wonders of mass production. Build enough of something and you can start to amortize the costs across a whole lot of units. Economics 101 is what got us here. A modern microprocessor is one of the most complex things humankind has ever devised, yet we toss it out when the batteries die. We inject chips into our pets' necks; we stick them in the handle of a razor. That's mass production for you.

A high-end microprocessor today has well over two billion transistors. A big FPGA can have more than six billion. Even low-cost MCUs include over a million transistors. Semiconductor transistors are more plentiful than grains of rice. Congratulations, Silicon Valley, you've out-produced God.

Yet these transistors are cheaper than ink on a page (just ask any publishing company). As a rough guide, transistors cost about $0.00000055 apiece these days. We're asymptotically approaching zero cost. And really, the silicon and other ingredients that go into a transistor are pretty much free. The raw materials are a negligible part of the cost of chip-making. It's the design, the labor, and--most of all--the big shiny factory that cost real money.

Figure on spending $3 billion to $5 billion to erect a new fab. We're talking space-program levels of expenditure here. And that factory will be obsolete in just a few years, so you've got maybe 36 months to make back that $5 billion you invested. Suddenly the cost of a little silicon and copper doesn't seem so bad. Instead, you work like crazy to make chips in high volume so you can amortize that cost across more chips. Before you know it, the chips are almost free. Your customers are happy, your salespeople are happy, and the razor manufacturers are happy.
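
For a back-of-the-envelope feel for that amortization, here's the arithmetic in code. The $5 billion fab and 36-month window come from the text above; the monthly chip volume is purely an illustrative assumption.

```c
/* Back-of-the-envelope fab economics using the figures from the text.
 * The monthly chip volume is an illustrative assumption, not data. */
#include <stdio.h>

int main(void)
{
    double fab_cost        = 5e9;   /* $5 billion fab, from the text     */
    double payback_months  = 36.0;  /* payback window, from the text     */
    double chips_per_month = 10e6;  /* ASSUMED: 10 million chips a month */

    double chips = chips_per_month * payback_months;
    printf("Fab cost amortized per chip: $%.2f\n", fab_cost / chips);

    /* Separately, the rough guide of $0.00000055 per transistor puts
     * about $1,100 of transistor cost on a 2-billion-transistor CPU. */
    printf("Per 2B-transistor chip at $0.00000055/transistor: $%.0f\n",
           0.00000055 * 2e9);
    return 0;
}
```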

So where do these gazillions of little CPUs and MCUs go, apart from bathroom electronics? Mostly into living rooms and, um, bedrooms. Televisions, DVD players, game consoles, and all other sorts of living-room electronics are a big part of the whole consumer-electronics industry. The other driving force is sex. Just as sex helped drive the market for digital cameras, cable television, instant (Polaroid) cameras, VCRs, high-speed Internet connections, and mail-order catalogs, it's also a powerful force in embedded systems. Hey, all those digital cameras and cable routers need embedded hardware and software, too. Some estimates say as much as one-quarter to one-third of the Internet's total bandwidth is consumed by porn sites, and that stuff doesn't download itself.

Ironically, the retailers for these shiny new consumer gadgets are being driven out of business by the very technology they're selling. Online shopping, search engines, and global shipping have conspired to turn electronics stores into Amazon showrooms. So long, Circuit City and Best Buy. Hello, UPS driver.

So where do we go from here? Have we, as one patent examiner reportedly said in 1899, already invented everything? Hardly. The number of embedded systems keeps growing as we devise ever-more-clever (or increasingly silly) uses for them. Chipmakers' production numbers have been rising steadily since the beginning of time, and chip companies don't make new chips unless they're selling the old ones. More programmers are employed today than ever before. Somebody is giving these people work.

One good way to predict where tomorrow's embedded systems will go is to look at mechanical systems of today. What mechanical devices could be replaced by electro-mechanical or wholly electronic systems? Looking again at automobiles, some cars are already using magnetorheological suspension components in place of springs. Like the old thermostat, coiled springs may be dead simple and utterly reliable, but they're no match for electro-mechanical equivalents that can adjust a car's ride thousands of times per second. Sure, it seems silly now, but so did in-car radios in the 1950s.
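
For a sense of what "thousands of times per second" means in code, here's a sketch of the control loop an active damper implies. The sensor and actuator functions are placeholders, the control law is a toy, and a production suspension controller would be vastly more sophisticated.

```c
/* The shape of an active-damper loop: sample, compute, actuate, at a
 * fixed rate of thousands of hertz. All externs are placeholders. */
#include <stdint.h>

#define LOOP_HZ 2000                       /* "thousands of times per second" */

extern int32_t wheel_velocity_mm_s(void);  /* placeholder sensor read     */
extern int32_t body_velocity_mm_s(void);   /* placeholder sensor read     */
extern void    damper_set_ma(int32_t ma);  /* placeholder coil current    */
extern void    wait_for_tick(void);        /* placeholder LOOP_HZ timer   */

void suspension_loop(void)
{
    const int32_t gain = 4;                /* illustrative damping gain   */

    for (;;) {
        wait_for_tick();                   /* fixed-rate scheduling       */
        /* Toy "skyhook"-style law: resist body motion relative to the
         * wheel by commanding more damper current. */
        int32_t rel = body_velocity_mm_s() - wheel_velocity_mm_s();
        int32_t ma  = gain * rel;
        if (ma < 0) ma = 0;                /* MR dampers only dissipate   */
        damper_set_ma(ma);
    }
}
```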

Another angle is to look at today's embedded systems and imagine what they could do with two times, or ten times, the processing power. This approach led us to the smartphones of today. An Android phone or iPhone still makes phone calls in essentially the same way as the early Motorola "brick" phones of decades past; that part hasn't really changed. What's different is the "bonus" processing power for games, apps, phonebook lookup, GPS location, and more. What would you do with an order of magnitude more processing power in the same size, space, and power envelope?

A third growth vector is, sadly, security. Embedded systems are gradually, and grudgingly, adding more security features as they become more deeply integrated into our financial, medical, and personal lives. We've all read about (or experienced) security breaches at banks or credit-card processors. We've heard stories about smartphone "Bluejacking" or about cyber-terrorists overthrowing the world by attacking the power grid. New technology leads to new fears and new areas of misunderstanding. Even benign little utility meters with low-wattage, low-bandwidth wireless connections spur outrage from irate (and ill-informed) community groups waving virtual pitchforks. We are afraid of what we don't understand. It was ever thus.

On the plus side, the security angle provides opportunities for a lot of embedded systems programmers and hardware designers. Certainly there is a need for some of this. Financial transactions need to be secured and the data protected better than they are now. Most short-range wireless connections are pretty insecure. In the early days, that was fine. Our first priority as designers is usually just to get the darned system working. After that, if there's time, we'll tweak it to make it better. Now it's time.

The good news is, there are a lot of new chips and software to help us along. Low-cost microcontrollers now often include crypto hardware such as random-number generators and AES or 3DES encryption accelerators. The additional hardware for this is essentially free, as we've seen. But the added value can be priceless. Before long, crypto and security features will be a standard part of most processors and interface chips, and we won't think anything of it.
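
To give a flavor of what "essentially free" crypto hardware looks like from the programmer's side, here's a sketch of driving an on-chip AES block-encryption accelerator. Every register name and address below is invented for illustration; real peripherals differ from vendor to vendor, so the part's reference manual is the only authority.

```c
/* Sketch of a memory-mapped AES accelerator. Register names, addresses,
 * and bit layout are invented for illustration only. */
#include <stdint.h>

#define AES_BASE 0x40023000u                 /* hypothetical base address */
#define AES_CTRL (*(volatile uint32_t *)(AES_BASE + 0x00))
#define AES_STAT (*(volatile uint32_t *)(AES_BASE + 0x04))
#define AES_KEY  ((volatile uint32_t *)(AES_BASE + 0x10))  /* 8 words */
#define AES_DIN  ((volatile uint32_t *)(AES_BASE + 0x30))  /* 4 words */
#define AES_DOUT ((volatile uint32_t *)(AES_BASE + 0x40))  /* 4 words */

#define AES_CTRL_START (1u << 0)
#define AES_STAT_DONE  (1u << 0)

/* Encrypt one 16-byte block with a 256-bit key. */
void aes_encrypt_block(const uint32_t key[8],
                       const uint32_t in[4], uint32_t out[4])
{
    for (int i = 0; i < 8; i++) AES_KEY[i] = key[i];  /* load the key   */
    for (int i = 0; i < 4; i++) AES_DIN[i] = in[i];   /* load plaintext */

    AES_CTRL = AES_CTRL_START;               /* kick off the block      */
    while (!(AES_STAT & AES_STAT_DONE))      /* spin until hardware done */
        ;

    for (int i = 0; i < 4; i++) out[i] = AES_DOUT[i]; /* read ciphertext */
}
```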

The day may not be too far off when the security features of an embedded system take up more RAM and ROM space than the "real" code itself. It's not unreasonable that a cable router, for instance, may have 512KB of code to make the router work, but another 4MB of security-related features, some of which will never get used. It seems a shame that so much of the project's resources are "wasted" on nonessential features, but that's where we're headed. Treat it as a compliment: Your product is so valuable that it requires its own anti-theft device.

What doesn't seem to be improving is the situation with development tools. We've seen a million-fold increase in hardware complexity, and similar gains in lines of code per device, yet we still program our systems in C, assembly, or (heaven help us) Java. That's like building the Pyramids of Giza with stone tools. (Wait a minute; make that building a Saturn V with bronze tools.) Our design methods have not kept pace with our design materials. The weak link there is… us.

Creatures of habit are we. Regardless of how advanced our products may be, our methods for designing them are almost medieval. We cling to the same programming languages we used in college and the same circuit-design techniques we used in our first job. We design million-gate ASICs essentially the same way we designed test circuits in undergraduate lab. Why the inertia? Why the lack of progress?

The fault lies not in our stars but in ourselves. We don't like change. Ironically enough, a typical engineer creating tomorrow's newest fast-paced technology is resistant to change. We simply prefer to use the tools and methods to which we've grown accustomed. That's natural--people in other professions behave the same way--but it's also hugely paradoxical. We're more productive when we use (and reuse…) the languages, compilers, debuggers, and bench tests we've used before. We are quite naturally building on our own experience, and isn't that what our employers want? Isn't that experience what makes us more valuable than some trainee fresh out of school?

Well, yes and no. Writer Douglas Adams said, "Everything that's already in the world when you're born is just normal; anything that gets invented before you turn 30 is incredibly exciting and creative and with any luck you can make a career out of it; and anything that gets invented after you're 30 is against the natural order of things and the beginning of the end of civilization as we know it until it's been around for about ten years, when it gradually turns out to be alright really."

There's a reason so many startups are created by young college students with little or no industry experience. They don't have any established history to protect. Whether consciously or unconsciously, we protect and defend our own habits and methods. We know what works and we try to keep it working through repetition. In our 20s we learn; in our 30s we practice; and in our 40s and beyond we dig in our heels. That's a bit of a simplification, but it's truer than many of us would like to admit.

So in the spirit of change, Embedded Systems Design (née Embedded Systems Programming) is making another transformation. It's transmogrifying from its old, familiar form, but that's not a bad thing. That's what'cha call progress. But somebody out there better be working on my jetpack.

Jim Turley is the author of seven books, was editor in chief of the Microprocessor Report (a three-time winner of the Computer Press Award), was editor in chief of Embedded Systems Design magazine, and is currently editor of Embedded Technology Journal and publisher of Silicon Insider. For more about Jim Turley, go to www.jimturley.com. You may reach him at info@jimturley.com.

This content is provided courtesy of Embedded.com and Embedded Systems Design magazine.
See more content from Embedded Systems Design and Embedded Systems Programming magazines in the magazine archive.
This material was first printed in May 2012 Embedded Systems Design magazine.
Copyright © 2012 UBM--All rights reserved.
