What chips are other embedded systems developers using in their systems? In our annual survey, your peers reveal their vendor and size preferences. Jack Ganssle interprets the data here.
Marketers eagerly await each year's State of the Embedded Market Survey to evaluate how their products stack up against the competition's in the gritty real world of product design. They comb through arcane results that show how a company decides to buy a particular product. Working engineers care little that 13% of respondents claim corporate management exerts “some influence” on processor selection, or that 46% of projects use power driver ICs.
But a lot of the data is fascinating and may even persuade us to change our buying habits. If that cool operating system we've been lusting after has only a 1% market share, then designing it into the next project is quite risky indeed.
The survey covered a wide range of subjects. Future issues of Embedded Systems Design will explore other findings, but for now let's look at the microprocessor results.
Over 1,200 people responded to the survey, which took place in March of this year. Readers from Embedded Systems Design, EE Times, and Embedded Systems Europe participated, as did attendees of the Embedded Systems Conference. Responses were fairly evenly split between software, systems, and hardware people.
In a single lifetime computers have changed from gigantic multi-megabuck machines affordable only to the largest institutions to nearly free bits of silicon dust. Three percent of us responded that we're using 10 or more processors in a product. Only 56% reported just a single processor per product; across all respondents the average is just over two.
The survey didn't explore the max number used, but I know of at least one application that uses over 180 CPUs on a single ASIC. Indeed, 6% reported using a single chip with multiple cores. Sixty-two percent have customizable logic (FPGAs, ASICs, and so forth) in their products; of those a solid third claim to be putting a CPU inside those logic chips.
Those numbers are CPUs per product, of course, which doesn't reflect actual numbers of chips used. Some very high-volume applications (such as cell phones and hard-disk drives) routinely employ both a 32-bit CISC part and, typically, a DSP engine as well. A single “product” response may represent millions of manufactured devices.
Why do we slam multiple micros into our products? Sixty-eight percent of those using more than one CPU employ different kinds of processors, for instance, a DSP and a CISC, or a big powerful engine coupled with one or more smaller dedicated controllers. That's up a lot from last year's 55%. The shift is coming at the expense of using many instances of the same chip, which is off eight points from last year's 34%.
Size matters. Remember when the first 32-bit parts appeared? Big, power-sucking, and expensive, they were reserved exclusively for desktops and workstations. Few embedded applications could afford them. The survey definitively destroys the notion of embedded as the world of small CPUs. A whopping 54% of us use 32 bitters today, as Figure 1 shows.
Perversely, four bits refuses to die. Long relegated to controllers for white goods (washing machines, refrigerators), even these apps now require far more smarts than of yore. But 1% of respondents (about a dozen people) reported using these brain-dead parts even today! Note that last year we didn't ask this question so we don't know what the trend might be, but I sure hope it's down.
Despite years of analysts predicting their demise, 8- and 16-bit parts continue to hold their own and in fact are slightly up from last year. Eight-bit parts will always hold a strong position since some applications are so cost-sensitive that a big honking 32 bitter will yield an uncompetitive product. The data is a harbinger of the future: 64-bit parts hold a 6% share, exactly a third the size of the 8- or 16-bit market.
Although 8% of the respondents reported their ICs poke along at clock rates under 10MHz, nearly half (some 48%) run at 100MHz or higher, as Figure 2 shows. Unsurprisingly that correlates pretty closely with 32-bit usage, though it seems a handful of designers clock their high-end parts somewhat more economically. But at these great speeds a number of us are wrestling with difficult printed circuit board design issues sure to make an EMI consultant rich.
Connecting the dots, Jim Turley reported in June (article ID 187203732 on www.embedded.com) that 20% of respondents are using Linux and 27% run some flavor of Windows. Clearly these high-end OS users account for virtually all applications screaming at more than 100MHz.
The big peak for 10 to 99MHz clock rates reflects a sobering reality of this business. While we all want more horsepower, it's expensive. That 1GHz beauty sucks power like a Hummer consuming gas. Thirty-five percent of us reported designing battery-powered applications. Most of those eschew fire-breathing Lithium-Ion Sony cells in favor of low-capacity button batteries or a pair of AAs. Slow clocks push consumption down to the milliwatt level.
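The tradeoff is easy to put rough numbers on. CMOS dynamic power scales linearly with clock frequency (P = C·V²·f), so dropping the clock by an order of magnitude drops switching power by the same factor. A minimal sketch, using illustrative capacitance and voltage figures that are my assumptions rather than survey data:

```python
# Sketch: CMOS dynamic power scales as P = C * V^2 * f.
# The capacitance and voltage numbers below are illustrative
# assumptions, not figures from the survey.

def dynamic_power_mw(c_farads, v_volts, f_hz):
    """Approximate switching power in milliwatts."""
    return c_farads * v_volts ** 2 * f_hz * 1000

# A small MCU: ~100 pF effective switched capacitance at 3.3 V.
slow = dynamic_power_mw(1e-10, 3.3, 8e6)    # 8 MHz clock
fast = dynamic_power_mw(1e-10, 3.3, 100e6)  # 100 MHz clock

print(f"8 MHz:   {slow:.1f} mW")   # single-digit milliwatts
print(f"100 MHz: {fast:.1f} mW")   # over 100 mW from the clock alone
```

The ratio between the two cases is exactly the clock ratio, which is why an 8MHz part can run for months on a button cell while its 100MHz cousin cannot.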
Slower parts also ease meeting FCC emissions requirements. Most microcontrollers have relatively low max clock rates, and few 8 bitters are really fast. Once past 50MHz or so the system is limited by memory speeds so cache, another costly pile of transistors, is needed to keep the hungry processor fed with instructions.
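The memory-speed limit mentioned above is usually captured by the textbook average-memory-access-time relation: AMAT = hit time + miss rate × miss penalty. A minimal sketch with illustrative latencies (my assumptions, not survey figures) shows why a fast core without cache simply stalls:

```python
# Sketch of why fast cores need cache: average memory access time (AMAT).
# All latencies and the miss rate are illustrative assumptions.

def amat_ns(hit_ns, miss_rate, penalty_ns):
    """AMAT = hit time + miss rate * miss penalty."""
    return hit_ns + miss_rate * penalty_ns

# A 100 MHz core (10 ns cycle) talking to 70 ns main memory:
no_cache = 70.0                          # every fetch pays full memory latency
with_cache = amat_ns(10.0, 0.05, 70.0)   # 95% of fetches hit a 1-cycle cache

print(f"no cache:   {no_cache:.1f} ns per access")
print(f"with cache: {with_cache:.1f} ns per access")
```

Without cache the core waits seven cycles per fetch; with a modest 95% hit rate the average access drops near the cycle time, which is exactly the "costly pile of transistors" doing its job.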
We all love being stuck on maintenance. Or something. Turns out 56% of our projects are upgrades or improvements to earlier or existing applications. Of those 54% report changing processors, perhaps to avoid the tedium of using old technology. Exactly half of us have switched CPUs on the current project, whether it's an upgrade or a completely new design. That's a startling number when one considers the investment made in the older technology. Since just over half of those changing reported using an entirely new architecture rather than just a different part in a processor family, many of us are incurring staggering costs to replace all of the tools, IP, and RTOS.
Unfortunately the survey didn't propose “because the new chip is so cool!” as a reason for switching parts. Nor was “résumé padding” an allowable answer. No doubt these choices would skew the results considerably.
But half fingered “better features” as the prime mover for a change, as Figure 3 shows. A third also considered performance. Firmware content is doubling every couple of years, so yesterday's speed demon just can't power today's feature-bloated systems.
Interestingly, 31% changed parts in favor of one with a better roadmap. Clearly developers are worried about the cost of changing architectures and wish to design in a family of parts that will enable us to continue doing maintenance for a long time to come.
Conversely, those who kept the same CPU did so 65% of the time to maintain software compatibility with older products or, in 59% of the responses, to preserve the toolchain. A third felt it would burn too much valuable time to design in a new part. All of this speaks to the schedule pressure that leaves nary a microsecond for anything not related to getting the product out the door.
It is interesting to find that 60% of us really don't care about getting our tools from the processor supplier. In the olden days it was an article of faith that vendors carried complete tool chains. When Intel startled the world with the 4004 they offered a complete development system. Motorola had a large in-circuit emulator operation and even bought the sizeable Metrowerks compiler company. Today we're content to purchase third-party or open-source tools. But we do care about tools: they're the most important criterion for selecting a new processor, as Figure 4 shows, even ahead of performance and the cost of the silicon.
Despite very cool new approaches to processor design, especially for ASICs and FPGAs, it seems the ability to customize a CPU's instruction set isn't very high on our list of concerns, a feature that hit only 1% of respondents' hot buttons. Is there a disconnect here? Special instructions, which some tools produce automatically to streamline slow code, are an efficient way to get decent performance without extreme clock rates. Few care about this feature while 51% worry about the processor's performance. Perhaps vendors need to hone their message.
According to the survey results, the top CPU vendors are Freescale (36%), Intel (24%), Microchip (18%), TI (18%), and Atmel (14%), in that order. Everyone else is under 8%.
But the numbers are very different when respondents told us who they're considering for their future needs, as Table 1 shows. Though the same top vendors get even stronger responses, FPGA vendors Xilinx, Atmel, and Altera all score better than 10% mindshare for their embedded CPU cores. Though CPU IP hasn't happened in a big way yet, it's clearly not far off.
Table 2 shows the breakdown in anticipated use of 8-bit processors in the near future. Microchip absolutely dominates the category with a total of 73% of responses, pretty astonishing for what was a failed processor that company brilliantly resurrected.
Number two Atmel trails far behind but scores highly for a proprietary architecture. Freescale's small parts do well; perhaps by next year's survey their clever Controller Continuum marketing blitz will tip the scales more in their favor. Intel's 21% share for 8051-style parts will fall, as the company recently announced plans to discontinue these products.
But sum all 8051 vendors and we find that architecture scores a whopping 62% for future designs. In 100,000 years when we've evolved into beings of light someone, somewhere, will be writing 8051 code.
Zilog still manages respectable market share with their ancient Z8/Z80 family that dates back to 1976. The company breathed new life into the Z80 with their popular 8/16-bit eZ80. Who would have figured?
In the 16-bit world, Microchip's PIC24 family didn't even register, putting Freescale (with their HC12 and HC16 parts) firmly in the lead (as Figure 5 shows). TI's ultra-low-power MSP430 garnered identical scores with Microchip's very inexpensive DSP-like processors, perhaps showing the essential tension between low cost and low power. AMD's venerable x86 parts have grown 25% over the last year, proving something about Bill Gates's 1981 comment that “nobody will ever need more than 640K.”
In the 32-bit world, ARM CPUs dominate with 72% of respondents considering this part for their next product (see Table 3). Freescale isn't far behind with their PowerPC, 68K, and ColdFire processors. The once-hot MIPS barely registers. x86 processors are still found in a huge number of embedded designs, though Intel plans to kill off both the '386 and '486. Intel recently sold the high-scoring XScale (the brains behind the Blackberry) to Marvell.
Once again we see interest in embedding cores into FPGAs, with Altera and Xilinx both registering above 10%.
Those not using 32 bitters today will surely be tempted tomorrow as applications grow. Respondents were asked about memory sizes; 44% today use more than 16MB, compared with just 31% last year. Firmware content increases rapidly in response to marketing and customer demands for more features and capability.
A few of us already use 64 bitters, something that was inconceivable just a handful of years ago. What amazing bits of technology will we routinely employ in the next decade?
Jack Ganssle () is a lecturer and consultant specializing in embedded systems' development issues. For more information about Jack .