Annual study uncovers the embedded market
Year-on-year trends are sure to grab your attention. People considering Linux? Down. Analog brand loyalty? Nope.
Embedded Systems Design conducts an annual market study, and it's quite comprehensive. Many of you already know this, because you participated in the study. In fact, four of you were the recipients of our random drawing (see "Four lucky winners walk away with USB turntables").
The study was conducted on a global basis, mostly through e-mail in the early part of this year. Over 1,000 embedded systems designers responded to the study, which was sent to a subset of subscribers of this publication (both the domestic and European versions) and EE Times and to attendees of our Embedded Systems Conferences.
For those of you who aren't familiar with the study, we look at such areas as:
• What functions are included in your design?
• Which wireless technology will you deploy?
• How many simultaneous projects are you working on?
• How many people are on your team?
• What types of tools do you use?
• Which processor did you select?
• Which operating system did you select?
I won't go through the complete study here, but I'd like to share some of the parts that I found particularly interesting, troubling, and fascinating. Contact me directly (firstname.lastname@example.org) if you'd like more information about the complete study. The real beauty of the study comes from the fact that we've done a nearly identical study for a number of years, so we can show the trends, the evolutions, the end-of-life transitions, and so on. That year-on-year data can be very interesting.
Here's one remarkable stat: from 2005 to 2007, the number of new projects fell from 48% to 39%, as Figure 1 shows. This drop is offset nearly equally by a rise in improvements and maintenance. What does that say? Could it be that the projects are so complex that they require more maintenance than in the past? Or are those designs better than previous models, hence living longer lives? In my opinion, it's a combination of those two factors, with the latter being more significant as the cost of producing a new product rises significantly. But it's obvious that it takes longer to program each new generation of processor, simply because more functions are available to the developer. More code means more testing, so each phase of design takes incrementally longer.
To get a reality check on this question, I contacted Contributing Editor Michael Barr. Barr suspects that designers have changed how they define "new." "It used to be that if you added Internet connectivity to your project, but it still did the same thing otherwise, people characterized it as new. Now, everyone who wants an Internet connection has it. So they view such changes, including application updates, as upgrades rather than new projects."