Predictions

Pundits love to make predictions when a new year arrives. It's fun to speculate about the near-term future, and not nearly as dangerous as looking at a horizon decades away. Hey, I'm still waiting for the flying cars we were promised in the 50s by Popular Mechanics and others.

Some predictions are pretty safe. Multi-core is here today and will grow quickly. Need to drop a couple of CPUs into an FPGA? Consider the NIOS II from Altera. Want a couple of hundred in an ASIC? Tensilica is doing this today. Steve Leibson, in “The End of Moore's Law,” figures that SoCs today have 16 processors per chip, a figure expected to grow to an astonishing 359 in a decade using the same silicon real estate.

Other prognostications are depressing. Firmware sizes double every ten months to two years, depending on which surveys you believe. But the tools evolve at a much slower rate, so we're doing more with less. The optimist in me notes that some of the growth in code (no one knows how much) comes in the form of off-the-shelf packages that don't require much in-house development work, though integration is always an issue.

In my opinion, there are three new technologies that will become a major factor in most developers' lives. Two are already here in one form or another. The third isn't, but will someday make some tool company rich. I doubt if we'll see major impact from any of these three in 2007, but they're not wild-eyed speculations like flying cars. Figure on a window of a few years.

The first is static analysis. It's been around forever in Lint, but now much more sophisticated tools have started to appear. The SPARK language (from Praxis High Integrity Systems) comprises a toolset that includes a very sophisticated analyzer. Polyspace, Green Hills, Coverity, Klocwork and others also sell these sorts of tools.

I want a debugger with only two buttons: Find Bug and Fix Bug. Static analyzers promise some level of the first. Pour your source through the tool and it identifies many sorts of problems that lie well beyond the capability of Lint. The technology as it stands today won't find 'em all, and I doubt we'll ever get to that point. But very smart analysis can find many problems cheaply. The technology will improve, it will find more kinds of defects, and costs will come down.
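To make that concrete, here's a contrived C fragment showing the kind of defect these tools report; the function is invented for illustration, and not drawn from any particular vendor's examples:

    #include <stdlib.h>
    #include <string.h>

    /* Contrived example: a path-sensitive analyzer tracks the possible
       NULL return from malloc() and flags the dereference in strcpy(),
       and also reports the off-by-one buffer overflow. */
    char *duplicate(const char *src)
    {
        char *dst = malloc(strlen(src)); /* too small: no room for '\0' */
        strcpy(dst, src);                /* NULL dereference if malloc failed */
        return dst;
    }

A traditional Lint pass may well accept this without complaint; tools that model heap state and buffer lengths will not.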

The problem today is that these tools consume vast amounts of CPU cycles. Maybe vendors will take advantage of grid computing to distribute the workload over other more or less idle machines on the network.

The second technology which holds so much near-term promise is virtualization. Once upon a time we called this simulation, which has for decades been rather a joke. Though some simulators did work well, the problem of modeling peripherals and other hardware kept it out in the cold.

Then a few years ago Keil came out with a jaw-dropping simulator product that includes models for most common I/O devices. It seems to work well.

Now more extensive products are available. Virtutech's Simics, for instance, is a full-system simulator, ah, virtualization. While not cheap, it potentially gives developers a complete, working model of their environment long before the hardware is ready. Though “co-design” is a fun buzzword marketing droids love to bandy about, it's thwarted if we can't start testing early, especially considering the iterative development processes so many use.
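To give a feel for what a peripheral model is, here's a minimal sketch in C of an invented memory-mapped UART. The register names, layout, and behavior are assumptions for illustration, not any real simulator's API; the idea is that the simulator routes guest reads and writes of the device's address range to handlers like these:

    #include <stdint.h>
    #include <stdio.h>

    /* Invented register map for a hypothetical UART, for illustration. */
    enum { UART_DATA = 0x0, UART_STATUS = 0x4 };
    #define STATUS_TX_READY 0x01u

    typedef struct { uint8_t status; } uart_model;

    /* Called when the simulated CPU writes into the UART's address range;
       the model mimics only the behavior software can observe. */
    static void uart_write(uart_model *u, uint32_t offset, uint8_t value)
    {
        if (offset == UART_DATA) {
            putchar(value);               /* "transmit" to the host console */
            u->status |= STATUS_TX_READY; /* this model is instantly ready  */
        }
    }

    /* Called when the simulated CPU reads a UART register. */
    static uint8_t uart_read(uart_model *u, uint32_t offset)
    {
        return (offset == UART_STATUS) ? u->status : 0;
    }

    int main(void) /* stand-in for the simulator's bus-dispatch loop */
    {
        uart_model u = { .status = STATUS_TX_READY };
        for (const char *p = "hello from the model\n"; *p; p++)
            if (uart_read(&u, UART_STATUS) & STATUS_TX_READY)
                uart_write(&u, UART_DATA, (uint8_t)*p);
        return 0;
    }

Firmware written against the real chip's register map runs unmodified against such a model, which is what lets testing start before silicon arrives.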

As virtualization becomes more mainstream, I predict widely shared libraries, perhaps under an open-source license, will mean we rarely have to create a simulation for a new peripheral. That, in turn, puts powerful pressure on hardware engineers to avoid the NIH syndrome and use parts for which such libraries exist.

The third technology I believe will change is test, that ugly stepchild of system development that consumes vast developer resources yet is so resistant to automation. Note that some might combine test and virtualization; I chose not to, as at some point we must run the code on real hardware. Upgrades and minor patches will be checked on the hardware as well.

PC programmers have lots of nice tools to automate regression test, which means the agilist's dream of making a change and pushing one button to test everything is possible. Embedded systems are different. Someone has to watch the displays and press the buttons.
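There is a partial workaround worth sketching: the hardware-independent slice of the code can be compiled for the host and regression-tested there with one button (or one make target). The checksum routine below is invented for illustration:

    #include <assert.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical routine under test: pure logic with no hardware
       dependencies, so it can be exercised in a PC build. */
    static uint8_t checksum(const uint8_t *buf, size_t len)
    {
        uint8_t sum = 0;
        while (len--)
            sum += *buf++;
        return (uint8_t)(~sum + 1u);  /* two's-complement checksum */
    }

    int main(void)
    {
        const uint8_t frame[] = { 0x01, 0x02, 0x03 };
        uint8_t cs = checksum(frame, sizeof frame);

        /* A frame followed by its checksum must sum to zero. */
        assert((uint8_t)(0x01 + 0x02 + 0x03 + cs) == 0);
        puts("regression tests passed");
        return 0;
    }

It covers only the logic, of course; the button-pushing and display-watching on real hardware remain.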

Debug and test consume about half a product's development time. Static analysis will shorten that considerably as the technology matures. Virtualization will help us perform early and mid-term testing. I don't know what will happen to improve testing on real hardware, but something will come along. The demand is simply too great for a solution not to materialize. Maybe it will be a business solution: outsource testing to China. Perhaps some entrepreneur will introduce novel technology that automates our tests. But something is coming.

What do you think? What are the great new things that will affect embedded development in the next few years?

Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges. Contact him at . His website is .


ok…

A few predictions to toss onto that old predictor heap.

– ARM7 and ARM9 processors will continue steady growth in 2007

– UML, while a decent methodology, will remain flat

– DSP technologies, and especially TI's dominance there, will continue onward

– Mixed signal will get a serious shot in the arm as more wireless applications get deployed

– And the biggest one? Why, that is the Internet, and Internet-related tools

Before everyone guffaws about that last one, I can say this:

– More and more developers are using script-type languages such as Perl to help automate build and test

– Linux and the Internet appear to be joined at the hip

– The main consumer of Internet bandwidth will NOT be the browser, but will primarily be mundane machine-2-machine uses by embedded folks like you and me!

– Ken Wada


I predict the rise of true system engineering through the use of SystemC component modeling and simulation. Systems engineers will prove that their system designs will perform to spec through these simulations before committing the design to silicon.

– Steve Leibson


You missed SIP and VOIP.

If you think that the Altair-through-PC “Wild West days” are gone, well, I think not.

The telephone industry is going to go crazy. Just think of all the wild and crazy ideas yet to be invented to serve all the various industry segments that are ALREADY data-centric. It blows my mind. It will be software and connectivity that changes things.

– Bob Kondner


Coding will move to a higher level.

Eventually we will have more tools such as UML and the like that will make system development easier and more intuitive. Gone will be the days of worrying about how to format a print statement, write a copy constructor, or do pointer arithmetic. Code generation tools will handle the layer closer to the processor/architecture.

Tomorrow's IDEs will have code generation, test, and analysis rolled into one.

Since more and more code will be machine generated and never read by an actual human, tools will have to be certified to pass the toughest industry standards.

I remember about 10 years ago a software engineer could reasonably expect to understand most of the code in a typical system. Now it is not even possible to LOOK at all the code in a system, never mind understand it all or come up with a strategy to verify and validate it.

Deployment of higher-level development tools is inevitable. Today, we seldom have to look at assembler code, and tomorrow we'll seldom look at C.

– Joe Sallmen
