Trends in embedded software design
As the magazine that catered to embedded systems programmers closes, the future lies less in hands-on programming and more in auto-generated code.
The early days
In the early 1990s, as now, the specialized knowledge needed to write reliable embedded software was mostly not taught in universities. The only class I'd had in programming was in FORTRAN; I'd taught myself to program in assembly and C through a pair of hands-on labs that were, in hindsight, my only formal education in writing embedded software. It was on the job and from the pages of the magazine, then, that I first learned the practical skills of writing device drivers, porting and using operating systems, meeting real-time deadlines, implementing finite state machines, the pros and cons of languages other than C and assembly, remote debugging and JTAG, and so much more.
In that era, my work as a firmware developer involved daily interactions with Intel hex files, device programmers, tubes of EPROMs with mangled pins, UV erasers, mere kilobytes of memory, 8- and 16-bit processors, in-circuit emulators, and ROM monitors. Databooks were actual books; collectively, they took up whole bookshelves. I wrote and compiled my firmware programs on an HP-UX workstation on my desk, but then had to go downstairs to a lab to burn the chips, insert them into the prototype board, and test and debug via an attached ICE. I remember that on one especially daunting project eight miles separated my compiler and device programmer from the only instance of the target hardware; a single red LED and a dusty oscilloscope were the extent of my debugging toolbox.
Like you, I had the Internet at my desk in the mid-1990s, but it did not yet contain much information of use to me in my work, other than via certain FTP sites (does anyone else remember FTPing into sunsite.unc.edu? Or Gopher?). The rest was mostly blinking headlines and dancing hamsters, and Amazon was merely the world's biggest river. There was not yet an Embedded.com or EETimes.com. To learn about software and hardware best practices, I pursued an MSEE and CS classes at night and traveled to the Embedded Systems Conferences.
At the time, I was aware of no books about embedded programming. And every book that I had found on C started with "Hello, World", only went up in abstraction from there, and ended without ever once addressing peripheral control, interrupt service routines, interfacing to assembly language routines, and operating systems (real-time or other). For reasons I couldn't explain years later when Jack Ganssle asked me, I had the gumption to think I could write that missing book for embedded C programmers, got a contract from O'Reilly, and did--ending, rather than starting, mine with "Hello, World" (via an RS-232 port).
In 1998, a series of at least three twists of fate spanning four years found me taking a seat next to an empty chair at the speaker's lunch at an Embedded Systems Conference. The chair's occupant turned out to be Lindsey Vereen, who was then well into his term as the second editor-in-chief of the magazine. In addition to the book, I'd written an article or two for ESP by that time and Lindsey had been impressed with my ability to explain technical nuance. When he told me that day he was looking for someone to serve as a technical editor, I had no idea it would end up being me.
Becoming and then staying involved with the magazine, first as technical editor and later as editor-in-chief and contributing editor, has been a highlight of my professional life. I was a huge fan of ESP and of its many great columnists and other contributors in its first decade and believe my work helped make it an even more valuable forum for the exchange of key design ideas, best practices, and industry learning in its second. And, although I understand why print ads don't support it anymore, I am nonetheless saddened to see the magazine come to an end.
Reflecting on those days long past reminds me that a lot truly has changed about embedded software design. Assembly language is used far less frequently today; C and C++ much more. EPROMs with their device programmers and UV erasers have been supplanted by flash memory and bootloaders. Bus widths and memory sizes have increased dramatically. Expensive in-circuit emulators and ROM monitors have morphed into inexpensive JTAG debug ports. ROM-DOS has been replaced with whatever Microsoft is branding embedded Windows this year. And open-source Linux has done so well that it has limited the growth of the RTOS industry as a whole--and become a piece of technology we all want to master if only for our resumes.
So what does the future hold? What will the everyday experiences of embedded programmers be like in 2020, 2030, or 2040? I see three big trends that will affect us all over those timeframes, each of which has already begun to unfold.
Trend 1: Volumes finally shift to 32-bit CPUs. My first prediction is that inexpensive, low-power, highly integrated microcontrollers--as best exemplified by today's ARM Cortex-M family--will bring 32-bit CPUs into even the highest volume application domains. The volumes of 8- and 16-bit CPUs will finally decline as these parts become truly obsolete. Although you may already be programming for a 32-bit processor, 8- and 16-bit processors still drive overall CPU chip sales volumes. I'm referring, of course, to microcontrollers such as those based on 8051, PIC, and other instruction set architectures dating back 30 to 40 years. These older architectures remain popular today only because certain low-margin, high-volume applications of embedded processing require squeezing every penny out of BOM cost.
The limitations of 8- and 16-bit architectures affect the embedded systems programmers who have to use them in a number of ways. First, there are the awkward memory limitations resulting from limited address bus widths--and the memory banks, segmenting techniques, and other workarounds needed to get beyond them. Second, these CPUs are much better at decision making than mathematics: they cannot manipulate large integers efficiently and have no floating-point capability. Finally, these older processors lack the ability to run larger Internet-enabled operating systems, such as Linux, as well as the security and reliability protections afforded by an MMU.
There will, of course, always be many applications of computing that are extremely cost-conscious, so my prediction is not that these applications go away, but that 32-bit microcontrollers built on improved instruction set architectures and smaller transistor geometries will eventually win on overall price (BOM cost plus power consumption). That will put the necessary amount of computing power into the hands of those designers and make the job easier for all of us.
Trend 2: Complexity forces programmers beyond C. My second prediction is that the days of the C programming language's overwhelming dominance in embedded systems are numbered.
Don't get me wrong: C is a language I know and love. But, as you may know firsthand, C is simply not up to the task of building systems requiring over a million lines of code--and that is exactly where the demanded complexity of embedded software has been driving our programs for some time. Something has to give on complexity.
Additionally, there is the looming problem that the average age of an embedded systems developer is rapidly increasing while C is no longer generally taught in universities. Thus even as the demand for embedded intelligence in every industry continues to increase, the population of skilled C programmers is on the decline. Something has to give on staffing, too.
But what alternative language can be used to build real-time software, manipulate hardware directly, and be quickly ported to numerous instruction set architectures? It's not going to be C++ or Ada or Java, for sure--as those have already been tried and found lacking. A new programming language is probably not the answer either, across so many CPU families and with so many other languages already tried.
Thus I predict that tools that are able to reliably generate those millions of lines of C code automatically for us, based on system specifications, will ultimately take over. As an example of a current tool of this sort that could be part of the trend, I direct your attention to Miro Samek's dandy open source Quantum Platform framework for event-driven programs and his (optional) free Quantum Modeler graphical modeling tool. You may not like the idea of auto-generated code today, but I guarantee that once you program for a state machine framework, you'll see the benefits of the overall structure and be ready to move up a level in programming efficiency.
I view C as a reasonable common output language for such tools (given that C can manipulate hardware registers directly and that every processor ever invented for the mass market already has a compatible compiler). Note that I do expect there to be continued demand for those of us with the skills and interest to fine tune the performance of the generated code or write device drivers to integrate it more closely to the hardware.
Trend 3: Connectivity drives importance of security. We're increasingly connecting embedded systems--to each other and to the Internet. You've heard the hype (e.g., "Internet of things" and "ubiquitous computing") and you've probably already also put TCP/IP into one or more of your designs. But connectivity has a lot of implications that we have mostly not dealt with yet. Probably the most obvious of these is security.
A connected device cannot hide for long behind "security through obscurity," so we must design security into our connected devices from the start. In my travels around our industry, I've observed that the majority of embedded designers are largely unfamiliar with security. Sure, some of you have read about encryption algorithms and know the names of a few. But mostly the embedded community is shooting in the dark as security designers, within organizations that aren't much help. And security is only as strong as the weakest link in the chain.
This situation must change. Just as flash memory has supplanted UV-erasable EPROM, so will over-the-air patches and upgrades take center stage as a download mechanism in coming decades. We must architect our systems first to be secure and then to be able to take downloads securely so that our products can keep up in the inevitable arms race against hackers and attackers.
And that's a wrap
Whatever the future holds, I am certain that embedded software development will remain an engaging and challenging career. And you'll still find me writing about the field at Embedded.com, EmbeddedGurus.com, and on Twitter at http://twitter.com/embeddedbarr.
Michael Barr is CTO of Barr Group and a leading expert in the architecture of embedded software for secure and reliable real-time computing. Barr is also a former lecturer at the University of Maryland and Johns Hopkins University and the author of three books and more than sixty-five articles and papers on embedded systems design. Contact him at firstname.lastname@example.org or read his blog at http://embeddedgurus.com/barr-code.
This content is provided courtesy of Embedded.com and Embedded Systems Design magazine.
This material was first printed in May 2012 Embedded Systems Design magazine.
Copyright © 2012 UBM--All rights reserved.