As developers move into the net-centric computing market, where even the most deeply embedded device may be connected, all of their traditional concerns are pushed to a more complex level. The main aim is still control, but now, instead of isolated designs and islands of connectivity, that control has to be maintained within a more ubiquitously connected environment.
Some of the new classes at the Embedded Systems Conference (ESC), being held this week in Chicago, will underscore the fact that the traditional issues of writing correct code, debugging and compilation are being reshaped by the realities of connectivity.
As the contributed articles in this section and the scheduled ESC classes also illustrate, engineers will have to adapt existing tools or develop new ones that ensure the same degree of reliability they achieved in isolated designs. That means paying more attention to tracking the operation of devices deployed in the field, managing the information that results, and sharing data among confederations of connected embedded devices.
No matter how limited and minimalistic the communications, a number of issues must now be factored into any connected design. First, there is the issue of defining and specifying the elements of a connected design. Here, modeling languages and tools such as UML help capture the much more complex design and can even generate code from it.
Second, more emphasis needs to be placed on the quality of code and the safety of the design. This is especially true in environments where the day-to-day lives of ordinary consumers are affected, according to Martha S. Wetherholt, a staff scientist at the NASA Glenn Research Center who will conduct a class on "Putting Safety in the Software."
In the new computing environment, software is not a subsystem that can be separated from the system as a whole, but a co-system that controls, manipulates or interacts with the hardware and with the end user. "Software has its fingers into all the pieces of the pie," she said. If that pie (the system) can impact your business's bottom line, then software safety becomes vitally important.
[Caption: Networking is not new to some traditional systems such as Unix, observes MontaVista's William Weinberg, but today's universal connectivity will require a new set of development tools.]
"Avoiding this in the development of today's much more complex connected systems requires 'system thinking': being able to grasp the whole picture," she said.
Third, it is necessary to take another look at traditional techniques and specifications and see how they can be adapted in combination with other techniques to satisfy new requirements.
Most important, there is a need for a new breed of embedded middleware, said William Weinberg, director of marketing for MontaVista Software Inc. (Sunnyvale, Calif.), a contributor to this report. Such middleware, aimed at letting confederations of embedded devices cooperate, must ensure the reliability of the connections and support debugging and testing in an inherently nondeterministic network environment.
"The advantage of working within a Linux environment is that because the Internet and Unix (the Linux progenitor) were developed coincidentally, it is hard to separate the one from the other," he said. As a result, tools and languages developed for one are adaptable to the other, and much of Unix was developed with computing in a networked environment in mind. "It also provides the common environment in which such middleware tools and systems can be developed and assured of a sufficient market to support them commercially," Weinberg said.
One possible new framework is the Open Services Gateway Initiative (OSGi). The initiative has been developed over the past few years to enable cooperation among confederations of devices and to allow the performance of appropriate management and monitoring functions.
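The core idea behind such a framework is a service registry: each device or "bundle" publishes the services it offers, and other devices discover them at runtime rather than being wired together at build time. A toy sketch of that register/discover pattern follows; the names here (ServiceRegistry, register, lookup) are hypothetical illustrations, not the real OSGi API, which is Java-based.

```python
# Illustrative sketch only: a minimal service registry in the spirit of
# OSGi's register/discover model. All names are hypothetical.

class ServiceRegistry:
    """Central registry where device 'bundles' publish and discover services."""

    def __init__(self):
        self._services = {}

    def register(self, name, provider):
        # A bundle publishes a service under a well-known name.
        self._services[name] = provider

    def lookup(self, name):
        # Another bundle discovers the service at runtime, with no
        # compile-time knowledge of who provides it.
        return self._services.get(name)


# Usage: a thermostat publishes a temperature service; a gateway consumes it.
registry = ServiceRegistry()
registry.register("temperature", lambda: 21.5)

reader = registry.lookup("temperature")
if reader is not None:
    print(reader())  # the gateway polls the device's service
```

The point of the indirection is exactly the management-and-monitoring role described above: the gateway can enumerate, replace or withdraw services as devices join and leave the confederation, without rebuilding any of the cooperating parts.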