
The embedded cloud: IT at the edge

Rajive Joshi, Real-Time Innovations Inc.

August 31, 2011


Here's another way to deal with device clouds and their data needs, using the Data Distribution Service--an open-standards-based data bus.

Embedded devices are generating an increasing volume and variety of data. These devices are typically found on the "edge," where the "machine" meets the real world. Examples of embedded systems "on the edge" can be found across a range of industries, including industrial automation, process control, manufacturing, mining, farming, energy, medical, consumer electronics, air-traffic control, transportation, warehousing, gaming, and home automation.

With advances in electronic design, edge devices are becoming available in multiple form factors at low cost, with low power consumption and minimal management requirements. The deployment of edge devices is on the rise, and this trend continues to gather momentum. Examples of edge devices include sensors, meters, counters, cameras, and equipment that interfaces with the constantly changing physical world.

Historically, edge-device networks have been isolated from the rest of the corporate information-technology infrastructure and the Internet. With the emergence of cloud computing, IT compute and storage resources can be provisioned on demand, without human intervention. The resource usage can be measured, and the resource pool can be elastically scaled up or down to match demand. This is resulting in a drastic reduction in the cost of IT, and it creates an opportunity for new types of applications made possible by connecting edge devices to the cloud core.

The elastic and scalable cloud-computing infrastructure is a perfect complement for processing the massive amounts of data generated by edge devices, volumes that can vary with real-world demand. Once edge data from a variety of sources enters the cloud infrastructure, "Big Data" techniques can be used to intelligently process the massive data volumes. The new types of applications that become possible include monitoring and management of the edge devices, real-time analytics and data mining, the ability to match pricing with demand, condition-based predictive maintenance, controlling or influencing real-world behavior, and new value-added services.

I refer to this emerging landscape as the embedded cloud. The edge devices provide a local "point of presence": a local, possibly redundant, interface to the real world. They may also implement localized microlevel control. In addition, edge devices can generate large amounts of real-time data that can be processed in real time by the massively scalable cloud compute and storage infrastructure to produce "instant" responses to real-world phenomena. This macrolevel feedback control creates new opportunities for the marketplace.

The problem: Data in motion
Clearly, a key challenge that must be addressed is: "How does one move the data from the edge devices into the cloud platform of choice?"

Several aspects of data distribution must be addressed, including:

  • Discovery of edge devices and their data feeds.
  • Handling a variety of data types and rates.
  • Scaling to accommodate a large number of edge devices.
  • Efficient resource usage to handle fan-out of data, so that only the interested consumers get a data feed.
  • Timely and deterministic delivery of feed updates.
It's critical to select a software framework that can address the key aspects above. The framework should also support easy integration of existing, stove-piped systems.

When selecting a software framework, several key business considerations must be addressed as well. These include:
  • Architecture risk. Does the software framework scale? Can it accommodate future needs? Is it flexible enough to support a variety of data feeds that may have very different quality-of-service requirements?
  • Resource efficiency. How efficiently are the local compute, network, and storage resources utilized? What is the overhead? What are the demands on the physical resources?
  • Time to market. Build or buy? What tools are available to improve developer productivity?
  • Coupling. How tightly is the application logic coupled to the network and the platform? To other applications? Can an application be moved around in a location independent manner, thus decoupling the hardware and software decisions?
  • Lock-in. Can the software framework support the multiple form factors and software stacks used across disparate edge devices? Can the software framework be deployed in the variety of emerging cloud platforms? Are the APIs proprietary or standards-based? If an acquisition event were to happen, how easy would it be to integrate with another similar type of system?

The choice of the software framework for data delivery has a profound impact on the scalability, performance, and longevity of the application architecture.

A solution
Data Distribution Service (DDS) is emerging as an open-standards-based "data bus" for addressing the above needs in the embedded-cloud landscape. DDS has matured as a formally defined industry standard over the last several years and is already in wide use on edge networks across a range of application areas. It is currently used in hundreds of mission-critical applications spanning robotics, unmanned vehicles, medical devices, transportation, combat systems, finance, and simulation. It's now being applied to this new class of embedded-cloud applications in process control, home automation, mining, farming, energy, and transportation. The Object Management Group (OMG), a consortium of industry vendors and users, maintains the DDS standard.

The DDS standard defines a "data-centric publish-subscribe" programming model for a distributed software data bus. DDS standardizes the programming API for data distribution to ensure application portability, and the wire protocol to ensure application interoperability. Like a database, DDS naturally decouples the data from the application logic (hence "data-centric"). Since data has its own identity, it can be stored, manipulated, enriched, and transformed in the "data cloud." An application simply publishes an update to a data feed. The DDS implementation keeps track of who needs the update and delivers it only to the applications that have subscribed to the data feed. Thus, the publishing and subscribing applications are decoupled from one another, and can be scaled up or down independently. Discovery is built into the DDS APIs--an application can discover the available data feeds and publish and/or subscribe to only the ones of interest. Applications can be added and removed dynamically on demand. Furthermore, an application can attach a delivery quality of service to a data feed. DDS takes on the responsibility of managing the delivery and notifying the applications if the delivery contract (say, a deadline) cannot be met. Since DDS relieves the application programmer of the burden of managing the data, the application code is drastically simplified.
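
To make the model concrete, here is a minimal sketch of a publisher and subscriber using the standard ISO C++ DDS API. The SensorReading topic type and its field accessors are hypothetical stand-ins for a type that would normally be generated from IDL by a vendor's code generator, and construction details vary slightly between implementations.

    #include <dds/dds.hpp>  // ISO C++ DDS PSM umbrella header (path varies by vendor)

    // "SensorReading" stands in for a topic type generated from IDL; its
    // accessors below (sensor_id(), value()) are hypothetical.

    int main() {
        // Join DDS domain 0; participants in the same domain discover each other.
        dds::domain::DomainParticipant participant(0);

        // A named, typed data feed ("topic") on the data bus.
        dds::topic::Topic<SensorReading> topic(participant, "SensorReadings");

        // Publisher side: attach a delivery contract (reliable, 1-second deadline).
        dds::pub::qos::DataWriterQos writer_qos;
        writer_qos << dds::core::policy::Reliability::Reliable()
                   << dds::core::policy::Deadline(dds::core::Duration(1, 0));
        dds::pub::DataWriter<SensorReading> writer(
            dds::pub::Publisher(participant), topic, writer_qos);

        // Subscriber side: normally a separate process, possibly in the cloud.
        dds::sub::DataReader<SensorReading> reader(
            dds::sub::Subscriber(participant), topic);

        // Publish one update; DDS delivers it to every matched subscriber.
        SensorReading reading;
        reading.sensor_id("turbine-07");  // hypothetical generated setter
        reading.value(412.5);             // hypothetical generated setter
        writer.write(reading);

        // Take whatever updates have arrived. Discovery, fan-out, and the
        // deadline contract are handled by the middleware; a real subscriber
        // would typically block on a wait-set or use a listener rather than poll.
        for (const auto& sample : reader.take()) {
            if (sample.info().valid()) {
                // process sample.data() ...
            }
        }
        return 0;
    }

Note that the publishing and subscribing halves share nothing but the topic name and type; either side can be moved, replicated, or scaled without changing the other.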

Multiple interoperable DDS implementations exist in the marketplace today. Leading DDS implementations have made significant investments in "implementation optimizations" to support the breadth of data-distribution requirements in the embedded-cloud landscape, and are well positioned to support this new genre of embedded-cloud applications. For example, Real-Time Innovations' DDS implementation is "peer-to-peer" and does not require a server or a broker--an application simply links in the DDS library for the platform and programming language of choice (C/C++, Java, .Net, Ada). The library is available across a variety of operating systems (Linux, Windows, VxWorks, Integrity, QNX), and can optimize system resources by filtering out unwanted updates on the publisher side. Besides several productivity tools, the RTI framework provides the ability to monitor, record, play back, route, transform, and persist the data in the cloud independently of the applications, as well as a framework for integrating legacy sources and sinks of data.
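
As a rough illustration of that publisher-side filtering, the sketch below uses the standard ContentFilteredTopic facility of the ISO C++ DDS API, reusing the hypothetical SensorReading type from the earlier sketch. The subscribing application states what it wants as a filter expression; whether a given implementation evaluates that filter at the writer (saving network bandwidth) or at the reader is an internal optimization, not part of the API.

    #include <dds/dds.hpp>  // ISO C++ DDS PSM umbrella header
    // SensorReading: the same hypothetical IDL-generated type as before.

    void subscribe_to_hot_readings(dds::domain::DomainParticipant& participant,
                                   dds::topic::Topic<SensorReading>& topic) {
        // Subscribe only to readings above a threshold; non-matching updates
        // can be dropped at the publisher and never cross the network.
        dds::topic::ContentFilteredTopic<SensorReading> hot(
            topic, "HotReadings", dds::topic::Filter("value > 100.0"));

        dds::sub::DataReader<SensorReading> reader(
            dds::sub::Subscriber(participant), hot);

        for (const auto& sample : reader.take()) {
            if (sample.info().valid()) {
                // Only samples matching the filter expression arrive here.
            }
        }
    }

The filter expression travels with the subscription during discovery, so the data bus itself can decide the cheapest place to apply it.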

The embedded-cloud application architecture enabled by DDS is scalable and elastic. It can respond to demand for resources in real time. The architecture is loosely coupled and flexible enough to accommodate new requirements, and it naturally supports real-time data feeds. Intelligence can be added on the wire independently of the applications. In practice, systems adopting DDS have seen orders-of-magnitude improvements in performance on the same hardware infrastructure.

On the business front, DDS reduces the risk of building an embedded cloud by providing a solid framework for moving real-time data from the edge to the cloud core. It also reduces the cost of development by providing higher-level data-centric abstractions that simplify application code and are available on a variety of platforms. Market studies have shown that systems adopting DDS report faster time to market with fewer developers than in-house approaches. Also, being an open standard with multiple implementations assures freedom from vendor lock-in.

Join the Cloud

The embedded cloud is about connecting edge devices to the IT infrastructure and developing a new genre of applications that can make macro-level decisions about the real-world environment and offer value-added services. DDS is a field-proven, open-standards-based software framework that naturally enables this new class of application architectures while meeting challenging real-time data-delivery requirements. DDS reduces the risk, schedule, and cost of developing applications for the embedded cloud, and is seeing adoption in a variety of "embedded cloud" applications.

Rajive Joshi is a principal solution architect at Real-Time Innovations, Inc., where he consults with customers building high-performance distributed systems. His technical expertise spans distributed real-time systems, embedded systems, robotics, evolutionary computing, and sensor fusion. He has 18+ years of experience in software architecture, design, and implementation of distributed real-time systems and middleware. He is a coinventor of seven patents and has coauthored 30+ publications, including the book Multisensor Fusion: A Minimal Representation Framework. Dr. Joshi holds a doctorate and a master's in computer and systems engineering from Rensselaer Polytechnic Institute, and a bachelor's in electrical engineering from the Indian Institute of Technology at Kanpur. He is a member of the IEEE, ACM, and AIAA.


This article provided courtesy of Embedded.com and Embedded Systems Design magazine.
See more articles like this one on Embedded.com.
This material was first printed in Embedded Systems Design magazine.
Copyright © 2011 UBM--All rights reserved.
