Sensor fusion enables sophisticated next-gen applications
Editor’s Note: In this Product How-To article, Rich Collins of Synopsys describes the importance of sensor fusion in connected embedded systems and how the company’s ARC EM4 32-bit CPU-based Sensor IP Subsystem enables the design of devices with the right mix of performance and power consumption.
Although a novelty only a few years ago, sensors are now almost ubiquitous due to the explosive growth of smart devices. The ability to read and interpret environmental conditions such as pressure, temperature, and proximity is featured in many applications. Sophisticated sensor applications combine sensor data from multiple sources to provide a higher order of functionality. This practice is called sensor fusion. Combining an accelerometer, gyroscope, and magnetometer (compass) to create an accurate motion sensor is a prime example of sensor fusion.
The increasing complexity of sensor fusion algorithms demands additional processing capability and adds software overhead. To reduce the impact on the applications processor, sensor functions are being handled by off-chip co-processors as well as by integrated, on-chip subsystems. This article highlights some interesting sensor fusion applications and the growing need for IP solutions that provide the features required for integration into the wide range of market applications where sensor fusion algorithms play an important role.
The growth of the sensor fusion market
There has been significant growth in systems incorporating sensor fusion technology as more semiconductor suppliers integrate sensor interfaces into their systems-on-chip (SoCs). Although motion sensing in smartphones is the most common example of a sensor fusion implementation, these functions are also being incorporated into many other applications, such as those found in the automotive, consumer electronics, and digital home markets. According to Semico Research, the number of systems incorporating sensor fusion is predicted to grow from 400M units in 2012 to over 2.5B units in 2016 – an annual growth rate of almost 60%.
Wearable devices are becoming extremely popular as people become increasingly interested in tracking their personal health and fitness goals. From measuring heart rate and sleep patterns to counting steps and more advanced workout monitoring, the range of personal activities people log with wearable devices is broad and growing. Tens of millions of these products are already sold annually, and annual shipments are estimated to reach 300 million (“Global Wearable Device Unit Shipments” by BI Intelligence).
Today’s wearable devices mostly calculate one-dimensional measurements such as calories burned or miles run. By combining multiple sensors, a much more accurate picture of activity can be created and analyzed. Sensor software companies are already demonstrating technology that can provide data on the angles, velocity, and positioning of various body parts, communicated in real time to mobile devices. This complex combination of sensor hardware and software algorithms will become a mainstream feature of next-generation wearable devices.
Another interesting advance in sensor fusion relates to location. The concept of creating a geo-fence, or virtual perimeter, has existed since GPS became mainstream technology. For example, a geo-fence can be dynamically created around your home or business; paired with a location-aware device such as a smartphone, it becomes useful. When the mobile device enters or leaves the geo-fenced area, a notification can be sent to the device (or elsewhere) indicating that the event has occurred.
This concept is now being extended to deliver targeted messages to specific mobile devices based on location. For example, geo-fencing lets a retailer detect when you approach a particular section of a store and notify you about sale items in that area.
Features and applications can be enabled or disabled based on general location. Combining the “coarse-grained” GPS data with more “fine-grained” indoor location protocols such as Bluetooth low energy (LE) or near-field communication (NFC) allows suppliers to provide a more customized experience for the shopper. This is the basic concept behind Apple’s iBeacons and will likely become a standard feature on both iOS and Android devices.
Sensor integration trends
In many of today’s sensor-based applications, the sensor processing is handled “off-chip”. That is, the fusion of the sensor data is done on a separate device (often a microcontroller) with an interface (typically SPI or I2C) to the application processor. Figure 2 shows a typical sensor implementation using discrete components. This example highlights an analog sensor implementation, but digital sensor systems are implemented with similar architectures.
There are good reasons to architect sensor implementations this way – especially in the mobile device space. While mobile applications processors are pushing to 28nm and beyond, the sensor ecosystem is several process technologies behind. For example, the sensors themselves may still be manufactured in a 180nm process, while the microcontrollers used to manage the sensor data might be manufactured in a 90nm or 55nm flash-based process technology. Performance is adequate, and because low cost is critical, designers have continued to use discrete devices.
However, the need to provide smaller, faster, lower power systems tends to drive more integration into application processors. As geometries shrink, more transistors can be integrated onto a single die. At some threshold, the area savings and performance boost swing in favor of integrated solutions versus discrete implementations. This trend will ultimately apply to sensor implementations, allowing the sensor logic to act as an on-chip sensor hub, offloading sensor fusion algorithms from the host or applications processor.