Integrating sensor fusion into embedded designs

As embedded wireless devices and mobile platform applications become more sophisticated, the management of sensor inputs has become critically important. Typical of the complexity of this task is the user interface on mobile phones, where capacitive-touch 2D interfaces are being superseded by a range of 3D sensor applications designed to allow the device to identify gestures and recognize what they mean.

It is also becoming commonplace for advanced smartphones to collect location information from GPS signals and to determine device orientation and status from data gathered by integrated 3D MEMS position detectors.

Coming soon will be the ability to identify the location of mobile devices inside buildings using a variety of wireless sensors. And with the current enthusiasm for the Internet of Things, consumer device makers are contemplating a whole range of wearable electronic devices and home network sensor apps that collect information about their environment and send it back to a smartphone for analysis and interpretation.

The challenge for developers of these embedded subsystems will be managing the massive amounts of incoming sensor information, interpreting it with respect to context, orientation, and other factors, and making decisions based on that input. Where the designer of the average 2D touch-screen smartphone of a few years ago only had to worry about ten or so sensor inputs, the new application environments will require managing hundreds of such sensor data streams.
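To make the idea concrete, here is a minimal sketch of my own (not drawn from any of the articles cited below) of one of the simplest fusion building blocks: a complementary filter in C that blends a gyroscope rate with an accelerometer-derived angle to track device pitch. The ALPHA weight and the function and parameter names are illustrative assumptions, not an established API.

/* Minimal complementary-filter sketch: fuse gyro and accelerometer
 * samples into a single pitch estimate. Names and the ALPHA weight
 * are illustrative, not taken from any particular library. */
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define ALPHA 0.98f              /* trust placed in the gyro-integrated term */

static float pitch_deg = 0.0f;   /* running pitch estimate, in degrees */

/* gyro_rate_dps: angular rate about the pitch axis (deg/s)
 * ax, ay, az:    accelerations along the body axes (in g)
 * dt:            seconds since the previous sample                 */
void fuse_pitch(float gyro_rate_dps, float ax, float ay, float az, float dt)
{
    /* Short-term estimate: integrate the gyro rate. */
    float gyro_pitch = pitch_deg + gyro_rate_dps * dt;

    /* Long-term reference: pitch implied by the gravity vector. */
    float accel_pitch = atan2f(ax, sqrtf(ay * ay + az * az)) * 180.0f / (float)M_PI;

    /* Blend the two: the gyro is smooth but drifts over time; the
     * accelerometer is noisy but drift-free. */
    pitch_deg = ALPHA * gyro_pitch + (1.0f - ALPHA) * accel_pitch;
}

Scaling that idea from one derived angle to hundreds of heterogeneous streams, each with its own calibration, timing, and power constraints, is exactly the workload the authors below are addressing.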

Rich Collins, author of “Sensor fusion enables sophisticated next-gen applications,” says that to achieve this, developers will have to pay much closer attention to sophisticated sensor fusion methods and algorithms to handle the workload.

In the view of RTI's Supreet Oberoi, author of “Sensor fusion brings situational awareness to health devices,” if this fusion can be achieved and it is possible to consolidate and integrate this data in real time, “we have opportunities to develop new suites of smart applications that can change the way we manage our health, drive our cars, track inventory–the possibilities are endless.”

But he cautions that it will require several new technologies to make this happen, including fusion techniques for acquiring and organizing information and algorithms for situational awareness that will “make the system as a whole and the device acquiring and using that data aware of the specific environment in which that data is to be used.”
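As one way to picture what organizing data for situational awareness might mean in practice, here is a small, hypothetical C structure of my own devising (not RTI's data model or API) that carries context, which device, where, when, and in what orientation, alongside every fused reading so that downstream software can interpret the value in its environment.

/* Hypothetical sketch of a context-tagged reading; all field names are
 * illustrative and do not reflect any particular vendor's data model. */
#include <stdint.h>

typedef struct {
    uint32_t device_id;        /* which node produced the reading          */
    uint64_t timestamp_ms;     /* when it was sampled                      */
    float    latitude;         /* where the device was (from GPS fusion)   */
    float    longitude;
    float    orientation_deg;  /* device attitude (from MEMS fusion)       */
    float    value;            /* the fused measurement itself             */
} context_reading_t;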

Fortunately, a lot of work has been going on to develop the techniques you will need to explore this new application area. Included in this week’s Tech Focus newsletter are a number of recent design articles, technical journal articles, and conference reports on sensor fusion in smartphones, robotics, and wireless sensor collection. In addition, there are a number of other articles that I have found useful in providing context for this new trend, including:

Sensor fusion and MEMS for 10-DoF solutions
3D Hand Gesture Recognition Based on Sensor Fusion
MEMS sensors for advanced mobile applications
Smartphone-based Location Sensing for Vehicular Navigation

This is an exciting area that greatly expands the opportunities and challenges for designers of embedded systems, and I will be tracking its developments, looking for papers and conference presentations that provide new tools and techniques to speed up and simplify the process. I also look forward to your contributions on this topic: comments here, as well as design articles and blogs on the tools you have found helpful, new ways to use them, and the sensor fusion techniques you have found effective.

Personally, I look forward to the capabilities sensor fusion will add to mobile phones and consumer devices (such as MP3 players) that enrich my life, not to mention the medical devices (such as glucose testers) upon which my life as an insulin-dependent diabetic depends. And a device I can attach to my key ring so my lost keys are findable.

In previous blogs, I have complained that the only portable electronic device I can be reasonably sure of finding is my cell phone, because I can call it up from my house phone and listen for the ring to tell me where it is.

Forget that solution for my MP3 player and my glucose meter, because I can’t call them up. I often put my MP3 player down and then can’t find it for as much as a week. So I have several MP3 players – and several glucose testers – stashed in strategic places around the house, so an alternative is available until I find the original.

And then there are the many TV remote controls I have lost and am still finding hidden under chair cushions and in various nooks and crannies in my home.

The optimist in me says that with the device location and monitoring capabilities that sensor fusion technologies will bring to ordinary things in my life, I will be able to stop buying duplicates of everything portable, wireless, and untethered.

Embedded.com Site Editor Bernard Cole is also editor of the twice-a-week Embedded.com newsletters as well as a partner in the TechRite Associates editorial services consultancy. He welcomes your feedback. Send an email to , or call 928-525-9087.

See more articles and columns like this one on Embedded.com. Sign up for subscriptions and newsletters. Copyright © 2014 UBM. All rights reserved.
