Sensor fusion is a hot topic, riding the growth of the Internet of Things (IoT) and closely tied to autonomous vehicles and advanced driver-assistance systems (ADAS). The concept itself is not new; a search on Google Scholar turns up work dating back to the 1960s and earlier. But today, there is a growing body of knowledge around which sensor inputs a system should fuse and how to apply the resulting insight. How much is enough depends on the application and the cost/risk benefits.
How much sensor fusion is enough depends on the application and the cost/risk benefits (Image: SAR Insight and Consulting)
Sensor fusion is more important than ever, as the persistence of individual malefactors and state intelligence agencies poses growing threats to autonomous systems everywhere. While much of the political world frets about potential information security risks in 5G networks, a greater risk arises from malware attacks that can disrupt and extort autonomous-system owners. System architects should not underestimate these risks and must avoid the mistakes of automobile and aviation manufacturers. Ford Motor estimated the worth of a human life at US$200,000 — less than the cost of fixing the fuel-system design of the Pinto sedan (1971–1980). However, personal-injury lawyers and trial juries thought otherwise. The current tale of woe belongs to Boeing, which offered critical sensor fusion and redundancy features on its 737 MAX jet only for an extra fee. As a result, Boeing and its supply chain suffer through the current agony, the end of which may take years to play out.
Finally, the economic and health benefits from systems that advance sensor fusion for human activity and industrial applications are already apparent.
Fault tolerance and resilience
All sensors and models have a tolerance error, and using multiple sensors that measure the same quantity can increase reliability and provide resilience to failure that could otherwise prove disastrous. Redundancy adds cost and complexity, but as the Boeing and Ford examples show, a short-sighted decision leading to a single point of failure can be catastrophic.
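The redundancy idea above can be sketched in a few lines: fuse several sensors that measure the same quantity, and gate out any sensor that disagrees with the others before fusing. This is a minimal illustration, not a safety-certified design; the sensor values, variances, and 3-sigma gate below are illustrative assumptions.

```python
# Sketch: fault-tolerant fusion of redundant sensors measuring the
# same quantity. A median-based gate rejects a disagreeing sensor,
# then inverse-variance weighting fuses the survivors.
import statistics


def reject_and_fuse(readings, variances, k=3.0):
    """Reject readings far from the median, then fuse the rest."""
    med = statistics.median(readings)
    # Flag any sensor more than k standard deviations from the median.
    faults = [abs(r - med) > k * v ** 0.5 for r, v in zip(readings, variances)]
    kept = [(r, v) for r, v, bad in zip(readings, variances, faults) if not bad]
    weights = [1.0 / v for _, v in kept]
    estimate = sum(r / v for r, v in kept) / sum(weights)
    fused_var = 1.0 / sum(weights)  # tighter than any single surviving sensor
    return estimate, fused_var, faults


# Three redundant sensors; the third has failed or been tampered with.
estimate, fused_var, faults = reject_and_fuse([10.1, 9.9, 25.0], [0.2, 0.2, 0.2])
```

With only a single sensor, the failed reading of 25.0 would have been trusted outright; with three, it is identified and excluded, and the fused estimate (10.0) has lower variance than either surviving sensor alone.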
Hackers with malware will find ways to attack sensor-based systems, but appropriate data fusion and security protocols, including artificial intelligence, can ensure robust operation in the face of such attacks. One means of attack injects false signals into the input sensors; ordinary digital security cannot mitigate these signals because they originate in the analog domain. Such attacks can include:
- signal spoofing (LiDAR and cameras);
- signal cancellation and interference (ABS magnetic sensors, vandalized traffic signs); and
- side leakage (implanted malware that uses sensors to harvest sensitive information).
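Because these attacks enter in the analog domain, one practical defense is cross-modal consistency checking: fuse only when independent sensing modalities agree. The sketch below assumes hypothetical lidar and radar range estimates for the same object; the tolerance value is an illustrative assumption, not a recommendation.

```python
# Sketch: a cross-modal consistency check against analog-domain
# spoofing. If two independent modalities disagree beyond a set
# tolerance, the reading is flagged rather than acted on.


def consistent(range_lidar_m, range_radar_m, tol_m=1.5):
    """Return True when the two modalities agree within tolerance."""
    return abs(range_lidar_m - range_radar_m) <= tol_m


def fused_range(range_lidar_m, range_radar_m, tol_m=1.5):
    """Fuse only when the modalities agree; otherwise signal a fault."""
    if consistent(range_lidar_m, range_radar_m, tol_m):
        return (range_lidar_m + range_radar_m) / 2.0, "ok"
    # A spoofed lidar return cannot easily fool the radar too, so
    # disagreement is treated as a possible attack or sensor fault.
    return None, "inconsistent: possible spoofing or sensor fault"
```

The point is that an attacker who spoofs one modality must simultaneously and consistently spoof every other modality to pass the check, which raises the cost of the attack considerably.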
Human activity multi-sensor
Interpreting and monitoring human activity with multi-sensor fusion will achieve better health outcomes and lower costs as populations age. Applications of wearable- and ambient-sensor fusion for human activity include eldercare and assisted living, fall detection and postural recognition, security and surveillance, athlete and first-responder status, and localization and navigation assistance for the impaired.
Data fusion in the network
Data fusion and analytics historically happened on a computer or in a data center (the cloud). The miniaturization and cost reduction of sensing technology enable sensor fusion (and artificial intelligence/machine learning) at the edge-device level. In the future, hybrid network architectures will perform sensor data fusion and analytics at three layers:
- Low-level data fusion will take place on smart devices or the gateways that aggregate multiple sensor inputs.
- Middle-level data fusion will support more intensive analytics and data fusion with a wider range of devices and is associated with a hub gateway and edge computing.
- High-level data fusion will reside in a data center or the cloud to provide the highest perspective of the managed system of edge devices.
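The three layers above can be sketched as a chain of aggregation: devices fuse their own raw samples, gateways summarize many devices, and the cloud sees only gateway summaries. The function names, payload shapes, and values below are illustrative assumptions, not a reference architecture.

```python
# Sketch of the three-layer fusion hierarchy: device -> gateway -> cloud.


def device_fuse(samples):
    """Low level: a smart device averages its own raw samples."""
    return sum(samples) / len(samples)


def gateway_fuse(device_readings):
    """Middle level: an edge gateway aggregates many devices into
    summary statistics for upstream analytics."""
    n = len(device_readings)
    return {
        "mean": sum(device_readings) / n,
        "spread": max(device_readings) - min(device_readings),
        "devices": n,
    }


def cloud_fuse(gateway_summaries):
    """High level: the data center sees only gateway summaries and
    maintains the system-wide view."""
    total = sum(g["devices"] for g in gateway_summaries)
    weighted = sum(g["mean"] * g["devices"] for g in gateway_summaries) / total
    return {"system_mean": weighted, "total_devices": total}


# Example: two gateways, each aggregating a few smart devices.
gw1 = gateway_fuse([device_fuse([1.9, 2.1]), device_fuse([2.0, 2.0])])
gw2 = gateway_fuse([device_fuse([4.0]), device_fuse([4.0]),
                    device_fuse([4.0]), device_fuse([4.0])])
system_view = cloud_fuse([gw1, gw2])
```

The design choice this illustrates is bandwidth and latency: raw samples never leave the device, so each layer upward transmits progressively less data while retaining the system-wide perspective.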
Lower operating costs
Sensor fusion will cut operating costs by extending the range and application of devices such as unmanned aerial vehicles (UAVs) and robotics with autonomous features. Savings will also come where sensor fusion lets remote operators manage more systems, or lets operations shift to low-cost labor centers. Applications include automatic collision avoidance for inspection drones and remote driver intervention for mostly autonomous transportation systems.
Expect continued miniaturization and cost reduction in sensors, computing, and connectivity as consumerization takes root in the industrial and IoT ecosystem. CES 2020 demonstrations included MEMS sensor innovation, such as the miniaturization of LiDAR mirrors that benefit automotive and intelligent transportation systems.
The go-to method for sensor fusion is a class of algorithm known as the Kalman filter, which estimates a system's current state by continuously alternating between prediction and measurement updates. The threat of false-signal injection through the sensor inputs of highly complex systems will lead system architects to implement machine learning and neural networks for security and data fusion.
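The predict/update cycle described above can be shown in its simplest one-dimensional form. This is a minimal sketch, not a production filter: it tracks a single constant quantity under a random-walk model, and the noise variances are illustrative assumptions.

```python
# Minimal sketch of a one-dimensional Kalman filter: each cycle
# predicts the next state, then corrects the prediction with a
# noisy measurement, weighted by the Kalman gain.


def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle for a random-walk state.
    x, p: prior state estimate and its variance
    z:    new measurement
    q, r: process and measurement noise variances
    """
    # Predict: the state carries over, but uncertainty grows by q.
    x_pred, p_pred = x, p + q
    # Update: the gain k balances trust in prediction vs. measurement.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new


# Track a quantity whose true value is 10, starting from a
# deliberately uncertain prior (variance 100).
x, p = 0.0, 100.0
for z in [9.8, 10.2, 10.1, 9.9, 10.0]:
    x, p = kalman_step(x, p, z)
```

After a handful of measurements, the estimate converges near the true value and its variance shrinks, which is the "continuous measurement and prediction" behavior the text refers to. Real ADAS filters apply the same cycle to multi-dimensional state vectors (position, velocity, heading) with matrix-valued gains.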
Sensor fusion increases system integrity, reliability, and robustness for normal operation and provides additional benefits against sensor network attacks originating from the analog domain. By carefully implementing sensor fusion into their systems, designers and architects can mitigate the risk from malfunction or malevolent action that can cause injury to people, property, or economic prosperity.
>> This article was originally published on our sister site, EE Times Europe.
Joe Hoffman is director of wireless connectivity and machine sensing at SAR Insight & Consulting, where he focuses on the emerging strategies and shifting value chains of the machine economy. He holds a B.S.E.E. from the University of Kentucky, an M.S.S.E. from Virginia Tech, and an M.B.A. from Arizona State University. His career includes posts at IBM, Lockheed-Martin, Motorola, and Nokia.