In the world of mobile platforms and wireless sensor-based embedded designs, new techniques for sensor fusion will be needed to collect data on the location and conditions of everything from smartphones to wearable consumer Internet of Things devices.
A sensor fusion algorithm for smartphones that combines GPS data with angular-velocity information from the phone's gyroscope and vehicle speed read from the automobile's CAN bus over a wireless (CAN-BT) adapter.
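As a rough illustration of that idea, the sketch below dead-reckons position from the gyroscope's yaw rate and the CAN-bus speed, then nudges the estimate toward each GPS fix. The class, gains, and units are illustrative assumptions, not the filter from the original work.

```python
import math

# Hypothetical sketch: heading is integrated from the phone's gyroscope yaw
# rate, position is dead-reckoned from the CAN-bus wheel speed, and each GPS
# fix pulls the estimate back toward an absolute position (a simple
# complementary blend, not the article's actual algorithm).

class DeadReckoningFuser:
    def __init__(self, x=0.0, y=0.0, heading=0.0, gps_gain=0.2):
        self.x, self.y = x, y          # position in metres (local frame)
        self.heading = heading         # radians
        self.gps_gain = gps_gain       # how strongly a GPS fix corrects position

    def predict(self, yaw_rate, speed, dt):
        """Propagate with gyro yaw rate (rad/s) and CAN-bus speed (m/s)."""
        self.heading += yaw_rate * dt
        self.x += speed * math.cos(self.heading) * dt
        self.y += speed * math.sin(self.heading) * dt

    def correct_gps(self, gps_x, gps_y):
        """Blend the dead-reckoned position toward a GPS fix."""
        self.x += self.gps_gain * (gps_x - self.x)
        self.y += self.gps_gain * (gps_y - self.y)

# Example: drive straight at 10 m/s for 1 s, then take a GPS fix.
fuser = DeadReckoningFuser()
for _ in range(10):
    fuser.predict(yaw_rate=0.0, speed=10.0, dt=0.1)
fuser.correct_gps(9.5, 0.3)
print(round(fuser.x, 2), round(fuser.y, 2))
```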
The objective of sensor fusion is to determine that the data from two or more sensors correspond to the same phenomenon. Before the Internet, the association between two or more sensors would only occur within a single well-defined system. With the Internet, we can now access the signals of several sensors on an ad hoc basis. This capability is useless, however, without some means to determine the interdependencies of the sensors.
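One simple way to test whether two ad hoc sensor streams describe the same phenomenon is to check how strongly their signals correlate. The sketch below does exactly that; the normalized-correlation test and the 0.8 threshold are illustrative assumptions rather than a prescribed method.

```python
import numpy as np

# Minimal sketch of the association problem: given two ad hoc sensor streams,
# decide whether they are observing the same phenomenon by checking how
# strongly their signals correlate at zero lag.

def same_phenomenon(signal_a, signal_b, threshold=0.8):
    a = (signal_a - np.mean(signal_a)) / np.std(signal_a)
    b = (signal_b - np.mean(signal_b)) / np.std(signal_b)
    correlation = float(np.mean(a * b))   # normalized cross-correlation at zero lag
    return correlation > threshold, correlation

# Two sensors watching the same 5 Hz vibration, one of them noisier.
t = np.linspace(0, 1, 500)
sensor_1 = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(500)
sensor_2 = np.sin(2 * np.pi * 5 * t) + 0.3 * np.random.randn(500)
print(same_phenomenon(sensor_1, sensor_2))
```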
An exploration of the limits of wireless sensor network coverage, based on data fusion models that combine noisy measurements from multiple sensors, and of the scaling laws that govern the relationship between coverage, network density, and signal-to-noise ratio (SNR).
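Under a simple value-fusion assumption, where N sensors report independent noisy measurements of the same target and the fusion node averages them, the post-fusion SNR grows roughly linearly with the number of sensors. The toy simulation below illustrates that trend; the exact scaling laws in the cited work depend on its specific fusion model.

```python
import numpy as np

# Toy illustration of the density/SNR trade-off: averaging N independent
# noisy measurements of the same signal (value fusion) reduces the noise
# variance by a factor of N, so the fused SNR grows roughly linearly with N.

def fused_snr(signal_power, noise_power, n_sensors, trials=10000):
    signal = np.sqrt(signal_power)
    noise = np.sqrt(noise_power) * np.random.randn(trials, n_sensors)
    fused = signal + noise.mean(axis=1)        # average the N noisy measurements
    return signal_power / np.var(fused - signal)

for n in (1, 4, 16):
    print(n, round(fused_snr(signal_power=1.0, noise_power=4.0, n_sensors=n), 2))
```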
A technique for improving Android smartphone orientation estimation through the use of a DNRF (Drift & Noise Removal Filter) that fuses sensor information from the gyroscope, magnetometer and accelerometer to minimize the drift and noise in output orientation.
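A generic complementary filter gives a feel for how such a fusion filter suppresses drift and noise: the gyroscope supplies a smooth but drifting estimate, the accelerometer a noisy but drift-free one, and the filter blends the two. The sketch below is that generic filter for pitch only, not the DNRF itself.

```python
import math

# Rough sketch of gyro/accelerometer fusion for orientation: the gyroscope
# gives a smooth but drifting pitch estimate, the accelerometer a noisy but
# absolute one, and a complementary filter blends them.

class ComplementaryPitchFilter:
    def __init__(self, alpha=0.98):
        self.alpha = alpha     # weight given to the integrated gyro estimate
        self.pitch = 0.0       # radians

    def update(self, gyro_pitch_rate, accel_x, accel_y, accel_z, dt):
        gyro_pitch = self.pitch + gyro_pitch_rate * dt               # drifts over time
        accel_pitch = math.atan2(-accel_x,
                                 math.sqrt(accel_y**2 + accel_z**2))  # noisy but absolute
        self.pitch = self.alpha * gyro_pitch + (1 - self.alpha) * accel_pitch
        return self.pitch

f = ComplementaryPitchFilter()
print(f.update(gyro_pitch_rate=0.01, accel_x=0.0, accel_y=0.0, accel_z=9.81, dt=0.02))
```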
The design of a wearable, smartphone-based camera that fuses data from wireless sensor networks, GPS, Bluetooth, and other modalities to provide accurate positioning.
A two-level sensor fusion-based event detection technique for wireless sensor networks that uses a fusion algorithm to gather information and reach a consensus among individual detection decisions made by sensor nodes.
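The sketch below illustrates that two-level structure in miniature: each node thresholds its own reading (level one), and a fusion node takes a majority vote over those binary decisions (level two). The readings and threshold are made up for illustration.

```python
# Toy sketch of a two-level detection scheme: level one, each node compares
# its own reading to a local threshold; level two, a fusion node reaches a
# consensus by majority vote over those binary decisions.

def local_decision(reading, threshold=50.0):
    """Level 1: each sensor node decides 'event' or 'no event' on its own."""
    return reading > threshold

def consensus(decisions):
    """Level 2: the fusion node declares an event if most nodes agree."""
    return sum(decisions) > len(decisions) / 2

readings = [62.0, 47.5, 55.3, 71.8, 42.0]          # e.g. temperature samples
node_votes = [local_decision(r) for r in readings]
print(consensus(node_votes))                        # True: 3 of 5 nodes voted 'event'
```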
The design of a gesture recognition system that combines dynamic hand and arm gesture recognition based on a depth sensor with static hand gesture recognition based on an HD color camera.