Using sensors effectively in your Android-based embedded design

The exponential growth of Android devices and sensor-based games in the marketplace offers an opportunity for embedded engineers to create great new consumer products with responsive user interactions and applications. To create the next generation of Android devices with the best user experiences, you need to analyze and understand the options in sensor subsystem development.

Evaluating your product path
Before diving into the creation of your next Android device, you must evaluate what your product is trying to accomplish with sensors. Simply put: are you trying to create a “state of the art” or a “state of the competition” product? A state of the competition product aims to keep pace with the current crop of products in the marketplace. As it stands today, that encompasses devices with an accelerometer, a gyroscope, a magnetometer, and perhaps a light and/or proximity sensor.

The key “state of the competition” questions are:

  • Is the reference design sufficient for your product goals?
  • Is your development cycle shorter than 1 year? 6 months?

On the other hand, a state of the art product is an innovation-driven product that is attempting to lead the pack, even carving out an entirely new niche. You can consider yourself in this product class when you answer “yes” to the following key questions:

  • Do you have new sensor types?
  • Are you focused on the feature set rather than the release date?
  • Are money and resources not a limiting factor?

Once you have determined that you are indeed designing with innovation in mind, remind yourself of the benefits of working with reference designs. Technical challenges can lengthen delivery time, but other obstacles tend to delay shipment as well. For example, customizing beyond the reference board brings new issues with respect to integration, testing, calibration methodologies, and mechanical versus memory footprint design choices, all of which delay product development.

Your Android universe is summarized in Figure 1. Starting from the bottom up, you can design a sensor subsystem that excites developers and therefore users.

Figure 1: Android sensor subsystem breakdown (Bharadiya)

Innovation starts with customization. In hardware, there are three choices to be made: sensor selection, latency management, and data processing. These early decisions influence software architecture and development schedules, so it is crucial to understand how each one impacts the product and the timeline.

For example, you may need to balance the trade-offs between adding an external part and implementing the feature in software. Clearly, one affects BOM cost while the other impacts development and testing costs. You may then decide between buying the software algorithm and developing one internally. The “buy versus make” decision will repeat throughout the design process.

Understanding the Android framework

Android supports the following physical sensors:

  • Accelerometer – Acceleration in m/s² along a single axis, often in a 3-axis package.
  • Gyroscope – Rate of rotation in radians/s around a single axis, often in a 3-axis package.
  • Light – Ambient light in lux, often used to adjust the screen’s brightness.
  • Magnetic Field – Magnetic field in microteslas, often used to indicate orientation relative to the surface of the Earth.
  • Pressure – Atmospheric pressure in millibars. Values are used in weather and altimeter applications.
  • Proximity – Distance of an object from the device in centimeters. This sensor is most commonly implemented as a near/far binary sensor. This simplified interpretation is often used to sense how close your head is during a phone call, to avoid face dialing.
  • Temperature – Ambient temperature in degrees Celsius. This sensor is deprecated because most devices do not have a sensor that can measure true ambient temperature; instead, the reading usually comes from a temperature sensor inside one of the sensor chips, and those values often differ from the external ambient temperature (SensorEvent).
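A quick way to see which of these types your stack actually exposes is to enumerate the sensors from the application side. The following minimal sketch assumes a plain Activity; the class name is illustrative only.

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.util.Log;

import java.util.List;

public class SensorInventoryActivity extends Activity {
    private static final String TAG = "SensorInventory";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        SensorManager sm = (SensorManager) getSystemService(SENSOR_SERVICE);

        // Ask the framework for everything the HAL registered, physical and virtual.
        List<Sensor> sensors = sm.getSensorList(Sensor.TYPE_ALL);
        for (Sensor s : sensors) {
            Log.i(TAG, String.format("%s (%s): maxRange=%.2f, power=%.2f mA",
                    s.getName(), s.getVendor(), s.getMaximumRange(), s.getPower()));
        }
    }
}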

As you decide on your physical sensors, keep in mind Android’s limited sensor types, the available sampling rates, and their power consumption. Even if you are using a new type of sensor, it is worthwhile to understand the Android sensor subsystem architecture.

Android defines four sampling rates: NORMAL, UI, GAME, and FASTEST. Depending upon which version of Android your project is using, the four rates can be scattered across 5 to 100 Hz. With the newer versions of Android (Ice Cream Sandwich and onward), expect these sampling rates to shift upward.
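From the application side, those rates appear as the SENSOR_DELAY_* hints passed to registerListener(). The sketch below is only illustrative (the class and method names are mine); the framework treats the constant as a hint, and the actual delivery rate depends on the hardware and the Android version.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Minimal listener sketch: the four framework rate constants are hints to the
// HAL, not guaranteed sampling rates.
public class RateHintExample implements SensorEventListener {

    public void register(SensorManager sm) {
        Sensor accel = sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        if (accel == null) {
            return; // no accelerometer on this device
        }
        // Choose one of: SENSOR_DELAY_NORMAL, SENSOR_DELAY_UI,
        // SENSOR_DELAY_GAME, SENSOR_DELAY_FASTEST.
        sm.registerListener(this, accel, SensorManager.SENSOR_DELAY_GAME);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        // Feed samples into your processing pipeline here.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // React to calibration or accuracy changes if needed.
    }
}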

When deciding what sampling rates to use, consider the trade-offs of under- and over-sampling as shown in Figure 2.

Figure 2: Sample rate tradeoffs

Over-sampling rewards you with extra samples and therefore a smoother response. To make this improved user experience work, you have to pay in power and processing. Consider the amount of processing time your entire system has at each level before going this route. On the other hand, under-sampling may pay back in power savings but will probably be penalized with an inaccurate or seemingly sluggish response, leading to user frustration. Test this in your research phase to find the sweet spot before integrating into the full end-to-end system.

Once your set of sensors is selected, consider your sensor data processing strategy and how to channel the data to the higher Android layers. Earlier reference designs often relayed sensor data directly to the main processor. Today's more advanced reference designs often include a dedicated internal core for sensors. Some sensor vendors provide solutions on external microcontrollers, such as the MPU-6050 Sensor Board for the Atmel AVR UC3 by InvenSense (Product Catalog). Your main objectives are to evaluate latency and power consumption, as well as where the heavy computation will be located. Finally, remember to factor in firmware update procedures, testing, and ease of manufacturing.

Evaluate your full hardware architecture using the sensor solution equation shown in Figure 3. Your sensor solution is constrained by the longest-latency sensor along the most complicated data path, plus the total power consumption of all your sensors and dedicated processors. If you are still missing a standout solution, consider using tie-breaker criteria such as previous processor family development experience, pre-existing work, cost, and footprint (memory or mechanical).

Figure 3: Basic equation to evaluate a sensor solution
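Figure 3 itself is not reproduced in this text, but one way to read the constraint described above, using symbols of my own choosing, is:

\[
t_{\text{worst}} \;=\; \max_i \bigl( t_{\text{sensor},i} + t_{\text{path},i} \bigr),
\qquad
P_{\text{total}} \;=\; \sum_j P_{\text{sensor},j} \;+\; \sum_k P_{\text{processor},k}
\]

where t_sensor,i is the latency of sensor i, t_path,i is the additional latency of its data path, and P_total is the combined power draw of the sensors and any dedicated processors. A candidate architecture is judged against both terms together.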


Integrating your driver into the Android kernel
One of the most mundane tasks in your design and development process is including the kernel drivers you may need: either a peripheral driver such as I2C or SPI, or an inter-processor interface driver. The latter is often included in your reference design, which can save you several weeks of development. However, if you selected an external dedicated processor, you will need to spend time supporting the new interface.

Depending on your hardware architecture, you may need to alter the sensor hardware abstraction layer (HAL) and/or libraries section. Studying the sample code in /device/samsung/crespo/libsensors reveals the following:

  • For each sensor within the system, a new sensor class instance must be created and listed within the sensor list.
  • Each sensor requires its own implementation of readEvents(), hasPendingEvents(), setDelay(), and enable().

In addition, the HAL is a popular place to implement sensor fusion and virtual sensors, but it is not the only location.

Sensor fusion

Sensor fusion tends to be overused and misunderstood. The term refers to any instance where two or more disparate sensors are combined to produce a result that is more accurate, robust, or dependable than either could achieve individually (sensor fusion). For example, the algorithm that combines accelerometers, gyroscopes, and magnetometers to derive device orientation and movement is a sensor fusion component.

Sensor fusion is commonly used in Android for gesture detection and even calibration. Interestingly, sensor fusion can be introduced at any location within your data path. Again, you will need to evaluate whether to create your own algorithms, buy them from another vendor, or use the ones in the reference design. Next, you’ll need to determine where to execute these algorithms.
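As an application-level illustration, the simplest fusion most Android developers meet is the framework helper that turns an accelerometer sample plus a magnetometer sample into an orientation. The sketch below assumes the two float arrays already hold the latest SensorEvent.values; the class name is mine.

import android.hardware.SensorManager;

// Minimal fusion sketch: combine the latest accelerometer and magnetometer
// samples into a device orientation using the framework's helper methods.
// 'gravity' and 'geomagnetic' are assumed to hold the most recent
// SensorEvent.values from TYPE_ACCELEROMETER and TYPE_MAGNETIC_FIELD.
public final class OrientationFusion {

    public static float[] computeOrientation(float[] gravity, float[] geomagnetic) {
        float[] rotationMatrix = new float[9];
        float[] inclinationMatrix = new float[9];
        float[] orientation = new float[3]; // azimuth, pitch, roll in radians

        boolean ok = SensorManager.getRotationMatrix(
                rotationMatrix, inclinationMatrix, gravity, geomagnetic);
        if (!ok) {
            return null; // device is in free fall or the samples are unusable
        }
        SensorManager.getOrientation(rotationMatrix, orientation);
        return orientation;
    }
}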

Understanding virtual sensors

For most Android application developers, the first experience with sensor fusion starts with virtual sensors. One place where virtual sensors can be found is in the sensor service or daemon. The reference Android code located in /frameworks/base/services/sensorservice provides a good example of how the service and virtual sensors are created in Android. This is an important reference for creating your own implementations, whether they live inside this service or at the sensor HAL layer.

A quick run-through of Google’s built-in virtual sensors is presented in Figure 4. Notice that gravity indicates which direction gravity is pulling on the device, while linear acceleration denotes only actual device acceleration with gravity removed. While other implementations of orientation may include a compass, Google’s early version of these virtual sensors utilizes only the accelerometer and gyroscope. Similarly, the compass and accelerometer yield rotation. Newer versions may leverage more sensors or include additional virtual sensors.

Figure 4: Virtual sensor combinations
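From an application’s point of view, the virtual sensors in Figure 4 are indistinguishable from physical ones; you subscribe to them through the same SensorManager calls. A minimal sketch (the class name is mine):

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: subscribing to three of the stock virtual sensors. Each one is
// synthesized by the framework (or HAL) from the physical sensors, so the
// application code looks identical to using a physical sensor.
public class VirtualSensorExample implements SensorEventListener {

    public void register(SensorManager sm) {
        int[] virtualTypes = {
                Sensor.TYPE_GRAVITY,             // gravity component only
                Sensor.TYPE_LINEAR_ACCELERATION, // device acceleration minus gravity
                Sensor.TYPE_ROTATION_VECTOR      // device orientation as a rotation vector
        };
        for (int type : virtualTypes) {
            Sensor s = sm.getDefaultSensor(type);
            if (s != null) {
                sm.registerListener(this, s, SensorManager.SENSOR_DELAY_UI);
            }
        }
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // event.sensor.getType() tells you which virtual sensor produced the sample.
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }
}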

Despite these basic combinations, there are some subtle challenges with virtual sensors. The most important one is “garbage in, garbage out”: your compound sensor is more likely to compound errors than to cancel them out (Lim, 2012). You may also introduce latency issues. Moreover, some virtual sensors utilize other virtual sensors, and there are no system requirements detailing how to synchronize the disparate samples.
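One pragmatic, if crude, mitigation for the synchronization gap is to track the timestamps of the inputs you intend to fuse and skip a fusion step when they drift too far apart. The sketch below is only illustrative; the 50 ms threshold is an arbitrary assumption, not a framework requirement.

// Sketch of one pragmatic mitigation for unsynchronized inputs: track the
// timestamp of each source and skip fusion when the samples have drifted
// too far apart. The 50 ms threshold is an arbitrary example, not a
// framework requirement; SensorEvent.timestamp is reported in nanoseconds.
public final class SampleSkewGuard {
    private static final long MAX_SKEW_NS = 50_000_000L; // 50 ms, illustrative only

    private long accelTimestampNs;
    private long magTimestampNs;

    public void onAccelSample(long timestampNs) { accelTimestampNs = timestampNs; }
    public void onMagSample(long timestampNs)   { magTimestampNs = timestampNs; }

    /** Returns true when both inputs are close enough in time to fuse. */
    public boolean okToFuse() {
        return Math.abs(accelTimestampNs - magTimestampNs) < MAX_SKEW_NS;
    }
}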

Reducing sensor errors
Much has been written already about the Android sensor APIs at the application level (SensorManager). As a device maker, it is more important to understand the issues faced by the consumers of your data and how to limit their impact. The big issues are typically related to poor calibration or environmental interference. Table 1 shows common errors and development solutions.
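As one concrete example of limiting the impact of noisy data, a first-order low-pass filter is a common, inexpensive smoothing step for accelerometer or magnetometer readings. The class and the alpha value below are illustrative assumptions, not a prescription from Table 1; tune the smoothing factor against your own sensor and sampling rate.

// A classic first-order (exponential) low-pass filter, one common mitigation
// for noisy accelerometer or magnetometer readings. The smoothing factor
// alpha is an illustrative value; smaller alpha means smoother but laggier.
public final class LowPassFilter {
    private final float alpha;
    private final float[] filtered = new float[3];
    private boolean initialized;

    public LowPassFilter(float alpha) {
        this.alpha = alpha; // e.g. 0.1f, chosen here only for illustration
    }

    public float[] apply(float[] input) {
        if (!initialized) {
            System.arraycopy(input, 0, filtered, 0, 3);
            initialized = true;
        } else {
            for (int i = 0; i < 3; i++) {
                filtered[i] += alpha * (input[i] - filtered[i]);
            }
        }
        return filtered;
    }
}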

Conclusion
In a sea of similar Android devices, adding a unique new sensor to your device often rewards you with an enhanced user experience and improved battery life. Although creating a “state of the art” product can seem like straying off the beaten path, understanding the development tradeoffs and making calculated, intelligent design choices can turn this engineering challenge into a satisfying, successful product for both developer and end user.

References
Bharadiya, P. (n.d.). Android Sensor Porting Guide. Texas Instruments Embedded Processors Wiki.

Lim, J. (2012). “Design Considerations for Motion Interface Applications.” Motion Developers Conference (p. 23). San Francisco: InvenSense.

InvenSense Product Catalog.

Sensor Fusion, Wikipedia.

SensorEvent, Android Developers Reference.

SensorManager, Android Developers Reference.

A veteran firmware engineer at Rebelbot, Jen Costillo has been working in consumer embedded systems for over 15 years and specializes in products involving sensors. She has worked on network equipment, video set-top boxes, and biometric readers, as well as several human interface devices ranging from air mice to multi-touch devices and human-gesture musical instruments. Her educational background includes a BS in Computer Engineering from the University of Illinois at Urbana-Champaign and an MBA from San Jose State University. Her ongoing projects include human gesture recognition to generate music in real time and electronic wearables. This paper was presented as part of a class she taught at the Embedded Systems Conference, Android Sensors: Top to Bottom (ESC-305).
