Using sensors effectively in your Android-based embedded design

Jen Costillo, Rebelbot

February 10, 2015


The exponential growth of Android devices and sensor-based games in the marketplace offers an opportunity for embedded engineers to create great new consumer products with responsive user interactions and applications. To create the next generation of Android devices with the best user experiences, you need to analyze and understand the options in sensor subsystem development.

Evaluating your product path
Before diving into the creation of your next Android device, you must evaluate what your product is trying to accomplish with sensors. Simply put: are you trying to create a “state of the art” or a “state of the competition” product? A state of the competition product aims to keep pace with the current crop of products in the marketplace. As it stands today, that encompasses devices with an accelerometer, a gyroscope, a magnetometer, and perhaps a light and/or proximity sensor.

The key “state of the competition” questions are:

  • Is the reference design sufficient for your product goals?
  • Is your development cycle shorter than 1 year? 6 months?

On the other hand, a state of the art product is an innovation-driven product that is attempting to lead the pack, even carving out an entirely new niche. You can consider yourself in this product class when you answer “yes” to the following key questions:
  • Do you have new sensor types?
  • Are you focused on the feature set rather than the release date?
  • Are money and resources not a limiting factor?

Once you have determined that you are indeed designing with innovation in mind, remind yourself of the benefits of working with reference designs. Technical challenges can lengthen delivery time, but other obstacles tend to delay shipment as well. For example, customizing beyond the reference board introduces new issues in integration, testing, calibration methodology, and mechanical-versus-memory-footprint design choices, all of which delay product development.

Your Android Universe is summarized in Figure 1. Starting from the bottom up, you can design a sensor subsystem that excites developers and therefore users.


Figure 1: Android sensor subsystem breakdown (Bharadiya)

Innovation starts at customization. In hardware, there are three choices to be made: sensor selection, latency management, and data processing. These early decisions influence software architecture and development schedules. Therefore it is crucial to consider each decision to understand its impact on the product and timeline.

For example, you may need to balance the trade-offs of adding an external part versus implementing the feature in software. One affects BOM cost, while the other impacts development and testing costs. You may then decide between buying the software algorithm and developing one internally. This “buy versus make” decision will repeat throughout the design process.

Understanding the Android framework

Android supports the following physical sensors:
  • Accelerometer – Acceleration in m/s² along a single axis, often in a 3-axis package.
  • Gyroscope – Rate of rotation in radians/s around a single axis, often in a 3-axis package.
  • Light – Ambient light in lux, often used to adjust your screen’s brightness.
  • Magnetic Field – Magnetic field strength in microteslas (µT), often used to indicate orientation relative to the surface of the Earth.
  • Pressure – Atmospheric pressure in millibars (hPa). Values are used in weather and altimeter applications.
  • Proximity – Distance of an object from the device in centimeters. This sensor is most commonly implemented as a binary near/far sensor, often used during a phone call to sense how close your head is and avoid face dialing.
  • Temperature – Ambient temperature in degrees Celsius. This sensor type is deprecated because most devices lack a true ambient temperature sensor; the reported value usually comes from temperature sensors inside the sensor chips themselves and will often differ from the external ambient temperature (SensorEvent).
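The near/far simplification described for the proximity sensor can be sketched as a plain threshold check. This is a hypothetical helper, not part of the Android API; binary proximity hardware typically reports either 0 (near) or its maximum range (far):

```java
// Sketch: collapsing a raw proximity distance into the near/far binary
// reading most handsets expose. The max-range value is a hypothetical
// example; real hardware reports its own maximum range in cm.
public class ProximityClassifier {
    private final float maxRangeCm; // sensor's maximum range in centimeters

    public ProximityClassifier(float maxRangeCm) {
        this.maxRangeCm = maxRangeCm;
    }

    // Binary sensors typically report 0 for "near" and maxRangeCm for "far",
    // so anything below the maximum range counts as near.
    public boolean isNear(float distanceCm) {
        return distanceCm < maxRangeCm;
    }
}
```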

As you decide on your physical sensors, keep in mind Android’s limited sensor types, sampling rates, and power consumption. Even if you are using a new type of sensor, it is worthwhile to understand the Android sensor subsystem’s architecture.

Android defines four sampling rates: NORMAL, UI, GAME, and FASTEST. Depending upon which version of Android your project is using, the four rates can be scattered across 5 to 100 Hz. With the newer versions of Android (Ice Cream Sandwich and onward), expect these sampling rates to shift upward.
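These rates are requested through delay constants rather than frequencies, so it helps to convert between the two. A minimal sketch, assuming the commonly cited AOSP defaults (NORMAL is roughly 200 ms between samples, GAME roughly 20 ms); actual delivery rates vary by device and Android version:

```java
// Sketch: converting a sampling delay in microseconds (how Android's
// SENSOR_DELAY_* rates are defined internally) into an equivalent rate
// in Hz. The numbers used below are commonly cited AOSP defaults, not
// guarantees; devices may deliver events faster or slower.
public class SamplingRates {
    public static double delayToHz(int delayUs) {
        return 1_000_000.0 / delayUs;
    }
}
```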

When deciding what sampling rates to use, consider the trade-offs of under- and over-sampling as shown in Figure 2.



Figure 2: Sample rate tradeoffs

Over-sampling rewards you with extra samples and a smoother response, but you pay for that improved user experience in power and processing. Consider the amount of processing time your entire system has at each level before going this route. Under-sampling, on the other hand, may pay back in power savings but will likely be penalized by an inaccurate or seemingly sluggish response, leading to user frustration. Test this in your research phase to find the sweet spot before integrating into the full end-to-end system.
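When you do over-sample, the extra samples are usually smoothed before reaching the UI. A minimal sketch of one common approach, an exponential low-pass filter; the alpha constant here is an assumed tuning value you would choose per sensor and sample rate:

```java
// Sketch: a simple exponential low-pass filter, a common way to smooth
// over-sampled sensor readings (e.g., accelerometer jitter) before
// handing them to the UI. alpha in (0, 1]: smaller = smoother but laggier.
public class LowPassFilter {
    private final float alpha;
    private float value;
    private boolean primed = false;

    public LowPassFilter(float alpha) {
        this.alpha = alpha;
    }

    // Blend each new sample into the running value; the first sample
    // initializes the filter so it does not start from zero.
    public float filter(float sample) {
        if (!primed) {
            value = sample;
            primed = true;
        } else {
            value = value + alpha * (sample - value);
        }
        return value;
    }
}
```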

Once your set of sensors is selected, consider your sensor data processing strategy and how to channel the data to the higher Android layers. Earlier reference designs often relayed sensor data directly to the main processor. Today's more advanced reference designs often include a dedicated internal core for sensors. Some sensor vendors provide solutions on external microcontrollers, such as the MPU-6050 Sensor Board for the Atmel AVR UC3 by InvenSense (Product Catalog). Your main objectives are to evaluate latency, power consumption, and where heavy computations will be located. Finally, remember to factor in firmware update procedures, testing, and ease of manufacturing.

Evaluate your full hardware architecture using the sensor solution equation shown in Figure 3. Your sensor solution is constrained by the longest-latency sensor along the most complicated data path, plus the total power consumption of all your sensors and dedicated processors. If you are still missing a standout solution, consider tie-breaker criteria such as previous processor-family development experience, pre-existing work, cost, and footprint (memory or mechanical).



Figure 3: Basic equation to evaluate a sensor solution
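A minimal sketch of applying the Figure 3 evaluation when comparing candidate architectures. The units (ms and mW) are assumptions; what matters is measuring both terms the same way for every candidate so the comparison is fair:

```java
// Sketch of the Figure 3 evaluation: a candidate solution is bounded by
// the worst-case latency across all sensor data paths, plus the summed
// power draw of every sensor and dedicated processor in the design.
public class SensorSolution {
    // The slowest data path dominates perceived responsiveness.
    public static double worstLatencyMs(double[] pathLatenciesMs) {
        double worst = 0;
        for (double latency : pathLatenciesMs) {
            worst = Math.max(worst, latency);
        }
        return worst;
    }

    // Every sensor and dedicated processor contributes to the power budget.
    public static double totalPowerMw(double[] powerDrawsMw) {
        double total = 0;
        for (double power : powerDrawsMw) {
            total += power;
        }
        return total;
    }
}
```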
