Android hardware/software design using virtual prototypes - Part 3: Integrating Android's HAL

Editor's Note: In the final part of a Product How-To series of articles on virtual prototyping, Achim Nohl describes how to combine the Synopsys Virtualizer Development Kit (VDK) with Android's Hardware Abstraction Layer to help integrate the sensor further up the device software stack.
Android provides a hardware abstraction layer (HAL) that allows the Android application framework to communicate with the hardware specific device drivers. This well-defined layer eliminates the need to modify the complex application framework when Android is being ported to a new device - ideally only the HAL needs to be updated in order to support the new hardware.
The HAL provides services to the upper layer, the Android application framework. At the same time, it uses functions provided by the lower-layer Linux device drivers. The HAL defines the API for almost all prominent device functions, such as the camera, WiFi, radio, and sensors. In our case study, we will take a look at the sensor HAL.
A HAL implementation looks similar to a Linux device driver. A defined structure specifies the type of HAL, version ID, and a title string as well as a pointer to the HAL specific access methods. An important method polls new sensor data and is triggered by the Android application framework when an application needs sensor data. In our case, this method needs read access to the pseudo file system provided by the sensor driver. The driver will trigger the sensor control subsystem and deliver the data back to the HAL so it can be used by the Android application.
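The structure described above can be sketched as follows. This is a deliberately simplified stand-in for the real Android HAL types (which live in `hardware/libhardware`); the struct layout, tag value, and sysfs path are illustrative assumptions, not the actual Android definitions.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Simplified stand-in for an Android HAL sensor event. */
typedef struct sensor_event {
    int   sensor;   /* sensor handle */
    float data[3];  /* x/y/z values, e.g. from the accelerometer */
} sensor_event_t;

/* Simplified stand-in for the HAL module structure: type tag,
 * version ID, title string, and pointers to the access methods. */
typedef struct sensors_module {
    int         tag;      /* identifies this as a HAL module */
    int         version;  /* HAL version ID */
    const char *id;       /* module identifier, e.g. "sensors" */
    const char *name;     /* human-readable title string */
    /* Polls new sensor data; triggered by the application framework. */
    int (*poll)(sensor_event_t *ev);
} sensors_module_t;

/* The poll method reads the pseudo file exported by the kernel
 * sensor driver. The sysfs path is an assumption for illustration. */
static int poll_accelerometer(sensor_event_t *ev)
{
    FILE *f = fopen("/sys/class/sensors/accel/data", "r");
    if (!f)
        return -1;
    int n = fscanf(f, "%f %f %f", &ev->data[0], &ev->data[1], &ev->data[2]);
    fclose(f);
    ev->sensor = 0;
    return (n == 3) ? 0 : -1;
}

/* Android locates a HAL module through a well-known exported symbol. */
sensors_module_t HAL_MODULE_INFO_SYM = {
    .tag     = 0x48414c4d,  /* illustrative magic value */
    .version = 1,
    .id      = "sensors",
    .name    = "Example sensor HAL",
    .poll    = poll_accelerometer,
};
```

At runtime, the framework resolves the exported module symbol from the shared object and invokes `poll` through the function pointer, which is what makes the breakpoint problem discussed next interesting.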
The HAL module is compiled into a shared object that is dynamically loaded at runtime when needed. Unfortunately, this complicates debugging: the function addresses needed to set breakpoints are not known before the shared object is loaded. A VP addresses this problem easily through its runtime scripting capability. The script, which automatically loads the symbol file and relocates it to the correct address, does the following:
- Trigger a scripted callback procedure when the Android dynamic linker has finished loading a shared object, using a breakpoint/callback mechanism in the VP
- Use the callback to determine the address and object file name using the register and virtual memory query functions provided by the VP scripting framework
- Load the respective symbol file from the host and suspend the simulation so that the user can intercept further embedded software execution by setting a breakpoint in the shared object
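The core of the relocation step in the list above is simple address arithmetic: once the callback has queried the load base of the shared object from the dynamic linker, each breakpoint address is the load base plus the symbol's link-time offset. A minimal sketch, with a toy symbol table standing in for the ELF `.dynsym` data (the symbol names and offsets are made up for illustration):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Toy symbol table entry: symbol name and its link-time offset
 * inside the shared object (st_value in the ELF symbol table). */
typedef struct {
    const char *name;
    uint64_t    offset;
} sym_t;

/* Illustrative symbols of a sensor HAL shared object. */
static const sym_t hal_syms[] = {
    { "HMI",          0x2000 },  /* exported module info structure */
    { "sensors_poll", 0x1a30 },  /* poll entry point */
};

/* Relocate: breakpoint address = load base + link-time offset. */
uint64_t hal_symbol_addr(const char *name, uint64_t load_base)
{
    for (size_t i = 0; i < sizeof(hal_syms) / sizeof(hal_syms[0]); i++)
        if (strcmp(hal_syms[i].name, name) == 0)
            return load_base + hal_syms[i].offset;
    return 0; /* symbol not found */
}
```

The VP script performs exactly this computation, except that the load base comes from querying the target's registers and virtual memory rather than from a hard-coded table.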
The VDK comes with a whole set of such convenient functions. The Android sensor HAL shared object can now be easily debugged. Android provides some test applications for the HAL, and we are going to launch the sensor test via “/system/bin/test-nusensors”. Thus we can conduct side-by-side debugging of the user space HAL as well as the kernel sensor driver in stop mode, as shown in Figure 11.
Figure 11: Co-debugging the Android HAL and the Linux Sensor Driver
Jointly debugging user and kernel space is typically a big hassle. As soon as the kernel is suspended, the user space can no longer be debugged, because user space debugging is typically conducted in run mode. Here, a debug server application runs on the target OS. If the OS is suspended while debugging the kernel, the debug server application is halted as well and cannot supply the debugger with information. As a consequence, the debugger hangs or exits with a connection time-out. If you stick to run-mode debugging instead, your only option is to debug the driver via tracing or “printk” debug messages. In the case of the VP, the debug server is part of the VP debug infrastructure. This debug server has access to the kernel as well as the user space and can therefore allow side-by-side debugging of the Android HAL and the kernel drivers. It is even possible to debug the sensor firmware and the underlying hardware.
External stimuli coupling the accelerometer with the real world
When using the VP for higher level software development or end-to-end testing, it is important that we supply real world data to the sensor. While it is easy to poke single data values into the sensor subsystem using the VP scripting infrastructure, it is not easy to create data streams that reflect a real scenario such as shaking the device, or moving the device from the table to the ear of the user. However, applications do require such data as they are highly context sensitive. For example, a core function of an application or library could be to detect that a phone is picked up in order to automatically answer the call. Therefore, the algorithm may need to analyze data from the accelerometer sensor, orientation sensor, etc.
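The pick-up example above can be made concrete with a tiny detection heuristic: report a pick-up when the change in acceleration between two consecutive samples exceeds a threshold. This is a deliberately naive sketch (a real detector would filter noise and fuse several sensors), and the 2.0 m/s² threshold is an illustrative assumption, not a tuned value:

```c
#include <assert.h>
#include <stdbool.h>

/* Naive pick-up heuristic: trigger when the change in acceleration
 * between two samples exceeds 2.0 m/s^2 (compared in squared form
 * to avoid a square root). The threshold is illustrative only. */
bool picked_up(const float prev[3], const float cur[3])
{
    float d2 = 0.0f;
    for (int i = 0; i < 3; i++) {
        float diff = cur[i] - prev[i];
        d2 += diff * diff;
    }
    return d2 > 2.0f * 2.0f;
}
```

Driving such an algorithm requires realistic, continuous data streams, which is exactly what the Virtual IO coupling described next provides.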
Here, a VP can be extended with Virtual IO capabilities as shown in Figure 1. Data is gathered from a real physical device and forwarded to the VP through dedicated APIs for communication with external applications. In our case, we are using a Nintendo WiiMote controller to supply data to the sensor subsystem.
Figure 12: WiiMote Axis
An easy-to-use software library exists for the WiiMote that allows applications to extract the current sensor readings from the controller. Using this library, a small application translates the sensor data and forwards it into the sensor controller subsystem through the external connectivity APIs provided by the VP. With this method, real-world data can be supplied to the device under test.
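The translation step in the forwarding application amounts to converting the controller's raw accelerometer bytes into the units the sensor subsystem expects. A minimal sketch is shown below; the zero-g offset and counts-per-g scale would normally come from the controller's calibration data, and the default values used here are assumptions for illustration:

```c
#include <assert.h>

/* Convert a raw 8-bit WiiMote accelerometer reading to m/s^2.
 * zero_g is the raw reading at 0 g and counts_per_g the scale factor;
 * both come from the controller's calibration data (values here are
 * illustrative, not actual calibration constants). */
float wiimote_axis_to_ms2(unsigned char raw, unsigned char zero_g,
                          float counts_per_g)
{
    return ((float)raw - (float)zero_g) / counts_per_g * 9.81f;
}
```

The forwarding application would apply this conversion per axis (see Figure 12) before injecting the values into the sensor controller subsystem via the VP's external connectivity API.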
As shown in Figure 13, the standard SDK for Android can be used in order to upload and debug the application using the underlying sensor HAL, device driver, and Cortex-M3 based subsystem.
Figure 13: Using the SDK with the VP as a backend
The connection between the SDK and the VP is now established in run mode because the SDK connects (via Ethernet Virtual I/O) to the Android debug bridge (adb), which is an application on the Android file system. While debugging, the user can supply sensor data by simply operating the WiiMote controller.
Sensor debug and test scenarios - record and replay
A problem with external stimuli is the repeatability of scenarios. For example, a bug may only be triggered under circumstances that are very hard to reproduce, such as a specific combination and arrival sequence of different sensor data sets. Using a VP, the stimuli supplied from external sources such as the WiiMote can be recorded and replayed. While recording, a time-stamp is obtained for each data set. This time-stamp corresponds to the time simulated on the virtual device, not to wall-clock time. During replay, the data can be supplied at exactly the same points in time, as shown in Figure 14.
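The record-and-replay mechanism can be sketched as a list of (simulated-time, sample) pairs that are re-delivered whenever the simulation reaches the recorded time window. The structure below is a simplified illustration of the idea, not the VDK's actual API; the recorded values are made up:

```c
#include <assert.h>
#include <stddef.h>

/* One recorded stimulus: simulated time of the virtual device
 * (not wall-clock time) plus the sensor data set. */
typedef struct {
    double ts;       /* simulated time in seconds */
    float  data[3];  /* x/y/z accelerometer sample */
} sample_t;

/* Illustrative recording of a short WiiMote gesture. */
static const sample_t recorded[] = {
    { 0.010, { 0.0f, 0.0f, 9.81f } },
    { 0.020, { 0.5f, 0.0f, 9.80f } },
    { 0.035, { 2.1f, 0.3f, 9.10f } },
};

/* Replay: deliver every recorded sample whose timestamp falls inside
 * the simulated window (from, to]; returns the number delivered.
 * A NULL deliver callback just counts the matching samples. */
size_t replay_window(double from, double to,
                     void (*deliver)(const sample_t *))
{
    size_t count = 0;
    for (size_t i = 0; i < sizeof(recorded) / sizeof(recorded[0]); i++) {
        if (recorded[i].ts > from && recorded[i].ts <= to) {
            if (deliver)
                deliver(&recorded[i]);
            count++;
        }
    }
    return count;
}
```

Because the timestamps are keyed to simulated time, replaying the same recording always injects the same data at the same instruction-level points in the simulation, which is what makes the scenario fully deterministic.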
Figure 14: Scripted Scenario Example
This allows for a fully deterministic repetition of scenarios, which is not only useful for debugging, but also for the automation of complex tests where user input is typically hard to incorporate. Today’s highly context sensitive applications demand the ability to integrate stimuli from the external world (location, sensors, camera) into the use cases needed for testing.
When something goes wrong, the bird’s eye view provided by the VP is again very helpful as shown in Figure 15. Here, the Linux threads are traced along with the Android activities and the log messages coming from Android, the kernel, and even the sensor control subsystem if needed.
Figure 15: VP Analysis for Android
The era of serialized hardware and software development, in which the vast majority of software is developed and verified after the silicon design is complete, is giving way to new methods for early software development. These methods give embedded developers access to the low-level capabilities required to develop software earlier, achieve greater product quality, and meet time-to-market requirements for the constantly changing handset market.
Part 1: Why Virtualize
Part 2: Building Android’s multifunction sensor controller subsystem
Achim Nohl is a technical marketing manager for Virtual Prototypes at Synopsys. He publishes technical articles, conference papers, and blog posts about developing embedded software using virtual prototypes. His specialty is bringing up firmware, drivers, and operating systems for the most recent CPUs and IPs. Achim holds a degree in Electrical Engineering from the Institute for Integrated Signal Processing Systems at the Aachen University of Technology, Germany. Before joining Synopsys, Achim worked in various engineering and marketing roles for LISATek and CoWare.