Designers of automotive computers understand that drivers can only look at a dashboard display for a few seconds at a time. Since the operator of the automotive computer is also likely to be driving, automotive computers raise safety issues new to the computing world.
When creating graphical user interfaces (GUIs) for embedded on-board systems, you must consider a number of evolving design issues. For the driver to be able to glance quickly at the screen, displays must be unambiguous, obvious, and limited in number. The framework of control layers should be simple so that drivers don't get lost in the structure.
Line drawings and simple window outlines frequently used in embedded displays won't be enough to give your GUI a unique, customized look and feel.
First-tier automotive suppliers require the ability to use professionally produced, custom graphics and distinctive fonts in order to help distinguish their product brands and create a polished look and feel. Display-enhanced embedded technology lets you write application behavior once and adapt the application's appearance to brand-oriented themes, much like the "skins" concept currently implemented in today's MP3 music players.
Also critical in automotive interface design is the ability to use quick and responsive modes of user input, such as touch screen and voice technology.
In order to create functional, marketable embedded automotive applications with GUIs that use dashboard displays, you must be able to successfully address all of these issues.
Until recently, no existing object-oriented Java GUI framework adequately addressed these embedded requirements. However, a GUI framework now exists specifically to provide developers with a classically object-oriented way to create graphical interfaces for embedded Java applications.
Structure of an OO GUI Framework
One example of a basic building block for any GUI system is the Application class in IBM's VisualAge Micro Edition MicroView framework. The Application class creates and lays out screens consisting of any number of View/Controller pairs that render application data on the display device. The data to be presented through a View is defined in a Model class, which keeps the application itself relatively independent of the presentation of the data. Data presented to or received from the user populate these Models, which control the state of Views.
Figure 1: The sequence of events from the time an object is placed on the screen and seen by the user to the user's response and the resulting screen update.
The ApplicationManager class coordinates the user's navigation of various modular, developer-defined Applications, which are kept in its registry. ApplicationManager switches Applications on and off the screen as needed, and coordinates the creation of Views for all the Applications in the device.
MicroView Applications are always associated with a View that is an instance of MicroView's ContainerView class; this View in turn contains other Views that make up the visual layout of the Application. The ApplicationManager also contains a ScreenApplication, which is the root object from which the rest of the Applications extend.
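The registry-and-switching behavior just described can be sketched with a self-contained miniature. The class and method names below are illustrative stand-ins for the pattern, not the actual MicroView API.

```java
// Self-contained sketch of the ApplicationManager idea: a registry of
// named Applications, only one of which is on screen at a time. All
// names here are stand-ins, not the real MicroView classes.
import java.util.HashMap;
import java.util.Map;

public class ManagerSketch {
    static class Application {
        final String name;
        boolean onScreen = false;
        Application(String name) { this.name = name; }
        void show() { onScreen = true; }
        void hide() { onScreen = false; }
    }

    static class ApplicationManager {
        private final Map<String, Application> registry = new HashMap<>();
        private Application current;

        void register(Application app) { registry.put(app.name, app); }

        // Switch the named Application onto the screen, hiding the old one.
        void activate(String name) {
            if (current != null) current.hide();
            current = registry.get(name);
            if (current != null) current.show();
        }

        Application current() { return current; }
    }

    public static void main(String[] args) {
        ApplicationManager mgr = new ApplicationManager();
        mgr.register(new Application("audio"));
        mgr.register(new Application("navigation"));
        mgr.activate("audio");
        mgr.activate("navigation");   // audio is hidden, navigation shown
        System.out.println(mgr.current().name);
    }
}
```

The key design point is that Applications never show or hide themselves; the manager owns that transition, so at most one Application occupies the screen.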
In order to build a GUI system, you must first define the necessary Application classes, as well as all required Model, View, and Controller objects. Model, View, and Controller objects are defined by creating new instances of default MicroView framework classes.
For complex GUI systems, you can also create customized Model, View, and Controller classes. Default Views include ButtonView, LabelView, ListView, and ParagraphView, among others. Default Controllers include ButtonController, KeypadController, and MenuController.
In addition to creating View/Controller pairs and positioning them on the device's screen, each Application also implements the necessary Listener interfaces to respond to user input events that are sent to those Controller classes with actions (i.e., actions that change the states of Model and View classes). These actions are then passed as messages to the Application.
Figure 2: An example of a customized view and layout of an automotive audio and Internet system.
A MicroView application, like most common GUI systems, uses an event-based mechanism to guide interaction between the user and the application. The MicroView event system is fundamentally similar to the delegation-based event model exposed in the AWT library as part of the standard Java Development Kit (JDK™).
In the MicroView event-handling model, instances of Application classes implement listener interfaces for each event type they need to handle. When an event that has a registered listener comes in from the underlying input system, the event is routed to that class for processing.
In the standard AWT delegation-based event model, an event is propagated from a source object to a listener for processing in response to some user interaction. In the MicroView framework (as in the standard AWT delegation event model), an event source is typically a UI component, and the listener an Application object that implements the appropriate listener interface based on the specific needs of the application.
In the MicroView implementation of the Model/View/Controller (MVC) paradigm, the Controller acts as intermediary between the Application and the underlying event subsystem; thus, event types in the MicroView framework are either direct or indirect descendants of the com.ibm.ive.degas.ControllerEvent class.
Each event is created at its source and routed by the input component subsystem to the Application object that plays the part of the listener; the event instance delivered corresponds to the specific type of user interaction requested.
By implementing the method or methods specified in the corresponding listener interface, the listener object (in this case, the Application object) responds to the event in some meaningful way. At this point, the input component is signaled by a return value as to whether or not this particular listener object consumed the event, and the input component continues to respond to user interaction, repeating this cycle for each event that is considered interesting.
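This dispatch cycle, including the consumed-event return value, can be sketched in a few self-contained lines. The class and interface names below are illustrative stand-ins for the pattern, not the real MicroView API.

```java
// Sketch of the dispatch cycle described above: an input component
// offers each event to its registered listener and checks the boolean
// return value to see whether the event was consumed. Names here are
// stand-ins, not the actual MicroView classes.
import java.util.EventObject;

public class DispatchSketch {
    static class ButtonEvent extends EventObject {
        ButtonEvent(Object source) { super(source); }
    }

    interface ButtonListener {
        // true means "this listener consumed the event"
        boolean handleEvent(ButtonEvent e);
    }

    // Stand-in for the input component: routes events to one listener.
    static class InputComponent {
        private ButtonListener listener;
        void setListener(ButtonListener l) { listener = l; }
        boolean dispatch(ButtonEvent e) {
            // No listener registered means the event goes unconsumed.
            return listener != null && listener.handleEvent(e);
        }
    }

    public static void main(String[] args) {
        InputComponent input = new InputComponent();
        input.setListener(e -> {
            System.out.println("button pressed");
            return true;   // signal that the event was consumed
        });
        boolean consumed = input.dispatch(new ButtonEvent(input));
        System.out.println("consumed = " + consumed);
    }
}
```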
Again, the base class for event objects in the MicroView framework is the com.ibm.ive.degas.ControllerEvent class. This class is a direct descendant of java.util.EventObject, which is the base event object in the standard AWT delegation event model. The MicroView framework provides the following four basic event types:
- ButtonEvents—sent when the user clicks or taps on a ButtonView user interface component
- ListEvents—sent when the user taps a selection in a ListView component
- MenuEvents—sent when the user makes a selection from a menu
- KeypadEvents—sent when the user makes a selection from a custom view that emulates a keyboard
MicroView does not contain specific View objects that represent menus or keypads. It does, however, provide custom Controller objects that can be used with custom View objects to mimic the appearance and/or functionality of these objects.
The MicroView event listener framework corresponds loosely to the standard event hierarchy defined above. The following listeners exist for each type of standard event:
- ButtonListener, which defines a single method,
public boolean handleEvent(ButtonEvent e)
- ListListener, which defines a single method,
public boolean handleEvent(ListEvent e)
- MenuListener, which defines
public void handleEvent(MenuEvent e)
- KeypadListener, which defines the method
public void handleEvent(KeypadEvent e)
Like AWT, the MicroView framework makes a distinction between low-level and high-level (often referred to as "semantic") events. A low-level event is characterized as an input or other window system event, whereas a semantic (i.e., high-level) event is usually the result of some semantic of the component model itself; in other words, component-to-component messages that do not require user intervention.
While the MicroView framework contains low-level events, their functionality is almost fully encapsulated in the low-level event handling framework code and is not exposed to the developer at an API level. Common MicroView events used by the application developer are semantic in nature.
Like AWT, event delivery in the MicroView framework is synchronous; events are delivered to components by the input system in the order in which they are received.
Although MicroView defines a basic set of events and listeners, you are free to implement your own custom event types and listeners as part of the semantic interface of the application.
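A custom event and listener would follow the same delegation pattern as the built-in types. The sketch below is self-contained and illustrative: the base ControllerEvent class here is a stand-in for the real com.ibm.ive.degas.ControllerEvent, and GearEvent/GearListener are hypothetical names, not part of MicroView.

```java
// Sketch of extending the framework with a custom semantic event and a
// matching listener interface. ControllerEvent is a stand-in for the
// framework's base event class; GearEvent and GearListener are
// hypothetical examples.
import java.util.EventObject;

public class CustomEventSketch {
    // Stand-in for the framework's base event class.
    static class ControllerEvent extends EventObject {
        ControllerEvent(Object source) { super(source); }
    }

    // A custom semantic event: the vehicle changed gear.
    static class GearEvent extends ControllerEvent {
        final String gear;
        GearEvent(Object source, String gear) {
            super(source);
            this.gear = gear;
        }
    }

    // Custom listener interface mirroring the built-in ones.
    interface GearListener {
        boolean handleEvent(GearEvent e);
    }

    public static void main(String[] args) {
        GearListener listener = e -> {
            System.out.println("gear = " + e.gear);
            return true;   // consumed
        };
        listener.handleEvent(new GearEvent("transmission", "R"));
    }
}
```

Because the custom listener mirrors the built-in handleEvent signatures, application code that already dispatches built-in events can treat the new type uniformly.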
Another difference between MicroView and AWT is that the MicroView developer is safe in assuming that all event listener dispatching will take place on the same thread. However, this is due to the implementation of the underlying input subsystem and is not strictly a design function of the event system.
Addressing Automotive-Specific Issues

Instead of restricting yourself to creating code-drawn GUIs, you can use MicroView to create Views using your choice of bundled bitmap images and default or custom fonts.
Existing bitmap artwork can be imported from professional graphics software such as Adobe Photoshop to create icons, backgrounds, operating widgets (e.g., pushbuttons), and the like.
A subclass of a default EGBitmapBundle class serves as the representation for the bitmaps. You then convert the subclasses to a ROM resource format, give them a physical file name definition, and associate them with an ID that can be accessed from the application. To use either default or custom fonts in a display, follow a similar procedure, using an instance of the EGBitmapFontsBundle class and specifying various sizes and styles for bundled fonts.
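The essential idea, bundled artwork referenced by an ID the application can look up, can be sketched as follows. This is a hedged illustration of the pattern only; the real MicroView bundle classes and their ROM-conversion tooling differ in detail.

```java
// Sketch of the bundle idea: each bitmap gets a physical file name
// definition and an ID, and the application retrieves artwork by ID.
// BitmapBundle is an illustrative stand-in, not the real EGBitmapBundle.
import java.util.HashMap;
import java.util.Map;

public class BundleSketch {
    static class BitmapBundle {
        // Maps a numeric resource ID to the bitmap's file definition.
        private final Map<Integer, String> resources = new HashMap<>();

        void define(int id, String fileName) { resources.put(id, fileName); }

        String lookup(int id) {
            String file = resources.get(id);
            if (file == null) {
                throw new IllegalArgumentException("unknown resource ID " + id);
            }
            return file;
        }
    }

    // IDs the application uses to refer to bundled artwork.
    static final int ICON_VOLUME = 1;
    static final int BACKGROUND = 2;

    public static void main(String[] args) {
        BitmapBundle bundle = new BitmapBundle();
        bundle.define(ICON_VOLUME, "volume.bmp");
        bundle.define(BACKGROUND, "dash_background.bmp");
        System.out.println(bundle.lookup(ICON_VOLUME));
    }
}
```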
This flexibility facilitates the design of highly refined, brand-focused, distinctive displays. The ability to create artwork using professional graphics software also allows for more rapid development and minimizes the required amount of Java code in the application.
Figure 3: Example of a GUI that uses MicroView to display bundled bitmap images created in graphics programs like Adobe Photoshop, instead of using code-drawn images.
Events utilize various modes of user input, including touch screen and voice data. Several events can be linked to the same View/Controller pair when defining Application behavior. Receiving event notification from the Views contained within an Application simply requires that you implement the appropriate listener interface. Event handling occurs, for example, when the data in a View changes, causing the Model to change, or when the user touches a screen area.
For example, if an Application contains ButtonViews and the Application class needs to receive event notifications from these views, the Application should implement the ButtonListener interface. Similar listener interfaces utilizing voice and other input modes can also be used (e.g., ButtonListener, ListListener, KeypadListener, MenuListener, etc.).
There are two approaches to controlling a user interface with speech. The first is to use voice data to control the screen interface itself. For example, voice input is used to switch from one field to another, activate a button, activate a menu, choose an item in a list, and so on.
Due to the particular difficulties involved in simultaneously operating a vehicle and viewing and manipulating a touch screen, you may find it useful to think of designing an interface for visually challenged or sightless users. For sight-impaired users as well as for dashboard automotive interfaces, voice interaction and speech recognition are key aspects of GUI development.
Several European countries are now considering laws requiring that all newly developed devices provide accessibility for users with disabilities, including visual impairments. Speech and voice data have the potential to become even more essential and widespread in the future than they are today.
A second approach to controlling a user interface with speech may be preferable in automotive and other GUI applications. In this approach, two user interfaces are created: a voice interface and a graphical touch panel. The user interacts directly with the application using voice recognition and feedback, but when voice input is used, the screen does not react in the same manner as it does when you are using touch data.
For example, to look up an address, an application might use speech to prompt the user for each field of data (e.g., city, state, etc.). Several back-and-forth application interactions fill out a "form" with the information the user provides via voice input. Audible feedback is provided for each response, and the user doesn't have to look at the screen to confirm the application's understanding of the speech input. The voice interaction changes the Model (i.e., internal data), and the View automatically reflects the changed data.
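The voice form-filling flow just described can be sketched with a self-contained miniature: each recognized answer updates the Model, and the attached View re-renders automatically. All class names are illustrative stand-ins, not the MicroView API.

```java
// Sketch of voice form filling: spoken answers update the Model, and
// the View reflects the changed data without any touch interaction.
// Names here are stand-ins, not real MicroView classes.
import java.util.LinkedHashMap;
import java.util.Map;

public class VoiceFormSketch {
    // Model: the address "form" being filled in.
    static class AddressModel {
        final Map<String, String> fields = new LinkedHashMap<>();
        private AddressView view;
        void attach(AddressView v) { view = v; }
        void setField(String name, String value) {
            fields.put(name, value);
            if (view != null) view.render(this);   // view tracks the model
        }
    }

    // View: renders whatever the model currently holds.
    static class AddressView {
        String lastRendered = "";
        void render(AddressModel m) { lastRendered = m.fields.toString(); }
    }

    public static void main(String[] args) {
        AddressModel model = new AddressModel();
        AddressView view = new AddressView();
        model.attach(view);

        // Each recognized utterance fills one field; the user never has
        // to glance at the screen, but the display stays current anyway.
        model.setField("city", "Austin");
        model.setField("state", "TX");
        System.out.println(view.lastRendered);
    }
}
```

Because the View only ever reads from the Model, the same flow works unchanged whether the fields were filled by touch or by speech, which is the point of routing both input modes through the Model.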
An OO GUI framework is equally usable in touch pad-based and speech-based interfaces. The MicroView event framework is a familiar, consistent event-handling model that implements as much of the core functionality as possible for the embedded systems developer.