
Looking forward: the future of embedded/mobile UI design

In case you had not noticed, the days of the familiar keyboard, touch, or mouse/cursor/icon approaches to interacting with your personal computing and communications devices – or with an embedded human/machine interface – are (almost) gone forever.

A wide array of opportunities – and challenges – is now opening up for the embedded/mobile device user interface developer, made possible by breakthroughs in sensor capabilities, multicore processors with high-performance computing capabilities, new display technologies, and image recognition algorithms.

For example, in the cover story of the December issue of ESD Magazine, “The challenges of multi-touch gesture interfaces,” Tom Gray of Ocular LCD outlines the problems developers face in integrating gesture recognition into their new designs. And in a companion article, “Gesture recognition: first step toward 3D UIs?,” Dong-Ik Ko and Gaurav Agarwal of TI detail some of the 3D techniques beginning to emerge as follow-ups to touch and gesture interfaces.

Recent design articles, white papers, and online seminars and classes on Embedded.com that address these design challenges include:

Adapting UI designs to multiple device display needs
Using projected capacitive displays for durable gesture interfaces
Demonstration of a gesture-controlled media player
A wireless 3D gesture and character recognition system

But there is no rest for the weary embedded systems developer.

In the works is a cornucopia of UI alternatives for how users will relate to their personal computing devices – embedded, mobile, or desktop – in the near future. Below is a sampling of recent research papers investigating these alternatives, found during a brief search of the Web, all downloadable free in PDF form:

User-defined gestures for surface computing
Adaptive User Interfaces for Web Applications
A UI with semantic tactile feedback for mobile devices
User-defined gestures for connecting mobile phones
A stroke based interface for home robots
Personalized gesture recognition and its applications
A multitouch system for intuitive 3D scene navigation
Deformable displays as input devices

User interface technologies continue to be an exciting and fecund area of design, especially as embedded designs continue to proliferate into every aspect of our lives. I would like to hear from you about your ideas.

Embedded.com Site Editor Bernard Cole is also a partner in TechRite Associates editorial services consultancy. He welcomes your feedback. Call 928-525-9087 or send an email to bccole@techrite-associates.com.

This article provided courtesy of Embedded.com and Embedded Systems Design Magazine. Copyright © 2011 UBM. All rights reserved.
