
Making mobile & embedded designs more visionary

In the late 1990s – just about the time the IPv6 upgrade of the Internet Protocol was made available – a set of software tools and algorithms for creating real-time embedded vision applications – called OpenCV – was developed by Intel Corp. and donated to the open source community.

Both are now broadly available and changing the way we use computers. But each has traveled a different path to its now increasingly wide acceptance among both the developer community and the broader device user public.

IPv6 was introduced to deal with the rapidly dwindling number of addresses available under the previous IPv4. But it was resisted and took ten years to come into common use because of the reluctance of most organizations to invest in the infrastructure needed. IPv4 addresses are now all used up and there is no choice but to make the shift.

OpenCV, on the other hand, was quickly adopted among a small coterie of developers in particular embedded market segments, such as factory automation and military/aerospace, where machine vision was critically important.

And now, about ten years later, the pace of its acceptance has rapidly accelerated as a growing number of companies – and developers – see it as the tool set of choice for making embedded computing platforms more user friendly, not only in mobile devices but in the many new embedded consumer apps in home automation, home networking, lighting, smart TVs, power grid metering and smart appliances.

The charter of this “embedded vision” alliance – spearheaded by companies such as AMD, Analog Devices, BDTI, CEVA, Freescale, Intel, Nvidia, Mathworks, National Instruments, Synopsys, Tensilica and Texas Instruments, among others – is to move beyond the current touch-based interfaces.

While such MEMS and capacitive sensor-based interfaces are simpler than previous mouse and GUI-based PC interfaces, they still require that the user learn how to operate the computing system. Taking a completely different approach, the aim in such vision-based designs is to create software mechanisms by which the user does not have to learn how to use the computing device.

Instead, the strategy is to build the software infrastructure that will make it possible for computers to understand us by means of vision algorithms that correctly recognize and interpret many common – and innate – human gestures, facial changes, eye movements and other natural cues.
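
To make that idea concrete, here is a minimal Python sketch that uses one of the simplest building blocks behind such face- and gesture-aware interfaces: OpenCV's stock Haar-cascade face detector applied to a live camera feed. It assumes the opencv-python package (which bundles the cascade file under cv2.data.haarcascades); the camera index and detection parameters are illustrative and would need tuning, or a lighter detector, on a real embedded target.

# Minimal sketch: detect faces in a webcam stream with a stock Haar cascade.
import cv2

# Load the frontal-face cascade shipped with the opencv-python package.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default camera; the index is platform dependent
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detection parameters are illustrative; embedded targets usually need
    # tuning to reach real-time frame rates.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()

A real interface would of course go further, tracking the detected face or hands across frames and mapping their motion to commands, but the same detect-then-interpret loop is at the core of most of the designs discussed in the resources below.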

This week’s Embedded Tech Focus Newsletter on “Designing vision apps with OpenCV” contains some of the recent design articles, white papers, blogs and webinars on OpenCV and vision app design to help you get started. To give you some insight into the challenges involved – and the opportunities they represent – I recommend that you also read:

Gesture recognition – first step toward 3D UIs?
How to use hover in a user interface

In addition to the wealth of tools and algorithms available on OpenCV.org and the Embedded Vision Alliance, there are a number of useful technical papers and conference submissions I have found, of which my Editor’s Top Picks are:

OpenCV Based Real-Time Video Processing Using Android
A GPU-Accelerated Face Annotation System for Smartphones
OpenCV in embedded systems: a performance insight

Several other papers that I found informative and revealing of the role of OpenCV in next-gen embedded vision designs include:

Using OpenCV for distance determination
Sophisticated Image Encryption Using OpenCV
Real-time computer vision with OpenCV
Facial Expression Analysis with OpenCV
An OpenCV Algorithm for using a face as a Pointing Device

In this rapidly evolving segment of embedded systems development, new resources are constantly becoming available. As I come across them I will do what I can to make you aware of them. And if you come across resources in this area that you think are useful, let me know.

Also, let me know about your experiences in the form of blogs or design and development articles you might wish to share with the embedded developer community.

Embedded.com Site Editor Bernard Cole is also editor of the twice-a-week Embedded.com newsletters as well as a partner in the TechRite Associates editorial services consultancy. He welcomes your feedback. Send an email to , or call 928-525-9087.

See more articles and columns like this one on Embedded.com. Sign up for the Embedded.com newsletters. Copyright © 2013 UBM–All rights reserved.
