DESIGN West: Embedded Vision for Every App

Sylvie Barak, EE Times

March 30, 2013


"Computer vision is the next big thing for embedded systems, making them safer, more responsive to people, more efficient and more preceptive," said Jeff Bier, founder of the Embedded Vision Alliance and session chair of the upcoming Embedded Vision Summit.

In fact, nearly every category of consumer, automotive and industrial application is being enhanced today by embedded vision capabilities. Learn how to add the latest pattern-recognition capabilities into your embedded vision application at the Embedded Vision Summit on April 25, co-located with DESIGN West 2013.

[Click here to register for the Embedded Vision Summit, Thursday April 25th, at the San Jose McEnery Convention Center. See the day's agenda here. The Summit is co-located with DESIGN West.] 

Embedded vision started out as an esoteric technology that was expensive to implement, requiring a team of domain experts with deep experience in the black magic of pattern recognition to get it right. "NASA pioneered embedded vision for space exploration and the military has been using it for target recognition for decades," said Bier. "But until now it has been a niche technology in industry, such as for parts inspection. Now, the sensors and processors that perform the tens of billions of operations per second needed to process millions of pixels are much more cost effective, enabling computer vision to be added to almost any embedded system."

Today a wide variety of applications are adding embedded vision capabilities, from automotive systems that avoid collisions by warning drivers, to security systems that detect nervous people acting suspiciously, to smartphones that let users control video playback with their eye movements.


3-D sensors such as PrimeSense's Carmine (licensed and popularized by Microsoft as the Kinect for Xbox) have brought down the cost of embedded vision solutions by estimating the distance from the sensor to objects in the scene (pictured). At the Embedded Vision Summit, Texas Instruments' Goksel Dedeoglu will present techniques for low-cost implementation of stereoscopic 3-D vision. SOURCE: TI
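As a rough illustration of the underlying idea, the sketch below computes a disparity map from a stereo image pair using the open-source OpenCV library and converts it to depth. It is not TI's implementation; the image file names and the focal-length and baseline values are hypothetical placeholders for a calibrated, rectified camera pair.

# Minimal sketch of stereoscopic depth estimation with open-source OpenCV.
# The file names and calibration numbers below are assumed placeholders.
import cv2
import numpy as np

# Load a rectified left/right image pair as grayscale.
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching compares small windows along epipolar lines to find disparity.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

# Depth is inversely proportional to disparity: Z = f * B / d,
# where f is the focal length in pixels and B is the camera baseline.
focal_px, baseline_m = 700.0, 0.06   # assumed calibration values
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]

The same block-matching approach maps well onto low-cost embedded processors, which is what makes stereoscopic depth attractive as an alternative to dedicated 3-D sensors.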

"My favorite embedded vision application comes from Affectiva, which uses webcams to detect the emotions of a user," said Bier. "Imagine educational apps that pace learning by detecting frustration levels, or toys that stimulate a child's intellect when they detect boredom. The possibilities are endless, now that embedded vision technology is cheap enough for almost any app."

In fact, as more and more competitors add vision-based pattern-recognition algorithms, a new era of applications is emerging in which embedded vision must be integrated in order to succeed. Unfortunately, many engineers do not realize how useful computer vision can be, nor are they aware of the easy-to-use open-source embedded-vision algorithms available to streamline the development process.

"The biggest problem is that engineers are not aware of how useful and relatively easy it is to add computer vision to their embedded systems," said Bier.
