Deep learning machine vision system aids blind and visually impaired

Max The Magnificent, November 01, 2016

I just heard about an Italian company called Eyra and its Horus Technology, which combines deep learning, machine vision, and wearable technology to enhance the lives of the blind and visually impaired.

This is of particular interest to me, because my cousin Graham in England started to lose his sight in his early 30s and he's now almost totally blind. Graham is three years older than me, and we used to play together as little lads. In fact, earlier today Graham emailed me the following picture of me (left) and him (right) sitting in the trunk of a Ford Anglia at a family picnic in the Cordwell Valley, Derbyshire, England.


(Source: Max Maxfield & Graham Marshall)

This photograph was taken circa 1959, when I was around 2 years old. Regarding the knotted handkerchief adorning my head, I just got off the phone with my mom, who says that my dad would have put it there to protect my noggin from the sun.

I have to say that I have the utmost respect for Graham. He simply doesn't let anything get him down. Once he could no longer work, he went back to university to get a degree in Communications Studies, after which he landed himself a weekly radio show discussing the latest and greatest music (thereby guaranteeing himself free tickets to all the local concerts). More recently, he's become the co-host of The Live Science Radio Show on Sheffield Live (Graham's show runs for an hour each Saturday morning, starting at 11:00 a.m., if you want to listen in).

The reason I'm waffling on about this here is that it provides yet another example of the weird and wonderful coincidences the universe delights in throwing our way. The week before last, I was over in England visiting my dear old mom. As part of this trip, I was invited to give a guest lecture on embedded systems at my old alma mater -- Sheffield Hallam University. Being aware of Graham's interest in science and technology, I invited him to join us and listen in.

The university has grown and boasts a lot of new buildings since my days there. I found it difficult to locate the lecture theater myself, even though I can see where I'm going, yet Graham appeared to have no difficulty tracking me down. The point of all this is that, as part of my presentation, we covered cognitive (reasoning, thinking) systems, artificial neural networks, deep learning, and machine vision.

Thus, you can only imagine my surprise when my chum Charlene Gage told me about the folks at Eyra and their Horus Technology (the "Eye of Horus" is an ancient Egyptian symbol of protection, royal power, and good health). The system consists of a headset connected to a pocket computer. On one side of the headset are two cameras providing stereo vision with depth perception; on the other side is a bone-conduction transducer that allows the pocket computer to talk to the wearer without blocking normal hearing.


The Horus system (Source: Eyra)
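For the propeller-heads among us, here's a little Python sketch of the sort of thing two side-by-side cameras make possible. This isn't Eyra's code, of course; it simply runs OpenCV's stock block-matching stereo algorithm over a pair of hypothetical camera frames to produce a rough depth (disparity) map.

```python
# A minimal sketch, not Eyra's firmware: depth from two side-by-side cameras
# using OpenCV block matching. The image files are hypothetical stand-ins
# for the headset's left and right camera frames.
import cv2

left = cv2.imread("left_frame.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_frame.png", cv2.IMREAD_GRAYSCALE)

# Block matcher: larger disparity means the object is closer to the wearer
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right)

# Crude proximity check on the middle of the frame
h, w = disparity.shape
centre = disparity[h // 3: 2 * h // 3, w // 3: 2 * w // 3]
if centre.mean() > 400:  # threshold picked arbitrarily for illustration
    print("Warning: obstacle directly ahead")
```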

The Horus system is powered by an NVIDIA Tegra K1, which NVIDIA describes as "an otherworldly combination of 192 supercomputer-class GPU cores, incredible graphics horsepower, and extraordinary power efficiency." Using deep learning technology to provide object detection and recognition, the Horus system can analyze what it's seeing in the outside world and describe it to its wearer.
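If you're wondering what deep-learning object recognition looks like in practice, the snippet below is my own rough approximation, not anything Eyra has published. It pushes a camera frame through an off-the-shelf pretrained network from PyTorch's torchvision; the camera_frame.jpg file and the imagenet_classes.txt label file are placeholders assumed to be available locally.

```python
# A minimal sketch of deep-learning object recognition, assuming PyTorch,
# torchvision, and a local "imagenet_classes.txt" file (one label per line).
import torch
import torchvision.transforms as T
from torchvision import models
from PIL import Image

model = models.mobilenet_v2(pretrained=True).eval()  # small, embedded-friendly net
preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

with open("imagenet_classes.txt") as f:
    labels = [line.strip() for line in f]

img = Image.open("camera_frame.jpg").convert("RGB")  # frame from the headset camera
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = torch.softmax(model(batch)[0], dim=0)

top = int(probs.argmax())
print(f"I can see a {labels[top]} ({probs[top].item():.0%} confident)")
```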

As one simple example, the system can recognize and read text, as illustrated in the photographs below.

The Horus system can be used to read text (Source: Eyra)
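Reading text aloud is something we can approximate ourselves with off-the-shelf tools. The sketch below is my guess at the general approach, not Eyra's implementation: it runs an image through the open-source Tesseract OCR engine (via the pytesseract wrapper) and speaks the result with pyttsx3. The file name is made up.

```python
# A minimal "read this text aloud" sketch, assuming the Tesseract OCR engine
# is installed along with the pytesseract and pyttsx3 packages.
from PIL import Image
import pytesseract
import pyttsx3

text = pytesseract.image_to_string(Image.open("book_page.jpg"))  # hypothetical frame

engine = pyttsx3.init()   # text-to-speech; a real device would use bone conduction
engine.say(text)
engine.runAndWait()
```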

A more sophisticated example is the system's ability to learn to recognize the wearer's friends and say something like "Hey, Martina and Luca are approaching."


The Horus system can identify people (Source: Eyra)
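Just to give a flavor of how this might work under the hood (again, purely illustrative; Eyra hasn't described its approach), the open-source face_recognition library lets you enroll a few reference photos and then check each camera frame against them. The names and file paths below are invented for the example.

```python
# A minimal sketch of recognizing known friends with the face_recognition
# library (dlib-based); enrollment photos and names are hypothetical.
import face_recognition

known = {
    "Martina": face_recognition.face_encodings(
        face_recognition.load_image_file("martina.jpg"))[0],
    "Luca": face_recognition.face_encodings(
        face_recognition.load_image_file("luca.jpg"))[0],
}

frame = face_recognition.load_image_file("camera_frame.jpg")
for encoding in face_recognition.face_encodings(frame):   # each face in view
    for name, ref in known.items():
        if face_recognition.compare_faces([ref], encoding)[0]:
            print(f"Hey, {name} is approaching.")
```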

But all of this is just the tip of the iceberg. The system will also be able to scan its surroundings and transform the visual information into verbal messages. Imagine coupling this with GPS, Google Maps, and suchlike, thereby enabling the system to guide its wearer to their destination while describing the things they encounter along the way.
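To make that a tad more concrete, here's a back-of-the-envelope sketch of the guidance idea: take the wearer's GPS fix and a destination, and turn them into a spoken-style hint about distance and direction. It assumes the geopy package, and the coordinates and compass wording are purely illustrative.

```python
# A minimal GPS-guidance sketch, assuming the geopy package; coordinates
# and the destination are invented for illustration.
import math
from geopy.distance import geodesic

def bearing(a, b):
    """Initial compass bearing in degrees from point a to point b (lat, lon)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

here = (53.3811, -1.4701)     # wearer's GPS fix (Sheffield city centre, say)
target = (53.3790, -1.4660)   # destination from a mapping service

meters = geodesic(here, target).meters
points = ["north", "north-east", "east", "south-east",
          "south", "south-west", "west", "north-west"]
compass = points[int((bearing(here, target) + 22.5) // 45) % 8]
print(f"Your destination is about {meters:.0f} meters to the {compass}.")
```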

Eyra has just launched its Horus Early Access Program for individuals and organizations to test its wearables prior to their public release in January 2017. As part of this program, Eyra is looking for participants in Italy to test the Horus wearable and provide feedback on their experiences, which will be used to improve the device.

The folks at Eyra say that they will be soliciting applications from English-speaking candidates later in the year, so I'm hoping Graham gets a chance to use one, in which case I will ask him to report back so I can present his findings in a future column. In the meantime, are there any additional features and capabilities you would suggest Eyra consider adding to the Horus system?
