Neuromorphic vision takes on diverse applications


Event-based vision company Prophesee has showcased some interesting applications for its vision sensors from around the world, spanning biotech, scientific analysis, robotics and space technologies. One project partially restored sight to a blind person; another tracks space junk across the sky whether it’s night or day.

Prophesee said it has a community of 2,200 inventors working with its technology today. The projects are showcased as part of Prophesee’s inventor community, which aims to inspire future creativity. Here are the details.

Restoring sight

Gensight Biologics has partially restored visual function in a patient with late-stage retinitis pigmentosa. The treatment uses special goggles equipped with a Prophesee event-based vision sensor.

As part of the therapy, the patient’s retina was treated with light-sensitive proteins derived from algae, which make retinal cells sensitive to amber light (590nm). The Prophesee vision sensor is part of a pair of goggles which convert the visual scene into high-intensity amber light that is shone into the patient’s eye. After seven months of using the goggles, the patient (who was formerly entirely blind) could see well enough to detect a pedestrian crossing marked on the street. He was able to identify a notebook on a table in front of him (92% of the time) and a pencil case (36% of the time). He could also grab objects and count them.


Gensight Biologics has integrated the Prophesee sensor into goggles which translate images of the world into amber light. Modified cells on the patient’s retina are able to detect this light and make sense of it. (Source: Gensight Biologics / Prophesee)

The patient sees the edges of shapes courtesy of the Prophesee sensor, which provides a fast, smooth visual experience that is robust to lighting conditions. Because the sensor generates sparse data, no time is spent re-encoding full images.
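The sparsity is easiest to see in the data model itself: an event camera reports nothing for static pixels and a timestamped change for active ones. A minimal sketch of that model in Python (the `Event` fields here are illustrative, not Prophesee’s actual SDK types):

```python
from dataclasses import dataclass

@dataclass
class Event:
    t_us: int      # timestamp in microseconds
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 decrease

# A moving edge produces a short burst of events along its contour;
# static background pixels produce nothing at all.
events = [
    Event(1000, 10, 20, +1),
    Event(1005, 11, 20, +1),
    Event(1010, 12, 20, +1),
]

# Downstream processing consumes events directly -- there is no dense
# frame to decode, so latency scales with scene activity, not frame rate.
active_pixels = {(e.x, e.y) for e in events}
print(len(active_pixels))  # 3 active pixels out of the whole sensor array
```

This is why the goggles can convert the scene into amber-light stimulation with so little delay: only changing pixels need to be processed.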

Results from the study, carried out in partnership with the University of Basel and UPMC, were published in Nature Medicine. While this work concerned only one patient and is preliminary, it is a significant step toward applying this technique to patients with neurodegenerative diseases and other conditions.

Cell therapy

Cambridge Consultants in the UK has been using the Prophesee sensor in a medical imaging system for cell therapy. Cell therapy holds potential for treatment of cancer, autoimmune and neurological diseases, amongst its applications, based on reprogramming a patient’s own cells. Manufacturing the reprogrammed cells is a laborious, manual process with a single dose costing on average $475,000.


Cambridge Consultants’ PureSentry system uses the Prophesee event-based camera to look for contaminants in medical samples. (Source: Cambridge Consultants)

A key step in the process checks whether samples are sterile. This can take 7-14 days for incubation, while the patient is waiting for treatment. If contaminants are detected, new samples must be taken and the incubation period starts again. Cambridge Consultants used Prophesee’s image sensor to look at samples at the cell level, with AI models to detect, track and classify contaminants in real time (<18ms compared to 7-14 days).


PureSentry uses the Prophesee sensor combined with AI to spot contaminants (in this case, E. coli) based on size, shape and movement. (Source: Cambridge Consultants)

The PureSentry system developed by Cambridge Consultants can track both human cells and contaminants with greater precision and accuracy than other techniques. The company showed a video of E. coli bacteria being detected in a sample based on their size, shape, and characteristic motion. The test is non-destructive and works in difficult conditions, such as low light and high flow rates. The device can help increase automation in the cell therapy process, reducing the need for highly skilled technicians.

Tracking particles

A team from the University of Glasgow, Heriot-Watt University and the University of Strathclyde has been using Prophesee’s image sensor for high-speed particle detection and tracking. The aim is to make microfluidic analysis faster and cheaper.

The team managed to profile particles down to 1 µm in fluid velocities as fast as 1.54 m/s and capture data at a time resolution equivalent to 20,000 images per second. Their setup uses the Prophesee event-based vision sensor with a standard fluorescence microscope and lighting.
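To see why that time resolution matters, compare how far a particle moving at 1.54 m/s travels between snapshots at a conventional 30 fps and at the event-equivalent 20,000 images per second (illustrative arithmetic only):

```python
# Displacement per snapshot at conventional 30 fps vs event-equivalent 20 kHz
speed = 1.54                      # m/s, fastest fluid velocity profiled
for rate_hz in (30, 20_000):
    step_um = speed / rate_hz * 1e6
    print(f"{rate_hz} Hz -> {step_um:.1f} um between snapshots")
```

At 30 fps the particle moves over 5 cm between frames, far beyond any microfluidic field of view; at the event-equivalent rate it moves only 77 µm, so its track through the channel can actually be resolved.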


The Scottish team’s optical setup for microfluidic analysis (A) with the Prophesee event-based vision sensor (B). The setup uses a spiral channel with the region of interest (ROI) marked. (Source: Prophesee)

Robot touch

Researchers at the National University of Singapore are using event-based vision in combination with a new touch sensor technology to build a sense of touch for robot arms and hands.

The touch sensor developed by the researchers, NeuTouch, is an array of 39 sensors mounted on the robot fingertip. This sensor produces “spike” signals, similar to those of the Prophesee sensor, which can be processed by a neuromorphic processor (specifically, Intel’s Loihi).


The National University of Singapore’s NeuTouch sensor is comparable in size to a human finger (A), is composed of parts analogous to bones and skin (B) and contains 39 tactile pixels or “tactels” (C). (Source: National University of Singapore / Prophesee)

This work is important because it demonstrates a way of integrating and extracting meaning from data from multiple sensors (the touch sensor and the Prophesee sensor) at the same time, in a way suitable for complex tasks in power-constrained systems.

The robot was trained to pick up containers with different amounts of liquid; it was able to determine what the liquid was and how much it weighed. The weight information came from the vision sensor, which observed how each container deformed when picked up.


The National University of Singapore team with their robot (Source: National University of Singapore / Prophesee)

The robot was tested with the “slip test”: determining the minimum amount of pressure required to grip an object securely. A combination of touch and vision sensing meant the system could detect slips very quickly, identifying rotational slip in 0.08s, roughly 1000x faster than human touch.
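A toy illustration of spike-based slip detection (a hypothetical sketch, not the NUS team’s actual pipeline): a stable grip produces a low, steady spike rate across the tactels, while a slipping object produces a sudden burst, so flagging the first burst gives a fast slip-onset estimate.

```python
def detect_slip(spike_counts_per_ms, baseline=2, factor=5):
    """Return the first millisecond at which the fingertip-wide spike
    count jumps well above baseline -- a crude proxy for slip onset."""
    for t_ms, count in enumerate(spike_counts_per_ms):
        if count > baseline * factor:
            return t_ms
    return None

# Stable grip (low activity) followed by a slip burst at t = 80 ms
stream = [2] * 80 + [40, 55, 60]
print(detect_slip(stream))  # 80
```

Because spikes arrive asynchronously, detection latency is set by the burst itself rather than by a fixed sampling frame, which is what makes sub-0.1 s reactions feasible.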

The researchers hope this technology may be used for industrial robotics and even to restore a sense of touch for prosthetic hands.

Read EE Times’ report on this exciting technology here.

Space debris

Western Sydney University has developed Astrosite, a mobile observatory which uses Prophesee technology combined with telescopes to track space debris in orbit around the Earth.


Astrosite telescope observatories ready to be deployed (Source: Western Sydney University)

The growing number of satellites in orbit means the risk of colliding with other objects is increasing; there are around 4850 satellites in space, but only around 40% are active. Monitoring dead satellites and other debris is usually done with high-resolution cameras, which are not ideal for capturing images in daylight. They also capture mostly empty space, resulting in lots of unnecessary data being processed.

Event-based sensing is a good solution since it captures only changes in its field of view. Astrosite generates 10x to 1000x less data using the Prophesee event-based sensor than a conventional camera system. It can also monitor debris continuously (day and night) with microsecond time resolution, at low power.
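The data reduction follows directly from scene sparsity: a telescope’s field of view is almost entirely static sky, so very few events fire. A rough comparison with illustrative numbers (not Astrosite’s actual figures):

```python
# Conventional camera: every pixel of every frame must be read out
width, height, fps, bytes_per_px = 1280, 720, 30, 1
frame_rate_bytes_s = width * height * fps * bytes_per_px

# Event camera: only moving objects (e.g. a satellite streak) fire events.
# Assume ~10,000 events/s at ~8 bytes each (x, y, timestamp, polarity).
event_rate_bytes_s = 10_000 * 8

print(frame_rate_bytes_s // event_rate_bytes_s)  # ~345x less data
```

The quieter the scene, the larger the factor, which is why a near-empty star field sits at the favorable end of the 10x to 1000x range.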


Objects in low-earth orbit as tracked by an event-based camera in the Astrosite setup (Source: Western Sydney University / Prophesee)

>> This article was originally published on our sister site, EE Times Europe.


