Vision sensor chip works to reduce data clutter

PARIS – In Industry 4.0, the data generated by connected cameras are essential to maximize productivity, streamline operations and derive value. But what if 99 percent of the data generated by cameras were redundant information, useless for AI-based computer vision? And only one percent were pure information, relevant to improving decision-making?

Collecting data is one thing. Collecting and processing quality data is another. Event-based vision technology is well placed to change the game, as it dynamically captures the most relevant information in real time.

Prophesee’s Packaged Metavision Sensor

Prophesee SA (Paris, France) is rolling out Wednesday what it claims is the first event-based vision sensor chip, called the Metavision Sensor. Under development for five years, “it has been tested commercially by several customers and clearly shows large integration perspectives within our partners’ cameras,” said Luca Verre, CEO of Prophesee. The announcement comes 18 months after the French startup launched its Onboard reference system, which lets vision system developers try, test, and understand how neuromorphic vision works.

A new sensing paradigm

Throughout history, Leonardo da Vinci used the camera obscura as a model of the eye, Eadweard Muybridge devised a complex method of photographing horses in motion, and the Lumière brothers invented the cinema. The common ground between these examples is that they were made for human consumption, not for machine consumption. Prophesee, Verre said, believes that a new sensing paradigm is needed, especially in this new era of artificial intelligence, computer vision and tremendous growth of data.

Inspired by human vision, Prophesee develops both neuromorphic sensors and machine learning algorithms that mimic the eye and brain. The company advocates an event-based approach to sensing and processing that captures the vital parts of a scene and overlooks the irrelevant.

In an event-based sensor, Verre explained, each pixel is asynchronous and independent. It is no longer governed by a fixed timing source (the frame clock) but by variations of the signal in the amplitude domain, recording only when it senses a change or a movement. The information is not transmitted frame by frame; rather, movement is captured as a continuous stream of information, and nothing is lost between frames.

This approach, Verre noted, has three advantages. First, “it is acquiring much less data,” enabling reductions in the power (<10 mW), latency (40-200 µs) and data processing requirements imposed by traditional frame-based systems. Second, he said, Prophesee’s sensor achieves pixel acquisition and readout times from milliseconds down to microseconds, resulting in temporal resolution equivalent to conventional sensors running at tens to hundreds of thousands of frames per second. “It is not only about providing much less data, it is about having much more information.” Third, the dynamic range of Prophesee’s vision sensor exceeds 120 dB, enabling it to operate at scene illumination as low as 100 millilux.
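The contrast-triggered pixel behavior Verre describes can be illustrated with a minimal simulation. The sketch below is not Prophesee’s pipeline; the `Event` fields, the 0.2 contrast threshold, and the frame-driven input are all illustrative assumptions, but they show why a mostly static scene yields almost no data.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 brightness increase, -1 decrease

CONTRAST_THRESHOLD = 0.2  # illustrative value, not an actual sensor spec

def generate_events(frames, frame_period_us=1000):
    """Emit an event per pixel only when its intensity has changed
    by more than the contrast threshold since its last event."""
    last = [row[:] for row in frames[0]]  # per-pixel reference level
    events = []
    for i, frame in enumerate(frames[1:], start=1):
        t = i * frame_period_us
        for y, row in enumerate(frame):
            for x, val in enumerate(row):
                delta = val - last[y][x]
                if abs(delta) >= CONTRAST_THRESHOLD:
                    events.append(Event(x, y, t, 1 if delta > 0 else -1))
                    last[y][x] = val  # update reference only on an event
    return events

# A static scene with one moving bright spot: of the 9 pixels,
# only the two that changed fire events; the rest stay silent.
frames = [
    [[0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 0.0]],
    [[0.0, 0.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 0.0]],
]
evts = generate_events(frames)
print(len(evts))  # 2
```

Here the data reduction is direct: the output scales with scene activity rather than with resolution times frame rate.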

Industrial applications

The global industrial automation market is expected to reach $321 billion by 2024, growing at a compound annual rate of 6.5 percent between 2018 and 2024, according to Zion Market Research. At its heart are smart sensors, which are evolving alongside other technologies to provide meaningful data for well-informed decisions. That’s a business opportunity Prophesee is seizing.

Luca Verre

The new packaged version of Prophesee’s Metavision sensor, Verre said, is aimed at camera developers, to enable next-generation vision in industrial automation and IoT systems such as robots, inspection equipment, and monitoring and surveillance devices. For industrial automation, “our added value lies in high-speed detection and computation for real-time data transmission, as this enables ultra-high-speed counting, vibration measurement and monitoring, or kinematic monitoring for predictive maintenance.” Prophesee claims its industrial-grade event-based vision system achieves throughput of over 1,000 objects per second.
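As a rough illustration of the vibration-monitoring use case: microsecond-precision event timestamps at a single pixel are enough to recover a vibration frequency. The sketch below is a hypothetical toy, not Prophesee’s algorithm, and assumes an idealized pixel that fires one same-polarity event per vibration cycle.

```python
def dominant_frequency_hz(timestamps_us):
    """Estimate a vibration frequency from the timestamps (in µs) of
    same-polarity events at one pixel: the mean interval between
    successive events is taken as one vibration period."""
    if len(timestamps_us) < 2:
        return 0.0
    intervals = [b - a for a, b in zip(timestamps_us, timestamps_us[1:])]
    mean_period_us = sum(intervals) / len(intervals)
    return 1_000_000 / mean_period_us

# An edge vibrating at ~120 Hz crosses the pixel once per cycle,
# firing a positive event roughly every 8333 µs.
timestamps = [i * 8333 for i in range(10)]
print(round(dominant_frequency_hz(timestamps)))  # 120
```

A frame-based camera would need a frame rate well above twice the vibration frequency to resolve the same motion; the event timestamps carry that timing information natively.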

In the development of Industry 4.0, energy efficiency overall has taken center stage. It is a matter of costs and continuity. With its sensor, Verre believes Prophesee has a card to play in smart building IoT applications such as presence detection, traffic monitoring and automatic door closing.

Size constraints

Prophesee’s chip, now available in a 13 × 15 mm mini-PBGA package, integrates the company’s third-generation CMOS image sensor (CIS) vision module. It features 640 × 480-pixel resolution with 15 µm pixels in a 3/4-inch optical format. Although Verre said these characteristics should “open the way to large integration opportunities,” two key markets are, for the time being, left aside: automotive and mobile.

Asked more specifically about automotive, Verre said it was “a strategic subject for the company,” but the packaged version of the Metavision sensor is now “too big to be integrated in mainstream automotive applications.”

The company is now working on a fourth generation, reducing the size and increasing the resolution to address more mainstream applications such as automotive and mobile phones. Mass production of this fourth generation is expected in 2020.

In parallel, Verre said the company is working with partners on merging its data-driven camera technology with lidars and radars for automotive applications. “These are complementary approaches, and the benefit of our event-driven technology lies in its ultra-high temporal precision and short response time to detect what’s relevant and where to focus attention. Lidars and radars can then conduct an object classification and make the car take the right decision.” More concretely, Verre said Prophesee and its partners are now evaluating a multi-sensor approach that would combine a radar or lidar and, next to it, an event-based camera.

Economies of scale

Prophesee builds on 35 years of research in the field of neuromorphic engineering and has filed 51 patents. Beyond talent and IP, securing the company’s economic viability has become essential.

In 2016, Prophesee (then known as Chronocam) applied the first generation of its sensor to treating blindness at Pixium Vision, a Paris-based retinal prosthetics company. Targeting medical applications and niche markets helped validate the technology. “Our first and second generations of sensors helped us confirm the industrial viability and scalability.” But, Verre continued, “we had to think in terms of economies of scale. If we can’t produce a sensor at a reasonable cost, we take the risk of addressing only niche applications, thus low volumes.” Manufacturing is a critical subject, he continued. “When we started looking at other applications, we had to make sure we had the right reliability process and test structure in place.”

Foundry partner TowerJazz manufactured Prophesee’s first reference design. Taiwan-based Kingpak Technology Inc. now manufactures the company’s Metavision sensor in a specialized 0.18 µm process.

So far, Prophesee said it has shipped “slightly more than one hundred sensors” to early adopters. Among them, Imago Technologies GmbH (Friedberg, Germany) is developing intelligent vision systems embedding Prophesee’s event-based vision sensor and algorithms. “We also work with partners in China, Japan and the United States on the development of industrial cameras.”

Verre said the company plans to ship a few thousand units this year. In 2020, volumes are expected to increase to tens of thousands or even hundreds of thousands of units.

Anticipating the next cycle of the company’s development, Prophesee has recently signed five partnerships with global distributors and established a presence in Silicon Valley, Shanghai, and Tokyo. It now employs more than one hundred people.

Opening up the discussion, Verre elaborated on complex and sophisticated aspects of the eye that have not yet been explored. “The eye can, for instance, assess light intensity, color contrasts, as well as preprocess movements and depth. These biological lessons of the eye are a great source of inspiration, as they would add some benefits to our technology.” On the software front, Verre expects brain-inspired computing architectures to emerge soon. Prophesee is currently working with Intel and IBM on combining an artificial retina with an artificial brain while “guaranteeing high-speed, low latency and power efficiency,” he added.

>> This article was originally published on our sister site, EE Times: “Event-Driven Vision Hits Production Lines.”
