Event-driven lidars seek tighter focus on relevant info - Embedded.com

PARIS — In the fast-growing markets for factory automation, IoT, and autonomous vehicles, CMOS image sensors appear destined for a role capturing data not for human consumption but for machines to see what they need to make sense of the world.

CMOS image sensors “are becoming more about sensing rather than imaging,” said Pierre Cambou, activity leader, MEMS & Imaging at Yole Développement. The Lyon, France-based market research and technology analysis company boldly predicts that by 2030, 50% of CMOS image sensors will serve “sensing” devices.

Luca Verre

Paris-based Prophesee SA (formerly known as Chronocam) styles itself as a frontrunner in that revolution. A designer of advanced neuromorphic vision systems, it advocates an event-based approach to sensing and processing. Prophesee's bio-inspired vision technology was long deemed too radically different from conventional machine vision, and perilously "ahead of its time." But Luca Verre, co-founder and CEO of Prophesee, told us that this is no longer the case.

In a one-on-one interview here, Verre said that his company has secured its Series B-plus funding (the startup has raised $40 million over the last three years) and now has a partnership deal with a large, unnamed consumer electronics company. Most importantly, Prophesee is moving its neuromorphic vision system beyond the technology-concept pitch: it is now promoting a reference system that developers can experiment with.

Prophesee’s first reference design, available in VGA resolution, consists of Prophesee’s Asynchronous Time-Based Image Sensor (ATIS) chip and software algorithms. The ASIC will be manufactured by a foundry partner in Israel, said Verre, most likely TowerJazz.

The company declined to detail the ASIC or the specifications of the reference design, saying that it plans a formal product announcement in several weeks.

Nonetheless, the startup has reached a milestone: the reference design gives system designers the chance to see firsthand what an ATIS can accomplish in data sensing. The ATIS is characterized by high temporal resolution, a low data rate, high dynamic range, and low power consumption, said Prophesee.

Cameras are bottlenecks
Makers of cameras for machine-vision systems — whether in smart factories, IoT, or autonomous vehicles — have begun to heed the event-based approach promoted by Prophesee’s co-founders such as Ryad Benosman and Christoph Posch.

With all of the detailed visual information that traditional cameras capture, “the camera has become a technology bottleneck,” said Verre. Cameras are unquestionably the most powerful sensing devices. Yet in automation systems, surveillance, or highly automated vehicles, the sheer volume of visual data they produce can slow down processing.

Consider self-driving cars, said Verre. The central processing system inside the vehicle is bombarded with data from cameras, lidars, radars, and other sources. The key to managing this overload is figuring out how best to “reduce the amount of raw data” streamed from the sensors. Sensors should capture only the data that matters to “a region of interest,” said Verre.
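The data-reduction idea can be illustrated with a toy sketch of the generic event-camera principle. (This is an illustration of the general technique only, not Prophesee's actual ATIS design, whose internals the company has not disclosed.) Each pixel reports an event only when its intensity changes beyond a threshold, so a static scene generates almost no data:

```python
import numpy as np

# Toy sketch of the generic event-camera principle (NOT Prophesee's
# actual ATIS circuit): a pixel emits an event only when its
# log-intensity change between two moments exceeds a threshold.
# Static scenery therefore produces no output at all.

def frame_to_events(prev, curr, t, threshold=0.15):
    """Emit (x, y, t, polarity) events for pixels whose log-intensity
    change exceeds the threshold; polarity is +1 (brighter) or -1."""
    delta = np.log1p(curr.astype(np.float64)) - np.log1p(prev.astype(np.float64))
    ys, xs = np.nonzero(np.abs(delta) > threshold)
    return [(x, y, t, 1 if delta[y, x] > 0 else -1) for y, x in zip(ys, xs)]

# A static 4x4 scene in which a single pixel brightens: only that
# pixel generates an event, instead of a full 16-pixel frame.
prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[2, 1] = 200  # one pixel changes
events = frame_to_events(prev, curr, t=0.001)
```

Here the event stream carries one tuple rather than a whole frame, which is the sense in which event-based sensors send only "data that matters" downstream.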

As Prophesee has explained in past interviews with EE Times, the company’s event-driven vision sensors are inspired by biology. The approach derives from the co-founders’ research on how the human eye and brain work.
