PARIS — Prophesee, a Paris-based startup that has pioneered neuromorphic vision systems, presented a new stacked event-based vision sensor, jointly developed with Sony Corp., this week at the International Solid-State Circuits Conference (ISSCC) in San Francisco.
Based on Prophesee’s event-driven technology, the new sensor was built with technologies Sony engineered for advanced stacked CMOS image sensors.
Pixel chip (left) and logic chip (Source: Prophesee)
For event-driven systems, the new sensor offers the industry’s smallest pixel size and its highest high-dynamic-range (HDR) performance, Prophesee claimed. The brain-inspired sensor would allow industrial machines, robots and autonomous vehicles to see and sense their environment better.
The partnership could herald a new era in which AI — both AI sensing and AI processing — takes place very close to the sensor, if not yet on the sensor itself, where the data is generated.
Luca Verre, Prophesee CEO
What’s in it for the two companies?
Sony is the world’s leading CMOS image sensor company. Its partnership lends commercial credibility to Prophesee’s new sensing technology. Luca Verre, CEO of Prophesee, hopes the move will open the door for Prophesee’s event-based cameras to mass-market opportunities.
“Given the nature of our technology that uses a lot of transistors, electronics and photodiodes, working with Sony was always our first choice,” Verre told EE Times. The companies started working together in 2017. Verre promised that samples of the first stacked event-based vision sensors would be available in 2020.
Partnership with Prophesee could prove to be a game-changer for Sony, giving the Japanese behemoth a chance to explore event-driven vision technology.
Pierre Cambou, principal analyst at Yole Développement, views the Prophesee-Sony partnership as equivalent to “opening Pandora’s box.”
In the neuromorphic camera developed by Prophesee, “Each pixel, in essence, is a neuron,” noted Cambou. Calling chip stacking development “still in its early phase,” Cambou predicted that down the line, neurons, memory and processing could all be stacked together. Current enthusiasm for bringing more intelligence to the “edge” or “endpoint” foreshadows a day when AI will be sensed and processed inside the sensor.
Sony will be able to develop an AI-integrated image sensor that “can enhance images on the fly,” or create an image sensor that “directly offers event-based sensing,” Cambou explained.
In fact, someone has already applied neural networks to “Arrival of a Train at La Ciotat” — a famous early film shot by the Lumiere Brothers in 1896 — to fill in missing pixels. That effort revived and upscaled the 124-year-old film to 4K video at 60 frames per second. Describing such upscaled video as all the rage on the Internet, Cambou said the power of AI has been well proven in the imaging world.
The biomimicry approach to AI is an alternative to von Neumann computing. “It will increase in importance as AI becomes more widespread,” said Cambou.
“Arrival of a Train at La Ciotat,” originally shot by the Lumiere Brothers in 1896, was upscaled to 4K at 60 frames per second by Denis Shiryaev
Building blocks for stacked event-based vision sensor
In the newly designed event-driven sensor, the pixel chip (top) and the logic chip (bottom) are stacked together. The logic incorporates signal processing circuits designed to detect changes in luminance based on an asynchronous delta modulation method.
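The asynchronous delta-modulation idea behind event-based pixels can be illustrated with a short sketch. This is not Prophesee's or Sony's implementation — the names, threshold value and frame-based simulation below are illustrative assumptions — but it shows the principle: each pixel keeps a stored reference of its log-luminance and fires an ON or OFF event only when the change exceeds a contrast threshold.

```python
import numpy as np

def delta_mod_events(frames, threshold=0.15):
    """Illustrative delta-modulation event generator (not the actual sensor logic).

    Emits (x, y, t, polarity) events whenever a pixel's log-luminance
    drifts from its stored per-pixel reference by more than `threshold`.
    """
    ref = np.log1p(frames[0].astype(np.float64))  # per-pixel reference level
    events = []
    for t, frame in enumerate(frames[1:], start=1):
        logl = np.log1p(frame.astype(np.float64))
        diff = logl - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)  # pixels that fire
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else -1      # ON (+1) or OFF (-1)
            events.append((int(x), int(y), t, polarity))
            ref[y, x] = logl[y, x]                      # reset reference after firing
    return events
```

Because only changing pixels produce output, static scenes generate almost no data — the property that makes event-based sensing attractive for fast, sparse vision.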
Each pixel of the two individual chips is electrically connected using copper-copper connection in a stacked configuration.
This allows the new sensor to achieve the industry’s smallest pixel size (4.86μm). It also delivers 1/2-type, 1280×720 HD resolution through high-density integration with a fine 40nm logic process.
Another feature of the new event-driven sensor is its ability to offer what Prophesee claims is the industry’s highest HDR performance: 124 dB (or more). The team accomplished this by placing only the back-illuminated pixels and part of the N-type MOS transistors on the pixel chip (top). This enhances the aperture ratio by up to 77%, according to Prophesee. High-sensitivity, low-noise technologies Sony has developed over the years for CMOS image sensors made event detection possible in low-light conditions (40mlx).
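To put the 124 dB figure in perspective: sensor dynamic range is commonly expressed in decibels as 20·log10 of the ratio between the brightest and darkest detectable luminance. The small helper below (an illustrative sketch, not a Prophesee tool) shows that 124 dB corresponds to a luminance ratio of roughly 1.6 million to one.

```python
import math

def dynamic_range_db(l_max, l_min):
    """Dynamic range in dB for a given luminance ratio (20*log10 convention)."""
    return 20 * math.log10(l_max / l_min)

# Invert the formula: what luminance ratio does 124 dB imply?
ratio = 10 ** (124 / 20)   # ~1.58 million : 1
```

A ratio that large is what lets a sensor resolve detail in, say, a dark tunnel interior and direct sunlight within the same scene.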
Division of labor?
Prophesee CEO Verre noted that Sony’s role in the partnership is not limited to that of a foundry. With solid knowledge and an IP portfolio of its own in event-based technology, Verre said, “We came to the table with a clear idea as to how we wanted to design pixels.” The two companies’ collaborative efforts made possible the new event-driven sensor design — spanning analog, logic and power — combined with Sony’s advanced manufacturing process.
The relationship isn’t a pure R&D exercise either, noted Verre. The intention is to maintain a long-term partnership that will bring the technology to the commercial market while developing applications.
What does this mean to Sony?
The world has many players in CMOS imaging sensors. Cambou observed that Sony took to heart the prediction that CMOS image sensors are “no longer just about imaging but really about sensing” — a gospel Cambou has long been preaching.
Remember Sony’s 2015 acquisition of Belgium-based SoftKinetic, a developer of 3D sensing computer vision technologies? Cambou said that acquisition enabled Sony to carve out a share of the growing market for time-of-flight (ToF) sensors that Android smartphone vendors suddenly started demanding last year. Imagine what would have happened to Sony had the company not dabbled in depth sensing technology then, said Cambou.
In his opinion, the partnership with Prophesee is no different. When players in the automotive and industrial machine vision markets start looking for richer sensing data to enhance their operations, they are more likely to add a new sensing technology, instead of throwing in yet another identical camera.
In short, Prophesee could be the company empowering Sony to enter the event-driven image sensor market, just as SoftKinetic made it possible for Sony to develop “DepthSense” solutions and advance them into smartphones.
Now, with its DepthSense technology, Sony is fitting ToF image sensors into many Android smartphones. This adds to the host of CMOS image sensor design wins Sony has already amassed in the front- and rear-facing cameras of smartphones.
>> This article was originally published on our sister site, EE Times.