Vendors compete for lead in Robo-Car race

By Junko Yoshida
April 04, 2017

MADISON, Wis. – As more automakers integrate a variety of sensors into ADAS/autonomous cars, they often justify sensor fusion as “critical to the safety” of highly automated driving.

Often left unsaid, though, are details on the data — raw or processed — they are using and the challenges they face in fusing different types of sensory data. As Ian Riches, director of the global automotive practice at Strategy Analytics, confirmed, “Sensor fusion today is not done on the raw sensor data.  Each sensor typically has its own local processing.”

Mentor Graphics Corp. will come to SAE World Congress in Detroit this week to demonstrate how “raw data fusion” in real time from a variety of modalities – radar, lidar, vision, ultrasound, etc. – can provide “dramatic improvements in sensing accuracy and overall system efficiency.”

Mentor is rolling out an automated driving platform called DRS360, designed to “directly transmit unfiltered information from all system sensors to a central processing unit, where raw sensor data is fused in real time at all levels,” the company said.

Glenn Perry, vice president and general manager of Mentor Graphics’ Embedded Systems Division, told EE Times that there are “subtle but important differences” between “sensor fusion” and “raw data fusion.”

Typically, sensors supplied to automakers come in a module designed to pre-process data. As Riches explained, “Data sent from a camera to a fusion system, for example, will not be the actual image data, but rather a description of the areas of interest within that image – e.g., here is a white line, here is a car, here is a traffic sign. Any fusion is then done on that much higher-level data.”

How sensor fusion is done today, using processed data from separate sensor modules (Source: Mentor Graphics)
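As a purely illustrative sketch (not any vendor’s API; the DetectedObject and fuse_object_lists names are hypothetical), the Python below shows the object-level fusion Riches describes: each sensor’s local processor has already reduced its raw data to a short list of detections, and the central fusion step only reconciles those lists.

```python
# Hypothetical sketch of object-level ("processed data") fusion.
# Each sensor module ships detections, not raw pixels or point clouds.
from dataclasses import dataclass
from typing import List

@dataclass
class DetectedObject:
    label: str         # e.g. "lane_line", "car", "traffic_sign"
    x_m: float         # longitudinal position, meters
    y_m: float         # lateral position, meters
    confidence: float  # 0..1, assigned by the sensor's own processor

def fuse_object_lists(per_sensor: List[List[DetectedObject]],
                      gate_m: float = 1.0) -> List[DetectedObject]:
    """Merge detections from different sensors that agree within gate_m,
    keeping whichever report carries the higher confidence."""
    fused: List[DetectedObject] = []
    for detections in per_sensor:
        for obj in detections:
            match = next((f for f in fused
                          if f.label == obj.label
                          and abs(f.x_m - obj.x_m) <= gate_m
                          and abs(f.y_m - obj.y_m) <= gate_m), None)
            if match is None:
                fused.append(obj)
            elif obj.confidence > match.confidence:
                fused.remove(match)
                fused.append(obj)
    return fused

# Camera and radar each report the same car slightly differently;
# only these pre-digested descriptions ever reach the fusion ECU.
camera = [DetectedObject("car", 24.8, 1.1, 0.85)]
radar  = [DetectedObject("car", 25.1, 0.9, 0.70)]
print(fuse_object_lists([camera, radar]))
```

Whatever detail each sensor’s local processor discarded before this step is unrecoverable at fusion time, which is the trade-off raw data fusion aims to avoid.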

Complex task
Mentor believes that by eliminating the pre-processing microcontrollers in each sensor module at the end nodes and working with raw data instead, designers of ADAS/autonomous cars can achieve a big boost in “real-time performance, significant reductions in system cost and complexity, and access to all captured sensor data for the highest resolution model of the vehicle’s environment and driving conditions.”
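Purely as an assumption about what fusing raw data “at all levels” could look like in practice (this is not DRS360 code; the grid parameters and function names are invented), the sketch below accumulates raw lidar points and raw radar returns into one shared occupancy grid before any object is extracted:

```python
# Hypothetical sketch of centralized raw data fusion: raw measurements from
# two modalities build evidence in one shared environment model (a toy grid).
import numpy as np

GRID_RES_M = 0.5   # cell size in meters
GRID_SIZE = 100    # 50 m x 50 m area around the vehicle

def to_cell(x_m: float, y_m: float):
    """Map a vehicle-frame position to a grid cell, or None if out of range."""
    ix = int(x_m / GRID_RES_M) + GRID_SIZE // 2
    iy = int(y_m / GRID_RES_M) + GRID_SIZE // 2
    return (ix, iy) if 0 <= ix < GRID_SIZE and 0 <= iy < GRID_SIZE else None

def fuse_raw(lidar_points, radar_returns):
    """Accumulate evidence from raw measurements of both modalities."""
    grid = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.float32)
    for x, y in lidar_points:          # raw lidar hits, vehicle frame
        cell = to_cell(x, y)
        if cell:
            grid[cell] += 1.0
    for x, y, rcs in radar_returns:    # raw radar returns weighted by cross-section
        cell = to_cell(x, y)
        if cell:
            grid[cell] += rcs
    return grid

# Both modalities see an obstacle about 10 m ahead; the evidence lands
# in the same cell without either sensor having declared an "object" first.
grid = fuse_raw(lidar_points=[(10.2, 0.1), (10.3, 0.0)],
                radar_returns=[(10.0, 0.2, 2.5)])
print(grid.max(), np.unravel_index(grid.argmax(), grid.shape))
```

Because nothing has been thrown away per sensor, the central processor can weigh modalities against each other at the measurement level, which is broadly where Mentor claims its accuracy and efficiency gains come from.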

Mentor developed the DRS360 platform to meet the safety, cost, power, thermal and emissions requirements for deployment in ISO 26262 ASIL D-compliant systems.

As Perry explained, the platform includes a Xilinx Zynq FPGA for raw data fusion, an SoC (either Arm- or x86-based) for ADAS and automated driving functions, and an MCU (such as Infineon’s safety microcontroller).
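For orientation only, here is a hypothetical outline (not Mentor’s software stack; the task lists are illustrative guesses not specified in the article) of the three compute elements Perry names and a plausible division of labor between them:

```python
# Hypothetical outline of the three-way partitioning described above:
# FPGA for raw fusion, SoC for driving functions, safety MCU for supervision.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ComputeElement:
    name: str
    role: str
    example_tasks: List[str] = field(default_factory=list)

platform = [
    ComputeElement("Xilinx Zynq FPGA", "raw sensor data fusion",
                   ["time-align raw sensor streams", "build the environment model"]),
    ComputeElement("SoC (Arm- or x86-based)", "ADAS / automated driving functions",
                   ["object classification", "path planning"]),
    ComputeElement("Safety MCU (Infineon)", "safety supervision toward ASIL D",
                   ["plausibility checks", "fail-safe fallback"]),
]

for elem in platform:
    print(f"{elem.name}: {elem.role} ({', '.join(elem.example_tasks)})")
```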

DRS360 offers centralized raw data fusion (Source: Mentor Graphics)

Phil Magney, founder & principal advisor at Vision Systems Intelligence, told us, “Sensor fusion is a complex task and doing it with RAW data makes it even harder.”

Raw data fusion, Magney explained, allows computing resources to be consolidated more efficiently into a centralized system, despite greater challenges in algorithm integration.

Continue reading page two on Embedded's sister site, EE Times: "Mentor in Robo-Car race with Mobileye, Nvidia."

 
