Automated vehicle future faces unanswered sensor questions
PARIS — Today, there’s no shortage of questions for executives and engineers at tech and auto companies grappling with the technology and business roadmap of automated vehicles (AVs). Three big unanswered questions, however, stand out.
Egil Juliussen, director of research for automotive infotainment and advanced driver-assistance systems (ADAS) at IHS Markit, laid out the following as the “unanswered questions” that will dog the auto industry in 2019:
Do we really need lidars?
Are tech and auto companies really ready to collaborate in pursuit of a “network effect” to advance driving software?
Will the industry solve the L2 to L3 handover problems?
Industry observers certainly see a new round of AV partnerships percolating among tech companies, tier ones and car OEMs. And several companies are trying out new technologies, such as ADAM, on the L2 to L3 handover quandary.
Speaking of the daunting dilemma human drivers face when machines suddenly hand control back to them, “expect the resurgence of interest in driver monitoring systems among tier ones and OEMs at the 2019 Consumer Electronics Show in Las Vegas next month,” Colin Barnden, lead analyst at Semicast Research, told us.
But, will ADAS cars and robocars really need lidars? Juliussen told us, “We are beginning to hear this a lot.” The issue follows the emergence of digital imaging radars “that can do a lot more than they used to,” he explained.
AEye to fuse camera/lidar data
Against this backdrop, a startup called AEye, based in Pleasanton, Calif., announced last week its first commercial product, “iDAR,” a solid-state lidar fused with an HD camera, for the ADAS/AV market.
The idea of autonomous vehicles without lidar has been floating around the tech community for almost a year. The proposition is tantalizing because many car OEMs regard lidars as too costly, and they agree that the lidar technology landscape is far from settled.
Although nobody is saying that a “lidar-free future” is imminent, many imaging radar technology developers discuss it as one of their potential goals. Lars Reger, NXP Semiconductors’ CTO, for example, told us in November that the company hopes to prove it’s possible.
AEye, however, moves into the is-lidar-necessary debate from another angle. The startup believes that car OEMs are reluctant to use current-generation lidars because their solutions today depend on an array of independent sensors that collectively produce a tremendous amount of data. “This requires lengthy processing time and massive computing power to collect and assemble data sets by aligning, analyzing, correcting, down sampling, and translating them into actionable information that can be used to safely guide the vehicle,” explained AEye.
But what if artificial intelligence could be used to selectively collect only the data that matters to an AV’s path planning, instead of assigning every pixel the same priority? That question inspired AEye to develop iDAR, Stephen Lambright, AEye’s vice president of marketing, explained to EE Times.
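The idea of weighting sensor attention by relevance can be illustrated with a toy sketch. AEye has not published its algorithm; the function and region names below are hypothetical, showing only the general principle of allocating a fixed scan budget by priority rather than scanning uniformly.

```python
# Illustrative sketch only, not AEye's method: divide a fixed number of
# scan samples among regions of interest in proportion to their priority.

def allocate_scan_budget(regions, total_samples):
    """regions: list of (name, priority) pairs; higher priority earns
    more samples. Returns a dict mapping region name to sample count."""
    total_priority = sum(p for _, p in regions)
    return {name: round(total_samples * p / total_priority)
            for name, p in regions}

# Hypothetical scene: a pedestrian ahead matters far more than empty sky.
regions = [("pedestrian_ahead", 8), ("vehicle_left", 4),
           ("road_edge", 3), ("empty_sky", 1)]
budget = allocate_scan_budget(regions, 1000)
print(budget["pedestrian_ahead"])  # 500 of 1,000 samples go to the pedestrian
```

Under this kind of scheme, compute cycles concentrate on objects that affect the vehicle’s path, which is the waste-avoidance goal the company describes.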
Indeed, AEye’s iDAR is “deeply rooted in the technologies originally developed for the defense industry,” according to Lambright. The startup’s CEO, Luis Dussan, previously worked on designing surveillance, reconnaissance, and defense systems for fighter jets. He formed AEye “to deliver military-grade performance in autonomous cars.”
Driving iDAR’s development were three principles that Dussan learned from the perception systems on military aircraft, according to Lambright: 1) never miss anything; 2) understand that objects are not created equal and require different attention; and 3) do everything in real time.
In short, the goal of iDAR was to develop a sensor fusion system with “no need to waste computing cycles,” said Aravind Ratnam, AEye’s vice president of products. The building blocks of iDAR include a 1,550-nm solid-state MEMS lidar, a low-light HD camera, and embedded AI. The system is designed to combine the 2D camera’s pixels (RGB) with the 3D lidar’s voxels (XYZ) into “a new real-time sensor data type” that delivers more accurate, longer-range, and more intelligent information, faster, to the AV’s path-planning system, according to Ratnam.
Notably, what AEye’s iDAR offers is not post-scan fusion of a separate camera and lidar system. By developing an intelligent artificial perception system that physically fuses a solid-state lidar with a hi-res camera, AEye says its iDAR “creates a new data type called dynamic vixels.” By capturing x, y, z, r, g, b data, AEye says that dynamic vixels “biomimic” the data structure of the human visual cortex.
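Conceptually, a fused x, y, z, r, g, b sample pairs each lidar return with the color of the camera pixel it projects onto. The sketch below is an illustration of that idea only; AEye has not disclosed its data layout, and the `Vixel` class, `fuse` function, and pinhole-camera parameters are all assumptions made for this example.

```python
from dataclasses import dataclass

@dataclass
class Vixel:
    """Hypothetical fused sample: lidar geometry (x, y, z) in meters
    plus camera color (r, g, b). Not AEye's actual data structure."""
    x: float
    y: float
    z: float
    r: int
    g: int
    b: int

def fuse(points, image, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    """Project each lidar point into the camera frame with a simple
    pinhole model (assumed intrinsics fx, fy, cx, cy) and attach the
    color of the pixel it lands on. Points behind the camera or
    outside the frame are dropped."""
    h, w = len(image), len(image[0])
    vixels = []
    for x, y, z in points:
        if z <= 0:                    # behind the camera plane
            continue
        u = int(fx * x / z + cx)      # horizontal pixel coordinate
        v = int(fy * y / z + cy)      # vertical pixel coordinate
        if 0 <= u < w and 0 <= v < h:
            r, g, b = image[v][u]
            vixels.append(Vixel(x, y, z, r, g, b))
    return vixels

# Tiny usage example: a uniform 640x480 frame and one point on the
# optical axis, 5 m ahead, which projects to the image center.
image = [[(10, 20, 30) for _ in range(640)] for _ in range(480)]
out = fuse([(0.0, 0.0, 5.0)], image)
print(out[0])  # Vixel(x=0.0, y=0.0, z=5.0, r=10, g=20, b=30)
```

Doing this fusion at capture time, rather than matching two finished scans afterward, is what distinguishes AEye’s approach from the post-scan fusion the article mentions.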
>> Continue reading page two of this article on our sister site, EE Times: "2019 AV Sensors: Vision, Radar, Lidar, iDAR."