Tech companies refocus on Level 2+ driver assistance

LAS VEGAS — The Consumer Electronics Show this week revealed the auto industry’s new normal: Autonomous vehicle (AV) tech suppliers are shifting into reverse gear, moving back to Level 2 driver-assistance cars instead of stepping on the gas toward the Level 4/Level 5 driverless future.

Nvidia and Intel/Mobileye, two technology giants racing to enable full autonomy, have brought to Las Vegas an almost identical pitch: “Let’s start saving people’s lives today rather than waiting for many years to come.” They are proposing to do so by trickling down the robocar technologies that they’ve developed to advanced driver-assistance systems (ADAS) vehicles.

Tech suppliers and carmakers are now drawing new battle lines in the intentionally vague realm of “Level 2+.”

Nvidia introduces ‘Level 2+’ vehicle at CES.

Never mind that Level 2+ is not a formal automation level defined by SAE. Tech suppliers are independently defining their own Level 2+, each proposing proprietary AV technologies to be incorporated in ADAS cars.

Mobileye is pitching the two pillar technologies it developed to enable AVs, crowd-sourced mapping (dubbed Roadbook) and Responsibility-Sensitive Safety (RSS), for its Level 2+ ADAS. Nvidia, meanwhile, is nudging carmakers to go whole-hog, adopting its processing-power-intensive Nvidia Drive AGX Xavier to enable "Nvidia Autopilot" and a driver monitoring system in its own fully fledged Level 2+.

The tech suppliers’ renewed focus on ADAS is welcome. But the move raises a few questions. First, the rush to promote a loosely defined L2+ will inevitably sow confusion in the market. Second, it remains unclear whether automakers, who have never really made money by pitching safety, will buy into the idea of putting more expensive AV technologies into ADAS cars. Third, given underlying AV technologies that have yet to be exhaustively tested in the real world, how much safer can carmakers actually make their ADAS vehicles?

Just to be clear, the industry’s aspirational goal of self-driving cars remains intact. As Chris Jacobs, vice president of autonomous transportation and automotive safety at Analog Devices Inc., predicted, the industry is following two tracks: ADAS and fully autonomous vehicles (à la robo-taxis).

But tech suppliers are using much more cautious language to describe the state of the AV market.


For example, Amnon Shashua, senior vice president at Intel and president and CEO of Mobileye, an Intel Company, noted:

“We know self-driving cars are technically possible. But the true challenge to get them out of the lab and onto the roads lies in answering more complex questions, like those around safety assurance and societal acceptance.”

In proposing Nvidia-defined L2+, Danny Shapiro, senior director of automotive at Nvidia, told EE Times that the move to L2+ creates an opportunity “to bring Level 4 capability down to the mainstream vehicle segment in a very near term as opposed to some vehicles that are many, many years out.”

Diverging Level 2+ definitions
Phil Magney, founder and principal at VSI Labs, believes that “the SAE ratings are becoming less relevant and the lines are blurring.” Especially with Level 2+, he acknowledged, “It is getting confusing.”

Level 2+ can mean a lot of things, but “mostly people are using that label to identify something that is more than active lane keeping plus adaptive cruise control,” observed Magney. “Or more than camera plus radar.”

While he perceives Level 2+ as “a kind of middle ground between 2 and 3,” Magney noted, “I like to call it ADAS 2.0. This means all of the active safety systems plus automation features.”

In an interview with EE Times, Jack Weast, principal engineer & chief architect of autonomous driving solutions at Intel, described Level 2+ as “just a small example of what it would mean to apply fully autonomous driving capabilities to a human-driven vehicle.”

Intel/Mobileye-defined Level 2+ is a camera-based ADAS system that uses both Road Experience Management (REM) technology and RSS, a mathematical approach originally designed for AVs’ safer decision making.

Mobileye CEO explains the company’s AV/ADAS strategy at CES: building blocks of autonomous vehicles can be leveraged to achieve “Vision Zero” for safety. (Source: Mobileye)

At the heart of Mobileye’s REM is a massive trove of data crowd-sourced from Mobileye’s cameras. REM allows Mobileye to build and maintain “an accurate map of the environment in near-real time.” Weast explained that such a highly accurate REM map “with exact lane boundaries and widths” makes it possible for Level 2+ vehicles to stay in their lanes on roads where lane markings are invisible, whether worn away or obscured by weather such as heavy (or even light) rain or snow.
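In rough terms, the map fallback Weast describes could be sketched as follows. This is an illustrative sketch, not Mobileye’s implementation: the function name, the confidence threshold, and the lateral-offset convention are all assumptions for the sake of the example.

```python
# Illustrative sketch of map-assisted lane keeping: prefer live camera lane
# detections, but fall back to stored map geometry when markings are unreadable.

def lane_center_offset(camera_lanes, map_lanes, camera_confidence,
                       min_confidence=0.7):
    """Return the lateral offset (meters) of the lane center from the vehicle.

    camera_lanes / map_lanes: (left, right) boundary offsets from the vehicle,
    negative to the left, positive to the right. camera_confidence in [0, 1].
    """
    if camera_lanes is not None and camera_confidence >= min_confidence:
        left, right = camera_lanes   # trust the camera's detected boundaries
    else:
        left, right = map_lanes      # markings unreadable: use map geometry
    return (left + right) / 2.0      # midpoint between the two boundaries
```

With a low-confidence camera reading, `lane_center_offset((-1.8, 1.9), (-1.85, 1.85), 0.2)` ignores the camera and centers on the map boundaries.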

Furthermore, Mobileye is proposing Level 2+ vehicles with RSS. Weast believes that RSS technology can be used to augment automatic emergency braking (AEB); Mobileye calls the result automatic preventative braking (APB). Using formulas to determine the moment when the vehicle enters a dangerous situation, APB returns the vehicle to a safer position by applying small, barely noticeable preventative braking rather than braking suddenly to prevent a collision, the company explained.
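The “formulas” in question can be illustrated with the safe longitudinal following distance from the published RSS model. The sketch below uses that formula with illustrative parameter values (response time, acceleration, and braking limits are assumptions, not Mobileye’s production settings), plus a hypothetical APB-style check.

```python
# Sketch of the RSS safe longitudinal distance: the worst-case gap needed so
# that if the front car brakes hard, the rear car (after a response delay
# during which it may still accelerate) can brake comfortably and not collide.

def rss_safe_distance(v_rear, v_front, rho=0.5,
                      a_max_accel=3.0, a_min_brake=4.0, a_max_brake=8.0):
    """Minimum safe following distance in meters; speeds in m/s.

    rho: rear car's response time (s); a_max_accel: its worst-case
    acceleration during rho; a_min_brake: its guaranteed braking;
    a_max_brake: the front car's hardest possible braking.
    """
    v_after = v_rear + rho * a_max_accel           # rear speed after response
    d = (v_rear * rho                              # distance during response
         + 0.5 * a_max_accel * rho ** 2
         + v_after ** 2 / (2 * a_min_brake)        # rear car's stopping distance
         - v_front ** 2 / (2 * a_max_brake))       # front car's stopping distance
    return max(d, 0.0)                             # clamp at zero

def apb_needed(gap, v_rear, v_front):
    """APB-style check: gentle braking once the gap falls below the safe one."""
    return gap < rss_safe_distance(v_rear, v_front)
```

At 20 m/s for both cars, `rss_safe_distance(20.0, 20.0)` works out to about 43.2 m with these defaults, so a 30 m gap would trigger the gentle correction while a 50 m gap would not.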

>> This article was originally published on our sister site, EE Times: “Nvidia, Mobileye Scheme ‘Level 2+’.”
