
Bumps in the road to AV safety

July 29, 2019

By Junko Yoshida

Few industry insiders were surprised last week when Cruise, General Motors Co.’s self-driving unit, dealt autonomous driving a major setback by delaying deployment of the first GM robotaxi, which it had promised to introduce in late 2019.

The announcement was an admission of the fast-growing realization that getting a highly automated vehicle on the road might not be rocket science, but it's not far from it either.

The lingering mystery is whether there's a practical way for AV companies to demonstrate that their vehicles are safe. “Historically, the autonomous vehicle industry has operated under a cloak of secrecy,” observed Phil Koopman, CTO of Edge Case Research.

Winning the trust race

But if AV companies are serious about winning the trust race, more honesty and transparency about their self-driving cars is essential.


Jack Weast

Last week, we caught up with Jack Weast, vice president of autonomous vehicle standards and senior principal engineer at Intel. He said, “I think both industry and media have been complicit in hyping this and not being open and honest enough about the realities of the technology.”

Indeed, we the media share some blame. Many reports tend to frame the topic as an AV horse race. Weast said, “Yeah, the horse race has encouraged people to declare crazy dates [for AV launch].”

The AV industry “aspires to have a zero-accident future, but as long as there's human drivers mixed in on the roads with automated vehicles, there's going to be accidents,” noted Weast. “I think the trick is figuring out this question of how safe is safe enough, and how do you accept that?”

The key questions are: How does any AV company know when its AVs are safe enough for commercial launch? And who gets to decide that?

Explaining Cruise's decision to delay robotaxi deployment, GM Cruise CEO Dan Ammann blogged that “in order to reach the level of performance and safety validation required to deploy a fully driverless service in San Francisco,” GM will significantly increase “testing and validation miles.”

Unfortunately, if there is a set quantity of “testing and validation miles” Cruise must log before its commercial robotaxi launch, Ammann neglected to mention it. Nor did he discuss the specific “thresholds” or “requirements” he believes his AVs must clear.


Phil Koopman

Thus far, few AV companies have disclosed their test goals. Nor have they articulated how they measure the safety of their AVs before, during, or after testing.

Edge Case Research’s Koopman would like to see, at minimum, AV companies “publishing safety metrics to demonstrate they are operating safely,” before test cars hit the streets.
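What such published metrics might look like is left open; the sketch below is one plausible, minimal form — per-1,000-mile rates for safety-relevant events. The event categories, field names, and figures are our own assumptions for illustration, not drawn from Koopman's work or any company's actual reporting.

```python
# Minimal sketch of a leading-indicator safety metric for AV road testing.
# The event categories and log format are illustrative assumptions only;
# they are not drawn from any published AV company report or standard.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TestSession:
    miles: float             # miles driven in this session
    near_misses: int         # hard braking / evasive maneuvers
    traffic_violations: int  # e.g., missed stop line, speed excursion
    disengagements: int      # safety-driver takeovers

def per_1000_mile_rates(sessions: List[TestSession]) -> Dict[str, float]:
    """Aggregate a test campaign into per-1,000-mile event rates."""
    total_miles = sum(s.miles for s in sessions)
    if total_miles == 0:
        raise ValueError("no miles logged")
    scale = 1000.0 / total_miles
    return {
        "near_misses_per_1k_mi": scale * sum(s.near_misses for s in sessions),
        "violations_per_1k_mi": scale * sum(s.traffic_violations for s in sessions),
        "disengagements_per_1k_mi": scale * sum(s.disengagements for s in sessions),
    }

# Hypothetical campaign: three sessions on public roads
campaign = [
    TestSession(miles=120.0, near_misses=1, traffic_violations=0, disengagements=3),
    TestSession(miles=340.5, near_misses=0, traffic_violations=1, disengagements=5),
    TestSession(miles=210.0, near_misses=2, traffic_violations=0, disengagements=4),
]
print(per_1000_mile_rates(campaign))
```

The specific fields matter less than the principle: the denominator (miles) and numerators (events) are disclosed before and during testing, so outsiders can watch the trend rather than wait for the next headline.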

But local governments today demand very little from companies seeking permission to run public road tests. The public is kept in the dark until the next accident, or until another AV company admits its robotaxis are not road-ready.

Koopman wrote in his blog, “In fairness, if there is no information about public testing risk other than hype about an accident-free far-flung future, what is the public supposed to think? Self-driving cars won't be perfect. The goal is to make them better than the current situation. One hopes that along the way things won't actually get worse.”

5 Things Still Missing in AV Development Practices

EE Times talked to several safety experts and automotive analysts to find out what paths the AV industry might take to win public trust. Combining what we've heard from multiple sources, there are five things still missing in AV suppliers’ vehicle development practices:

  1. Establishing metrics for testing,
  2. Adopting “safety by design” AV development,
  3. Sharing data collected during testing among AV developers,
  4. Building a feedback loop, and
  5. Running more sophisticated system-level simulations.

1. Making the safety case before public road testing

First, let’s talk about the safety of public-road testing.

Koopman insists that AV testers must ensure safety during street tests. If an AV causes another Uber-like fatality, that one accident could kill the AV market altogether. He listed three common misconceptions about public AV testing.

a) A magical deadline.
This is the belief that there will come a day when AV companies wrap up public road testing. Then automakers will launch perfect, accident-free, highly automated vehicles and they’ll never have to test them again.

“Not so,” said Koopman. “Public road testing will be with us for a long, long time.” Every time a new sensor is added to an AV model, or when a robotaxi starts driving in a new ODD (Operational Design Domain), a new set of road tests is required. “Testing will never go away.”

b) Safety drivers.
Even after the Uber fatality, the standard argument for public road testing is, “Don’t worry. We have a safety driver.”

In Koopman’s view, safety drivers are not so all-fired safe. Indeed, the probability of a supervisor failure appears to increase as the autonomy failure rate decreases, he explained. But what if we were to install not one but two safety drivers? This is not much better, said Koopman. There have been incidents of both pilots in a passenger aircraft falling asleep.

Well then, how about a driver monitoring system? “That’s not a guarantee either,” Koopman said. “There are reports that some truck drivers have learned to sleep without closing their eyes.”

Safety drivers, pilots and truck drivers are all human. They get bored and they get tired. “You can’t bargain with human nature,” said Koopman.
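The shape of that argument can be captured in a toy model — our construction with made-up numbers, not Koopman's analysis: the rarer the autonomy failure, the longer the gap between events, and the more likely a bored supervisor misses the one that matters.

```python
# Toy model with invented parameters, intended only to illustrate the shape of
# the argument: as autonomy failures become rarer, the gaps between events grow,
# and the probability that the safety driver misses a given failure rises.

def supervisor_miss_probability(mean_miles_between_failures: float,
                                base_miss_prob: float = 0.02,
                                vigilance_decay_miles: float = 500.0) -> float:
    """Probability the safety driver fails to catch a given autonomy failure.
    Assumes vigilance degrades with the expected gap between events."""
    # Saturating growth toward 1.0 as failures become rarer.
    lapse = 1.0 - (vigilance_decay_miles /
                   (vigilance_decay_miles + mean_miles_between_failures))
    return min(1.0, base_miss_prob + lapse)

for gap in (10, 100, 1_000, 10_000):  # miles between autonomy failures
    print(f"{gap:>6} mi between failures -> "
          f"miss probability ~{supervisor_miss_probability(gap):.2f}")
```

With these illustrative parameters, the miss probability climbs from roughly 4% when failures come every 10 miles to above 95% when they come every 10,000 miles: the vigilance problem gets worse precisely as the automation gets better.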

c) Disengagement
By law, people actively testing self-driving cars on public roads in California are required to disclose the number of miles driven and how often human drivers had to take control, a moment of crisis known as a “disengagement.”

Koopman doesn’t believe this is the right metric. Reporting disengagement counts tends to incentivize test operators to minimize interventions, which could lead to unsafe testing.
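A small sketch makes the perverse incentive concrete. The fleet figures below are invented, not taken from any DMV filing: the same hypothetical fleet looks ten times "safer" on this metric simply because its drivers are coached to intervene less often.

```python
# Sketch of the California DMV-style metric: miles driven per disengagement.
# The fleet figures are invented for illustration; they are not from any
# company's actual filing.

def miles_per_disengagement(total_miles: float, disengagements: int) -> float:
    """Higher looks 'better' -- which is the perverse incentive: the number
    improves if drivers simply intervene less, regardless of whether the
    underlying system got any safer."""
    return float("inf") if disengagements == 0 else total_miles / disengagements

# Same hypothetical fleet, two intervention policies:
cautious = miles_per_disengagement(50_000, disengagements=500)   # drivers take over early
hands_off = miles_per_disengagement(50_000, disengagements=50)   # drivers coached to wait
print(f"cautious policy : {cautious:,.0f} miles per disengagement")
print(f"hands-off policy: {hands_off:,.0f} miles per disengagement")
```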

Any serious effort to build a safer AV must use disengagement data to improve the technology, not to tout victory in a safety contest. AV testing operators should see every incident, mishap and near-miss as a failure in the test-program safety process.

Koopman told EE Times, “What scares me is the ones safety drivers didn’t notice. We don’t know if we just got lucky or we should do something before we get unlucky.”

Given that public trust in autonomous vehicle technology has already eroded, he added, “Each new mishap has the unfortunate potential to make that situation worse, regardless of the technical root cause.”

Koopman wants AV testing operators to make the safety case first, before hitting the road. Citing his paper “Safety Argument Considerations for Public Road Testing of Autonomous Vehicles,” co-authored with Beth Osyk, Koopman said, “We wrote that paper to provide a public starting point.” He added, “I'm sure other approaches would work as well, but ultimately all the things in that paper must be dealt with one way or another.”

High-Level On-Road Testing Safety Argument (Source: technical paper by Phil Koopman)

>> For the next four things still missing from AV development practices, continue reading page two of this article originally published on our sister site, EE Times: "The Road to AV Safety Has Potholes."
