According to a study from the American Automobile Association (AAA), the active driving assistance (ADA) systems that automate steering and braking in a growing number of vehicles are not providing reliable safety benefits; testers recorded a disruption or disengagement roughly every eight miles.
The study found that over the course of 4,000 miles of real-world driving, vehicles equipped with active driving assistance (ADA) systems experienced some type of issue every eight miles, on average. Researchers noted instances of the systems struggling to keep the test vehicles in their lane and allowing them to come too close to other vehicles or guardrails. AAA also found that active driving assistance systems, those that combine vehicle acceleration with braking and steering, often disengage with little notice, almost instantly handing control back to the driver. This can lead to a dangerous scenario if a driver has become disengaged from the driving task or has become too dependent on the system. AAA recommends that manufacturers increase the scope of testing for active driving assistance systems and limit their rollout until functionality is improved to provide a more consistent and safer driver experience.
Mark Hermeling, senior director and automotive software security/safety expert at GrammaTech, which provides code-testing technology to automotive and aviation manufacturers for embedded applications like ADAS, had the following to say about the findings:
“ADAS systems consist of complex software components. Some of these components interpret the driving environment using multiple sets of sensors, such as video, radar and LiDAR, as well as sensors inside the car. Different software then makes decisions based on this information, and lastly a separate set of components takes action. Often these interconnected components have been built by different vendors. Extensive code testing must be performed during the various development phases of these individual software components to ensure that, when integrated, they work as intended in real-world driving conditions. Unlike traditional software applications, where defects can be detected and corrected after release, ADAS software cannot afford to be released with bugs, due to the safety implications involved and cited by the AAA study.”
Active driving assistance systems, classified as Level 2 driving automation on SAE International's six-level scale (0-5), are the advanced driver assistance systems (ADAS) that provide the highest level of automated vehicle technology available to the public today. This means that for a majority of drivers, their first or only interaction with vehicle automation is through these types of systems, which, according to AAA, are far from 100% reliable.
“AAA has repeatedly found that active driving assistance systems do not perform consistently, especially in real-world scenarios,” said Greg Brannon, director of automotive engineering and industry relations. “Manufacturers need to work toward more dependable technology, including improving lane keeping assistance and providing more adequate alerts.”
AAA tested the functionality of active driving assistance systems in real-world conditions and in a closed-course setting to determine how well they responded to common driving scenarios. On public roadways, nearly three-quarters (73%) of errors involved instances of lane departure or erratic lane position. While AAA’s closed-course testing found that the systems performed mostly as expected, they were particularly challenged when approaching a simulated disabled vehicle. When encountering this test scenario, in aggregate, a collision occurred 66% of the time and the average impact speed was 25 mph.
“Active driving assistance systems are designed to assist the driver and help make the roads safer, but the fact is, these systems are in the early stages of their development,” added Brannon. “With the number of issues we experienced in testing, it is unclear how these systems enhance the driving experience in their current form. In the long run, a bad experience with current technology may set back public acceptance of more fully automated vehicles in the future.”
AAA’s 2020 automated vehicle survey found that only about one in ten drivers (12%) would trust riding in a self-driving car. To increase consumer confidence in future automated vehicles, it is important that car manufacturers perfect the functionality of systems available now, such as active driving assistance, before deploying them in a larger fleet of vehicles. AAA has met with industry leaders to provide insight from the testing experience and recommendations for improvement. The insights are also shared with AAA members and the public to inform their driving experiences and vehicle purchase decisions.
Addressing whether current safety-critical software testing standards are meeting the mark, and what he recommends, GrammaTech’s Mark Hermeling told Embedded, “Safety standards have been around for a while and they are geared towards best practices to deliver high-quality code that meets requirements. A great example is the use of the MISRA coding standard; it is a must-have to deliver high-quality, easy-to-understand code. However, it is not a security standard. MISRA compliance is an insufficient metric in this day and age, where cyber security is important as well. MISRA-compliant code can very well still harbor a cyber security vulnerability.”
“Deep static analysis using data flow, control flow and abstract execution analysis is required to find these issues. As an example, one of our customers was building ADAS software. They had MISRA-compliant code, but using CodeSonar (specifically its abstract execution engine), they found a massive buffer overrun in error-handling code. This path was not actively being tested because it was error-handling code. Still, given the millions of hours that an ADAS system will log during its lifetime, the chance was significant that this issue would have been hit and would have resulted in a software, and possibly vehicular, crash.”
Asked what can be done to address this, Hermeling added, “Safety alone is not sufficient; people need to focus more on cyber security. Without security, there is no safety. Coding standards alone will not get you there; static application security testing (SAST) is required. This needs to happen at every software change, and helps ensure the quality and security of the code. Static analysis should include abstract execution, not just coding standards.”
Background to the AAA study
The building blocks of autonomy are already being deployed within vehicles available to the general public. Examples include adaptive cruise control, lane keeping assistance, automatic emergency braking, parking assistance and more. Active driving assistance (ADA) integrates both longitudinal and lateral motion control and is the most advanced semi-autonomous vehicle technology available to the consumer.
Within the industry, these systems are known as Level 2 driver support features. The study emphasizes that currently available systems are only capable of assisting the driver within certain environments; the driver must remain attentive and maintain control of the vehicle at all times. The purpose of the research was to provide an ongoing evaluation of publicly available ADA systems. Within this work, the performance of ADA systems on vehicles available for sale throughout the U.S. was assessed. Evaluations were conducted on a closed course and on public-access roadways to characterize performance in terms of lane-keeping and adaptive-cruise functionality.
According to the National Highway Traffic Safety Administration (NHTSA), there were approximately 6,734,000 police-reported motor vehicle crashes across the U.S. in 2018, resulting in 2,710,000 injuries and 36,560 fatalities. While society waits to see if the widespread deployment of fully autonomous vehicles ultimately becomes a reality, some advanced driver assistance systems (ADAS) are already contributing to a reduction in crash rates. A 2018 AAA Foundation analysis estimated that if driver assistance technologies were installed on all vehicles, they would have had the potential to help prevent or mitigate roughly 40 percent of all crashes involving passenger vehicles.
Through Foundation research, AAA is also working to better understand driver performance and perceptions of the technology. ADA systems are a subset of ADAS. The safety benefits of available ADA systems are less certain because they are typically designed to be engaged in highway driving environments, where crashes are relatively infrequent. Even if all interstate miles were covered by vehicles with ADA systems that prevented all fatalities and injuries, the maximum overall benefit would be a 17 percent reduction in crash fatalities and a 9 percent reduction in crash injuries.
Regardless of actual safety benefits, current ADA systems are designed to reduce the driver's workload in highway driving environments, which can potentially promote fatigue as a result of monotonous surroundings. The number of vehicles with an ADA system as either standard or optional equipment on at least one trim level continues to increase with each subsequent model year. For the 2020 model year, ADA systems are standard on 10 percent of new vehicles sold in the United States. This figure increases to 34 percent for new vehicles that have an ADA system as either standard or optional on some or all trim levels.
AAA conducted closed-course testing and naturalistic driving in partnership with the Automobile Club of Southern California’s Automotive Research Center and AAA Northern California, Nevada and Utah’s GoMentum Proving Grounds. Using a defined set of criteria, AAA selected the following vehicles for testing: 2019 BMW X7 with Active Driving Assistant Professional, 2019 Cadillac CT6 with Super Cruise, 2019 Ford Edge with Ford Co-Pilot360, 2020 Kia Telluride with Highway Driving Assist and 2020 Subaru Outback with EyeSight. Test vehicles were sourced from the manufacturer or directly from dealer inventory. The 2019 Cadillac CT6 and the 2019 Ford Edge were evaluated only within naturalistic environments. For specific methodology regarding testing equipment, closed-course test scenarios and naturalistic routes, please refer to the full report.