ESC Fall 2007 Preview: If you want to know what’s wrong with your software, break it


T. Adrian Hill must often feel a kinship with the salmon that each year swim upstream against the current, especially when he is called on to speak about software stress testing.

A member of the Principal Professional Staff at the Johns Hopkins Applied Physics Laboratory, he has led embedded software development on numerous NASA missions, including the Hubble Space Telescope. The software stress testing techniques developed at the Lab, he said, are in many ways the antithesis of the traditional software acceptance testing that is the norm for most embedded systems designs.

At the upcoming Embedded Systems Conference in Boston, starting September 18, he will be presenting a class on Stress Testing Embedded Software Applications (ESC-302), which involves, in essence, finding out what is wrong with the software by breaking it, or by stressing it to the breaking point.

In his class he will go into detail on the methodologies developed at the Lab for stress testing the software flown on three recently launched NASA missions: the MESSENGER mission to Mercury, launched in August 2004; the New Horizons mission to the Pluto-Kuiper Belt, launched in January 2006; and the STEREO mission to study the Sun close up, launched in October 2006.

In the traditional approach, test engineers develop and execute tests defined to validate software requirements. “Testing is a standard phase in nearly every software development methodology,” said Hill. “Test engineers develop and execute tests that are defined to validate software requirements. The tests tend to be rigid, with specific initial conditions and well-defined expected results.

The tests typically execute the software within the limits prescribed by the software design. While these tests are often complemented with System Tests or Use Case Tests, these higher-level scenarios still conform to the design bounds of the software.”

Software stress testing, however, runs counter to these traditional approaches, said Hill. “Stress testing involves intentionally subjecting software to unrealistic loads while denying it critical system resources,” he said. “The software is intentionally exercised ‘outside the box,’ and known weaknesses and vulnerabilities in the software design may be specifically exploited.”

However, there are some caveats to keep in mind when conducting such tests, according to Hill. “Degraded performance of a system under stress may be deemed perfectly acceptable; thus, the interpretation of test results and the definition of pass/fail criteria are more subjective,” he said.

“Furthermore, a test that stresses one aspect of the software may lead to undesirable side effects in another area, so the behavior of the entire system must be evaluated to properly analyze the results.”

Even so, the results of such testing on the NASA missions, he said, prove the usefulness of this approach. There, the stress testing uncovered a total of 32 problems across the three software efforts, which Hill details and analyzes in his class.

Based on his experience, Hill believes software stress testing should be an essential component of any critical embedded software development program.

“While such testing is typically not written as a formal requirement, users have an expectation that the software demonstrates the characteristics of robustness and elasticity in response to any user actions,” he said. “Furthermore, stress testing can expose design flaws and software bugs that are not easily uncovered using traditional testing methods.

“The problems uncovered in stress testing often involve the complex interactions between tasks, such as missed real-time deadlines, deadlocks, race conditions, and reentrancy issues. A formal and rigorous approach to Software Stress Testing can uncover serious problems before the software is released into the user community.”

Other software development, debug, and testing classes worth checking out at the conference include “Managing Embedded Projects (ESC-106)” and “Strategies for Building Reliable Systems (ST-1),” taught by Jack Ganssle; “Test-driven development for embedded software (ESC-222),” presented by James Grenning; and “Keeping Agile: Test-centric software development (ESC-242),” by Brian Cruickshank.
