Before I became a software consultant, I spent over 10 years in industry developing and testing software. Back then the “waterfall” process dominated software development, with its distinct phases of analysis, design, code and test.
Each phase was performed in isolation with the output of one being the input for the next. The final output was a working system which passed all tests.
With the waterfall approach, the purpose of the analysis phase was to refine the stakeholders' vision of the system and produce a list of requirements, with those for software being itemized in the Software Requirements Specification (SRS). The SRS, always a revered document whose quality was typically gauged by its thickness and weight, identified your status in the project team.
You were somebody if you could proudly display it on your bookshelf! Of course, no sooner had a print run finished than the SRS was out of date due to a newly-found error or ambiguity.
No matter how much a project manager wished for the SRS to be error-free, it never was, and the change log would keep growing until a new print run became inevitable.
While the waterfall adherent's wish for a stable requirements baseline was understandable, the process by its very design dooms a project to a constant state of instability.
At its core is an ideal belief that each phase flows with near-perfection into the next and any errors or inconsistencies can be quickly smoothed out via a feedback loop. This may work for two phases; you effectively have a cut-down spiral process.
But when you introduce a third or fourth phase, the feedback loops multiply to the point of being unmanageable. Thus when testing uncovers a problem with the SRS, that problem ripples up through all the phases, with source code and design inevitably affected. Once the SRS is updated, changes ripple back down again, no doubt spinning off secondary and tertiary problems along the way.
Contemporary software development processes and practices address many of the deficiencies found in the “waterfall” process. The Unified Process and Agile methods evangelize iterative approaches, which are well supported by modern configuration management, modelling, development and testing tools.
Unfortunately, the greatest investment has been in improving the design and coding phases since engineers find software construction the most productive and exciting periods on a project.
Requirements management and traceability—typically viewed as unattractive work—are often relegated to a secondary task, if done at all. Testing, unless mandated by a strict standard such as DO-178B, is squeezed into the end of the project plan if time allows.
Unfortunately, for companies with project teams who operate like this, the cost of identifying and correcting software defects grows by orders of magnitude as they progress undetected from one process phase to the next. Catching defects as early as possible is critical to controlling cost and meeting project timescales.
With up to 70% of all project defects traceable to faulty requirement specifications, it is sheer madness to relegate the management of requirements to a low-priority task. Evidence demands that requirements management and traceability be a key, over-arching activity on any project where quality and stakeholder satisfaction are a concern.
Project managers need to apply the same enthusiasm for investment in requirements analysis as they do for design and coding. Where system and software modelling has been adopted to increase clarity, reduce ambiguity and achieve abstraction, introduce Use Cases or User Stories for the very same reasons.
Where source code is subjected to automated rules checking (e.g. MISRA) and quality analysis (e.g. cyclomatic complexity), introduce similar automated checking of requirement specifications (e.g. the presence of particular keywords, the absence of imprecise phrases).
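Such a requirements check can be surprisingly simple to prototype. The sketch below shows the idea with hypothetical keyword and phrase lists (the names, lists, and function are illustrative only; real tools and standards define their own vocabularies, and production checks would use smarter matching than naive substrings):

```python
# Hypothetical vocabularies for illustration only.
REQUIRED_KEYWORDS = ("shall",)   # every requirement statement must contain one
IMPRECISE_PHRASES = (            # vague wording that invites ambiguity
    "as appropriate", "if possible", "user friendly",
    "fast", "etc.", "and/or", "to be determined",
)

def lint_requirement(req_id, text):
    """Return a list of findings for a single requirement statement.

    Uses naive substring matching, which is enough for a sketch but
    would over-match in practice (e.g. 'fast' inside 'breakfast').
    """
    findings = []
    lowered = text.lower()
    if not any(kw in lowered for kw in REQUIRED_KEYWORDS):
        findings.append(f"{req_id}: missing mandatory keyword 'shall'")
    for phrase in IMPRECISE_PHRASES:
        if phrase in lowered:
            findings.append(f"{req_id}: imprecise phrase '{phrase}'")
    return findings

requirements = {
    "REQ-001": "The system shall log every login attempt.",
    "REQ-002": "The interface should be user friendly and fast.",
}

for rid, text in requirements.items():
    for finding in lint_requirement(rid, text):
        print(finding)
```

Run over an SRS export, a check like this flags weak statements (REQ-002 above draws three findings) while well-formed ones pass cleanly, giving the same early, cheap feedback that static analysis gives source code.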
Where trace links are specified between code components and the design elements which they implement, add further trace links to requirements and confirm that (i) all requirements in scope have been implemented and (ii) there is no design or implementation element which cannot be traced to a requirement.
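Both traceability checks reduce to simple set arithmetic once trace links are captured as data. The sketch below uses invented element names and a hypothetical `trace_links` mapping purely to illustrate the two checks named above:

```python
# Requirements agreed to be in scope for this release (illustrative IDs).
requirements_in_scope = {"REQ-001", "REQ-002", "REQ-003"}

# Hypothetical trace data: each design/implementation element maps to the
# set of requirements it claims to implement.
trace_links = {
    "design/LoginController": {"REQ-001"},
    "src/login.c":            {"REQ-001", "REQ-002"},
    "src/audit.c":            set(),   # no link recorded
}

# Union of everything any element traces to.
implemented = set().union(*trace_links.values())

# Check (i): every requirement in scope has at least one implementing element.
unimplemented = requirements_in_scope - implemented

# Check (ii): no design or implementation element lacks a requirement trace.
orphans = {elem for elem, reqs in trace_links.items() if not reqs}

print("Unimplemented requirements:", sorted(unimplemented))
print("Orphan elements:", sorted(orphans))
```

Here the first check exposes REQ-003 as unimplemented and the second exposes src/audit.c as an orphan; either finding signals a gap between what was asked for and what was built.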
Requirements are the foundation of every project. A weak foundation results in high numbers of defects, unforeseen remedial work, spiraling costs and missed deadlines. Investment in requirements management, equal to that made for design and coding, is necessary to secure a firm foundation on which to construct a successful project.
Brian Hooper is a Field Applications Engineer at LDRA Ltd. He graduated from Brunel University in 1989 before beginning a career in software development, first at GEC Marconi in the UK, then at several large defense contractors across mainland Europe. In 1999, Brian joined Rational Software to support their successful Aerospace & Defence business, making the most of the software development lifecycle skills and experience he had picked up in industry. Brian has been with LDRA since 2007 and specialises in requirements management, requirements traceability, process and development best practices.