
Using CMMI for software requirements testing in system design & development

Requirements-based testing, and its inherent process of requirements traceability and verification, is widely viewed as a best practice and is embodied in process improvement models such as the Capability Maturity Model Integration (CMMI).

CMMI can guide process improvement across a project, a division or an entire organisation, with benefits established for critical as well as non-critical software. CMMI for Development v1.2 contains 22 process areas, among them Verification (VER), which confirms that a product properly reflects its specified requirements, and Validation (VAL), which confirms that the delivered product fulfils its intended use.

As illustrated by the CMMI Engineering process category in Figure 1 below, the foundation for construction comprises the Requirements Management (REQM) and Requirements Development (RD) process areas.

The Technical Solution (TS) process area involves designing and developing solution components according to the requirements. Product Integration (PI) sees the solution components assembled to form an end product ready for delivery. The Verification process area (VER) ensures that each solution component meets its specified requirements, while Validation (VAL) demonstrates that a product fulfills the end user's needs.

Figure 1. CMMI Engineering Process Areas

Verification and validation are often confused or treated as a single process area, which is understandable given that both are concerned with testing against requirements.

The principal difference is that Verification proves that components have been built to the specifications developed in RD, implying bottom-up reviewing and testing, while Validation confirms that a product addresses the end user's feature set, implying top-down system testing.

By adopting CMMI, a project builds a framework of best practices from the specific practices associated with each process area. For example, Requirements Management stipulates that changes to requirements be managed throughout the life of the project and that bidirectional traceability from requirements to downstream artifacts be maintained, while Verification states that peer reviews should be carried out on all components.

Furthermore, the process areas interact and must stay synchronized as the project progresses. For example, component requirements developed in Requirements Development must be managed with the same discipline as the end user's requirement set, while Verification and Validation contribute tests and test results to the requirements traceability repository.

The development of a CMMI-inspired process framework is certainly a good start; the needs and goals for a project will now be fairly clear. But how are those needs and goals achieved? What workflow processes will drive the requirements through design, implementation and test? Which tools will automate, support and optimise tasks such as requirements management and component verification?

Requirements-Driven Development
The CMMI Technical Solution (TS) dictates that solutions are constructed according to requirements. The development team must determine if all requirements are going to be implemented at once or if it's better to focus on a few requirements at a time and construct a solution incrementally.

CMMI does not address these issues. It is up to the project team to adopt development practices and processes which guide project progress while also meeting the goals of CMMI.

The “waterfall” development model has many disadvantages, and these led to the rise of alternative models such as the Unified Process and Agile methodologies.

Latter-day methodologies typically place strong emphasis on the management of requirements, as well as on the organisation of requirements as a driver for downstream development. A couple of key principles shared by the majority of these methodologies may be summed up as follows:

1. Straightforward, user-focused formulations of requirements are preferred to long lists of individual features or constraints. Variously called “Use Cases” or “User Stories”, they enable better collaboration with the end user and place the focus on the value or benefits that the solution should deliver.

2. Requirements change over the life of the project. This is natural and inescapable, whether due to the end user changing their vision of the solution or the project redefining its scope. Projects therefore should be organised and managed to accept and embrace change.

Use Cases are formally defined in the Unified Modeling Language (UML), while User Stories are an element of Extreme Programming; however, the concept behind each is roughly the same: to partition the system at a coarser granularity than a list of requirements, while at the same time considering what the system delivers to the end user rather than simply what it does.

For example, an engine management subsystem specified as a bullet-point list of requirements inevitably inspires thoughts of implementation, whereas the focus at this stage needs to be on what the subsystem delivers to the other subsystems to which it connects. User Stories are typically written on small paper cards or even Post-It notes, enabling easy collaboration and discussion by project teams.

Use Cases can be formulated in the same way or created with a UML tool, as illustrated in Figure 2 below. Some key points to note on this Use Case Diagram are that the system boundary is clearly marked, that the Use Cases themselves sit within the system boundary, and that all Actors sit outside the system (the stickman representation does not imply that an Actor must be a human being).

Figure 2. Use Case Diagram

Each Use Case or User Story comprises several scenarios. The first scenario is always the “basic path” or “sunny day scenario” in which the actor and system interact in a normal, error-free way.

Returning to the engine management subsystem example, a scenario may cover idling during which the fuel supply remains constant and the temperature stays within limits. Then alternative and exception scenarios need to be considered in which the system handles problems or failures, such as fuel cut-out.
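
A scenario expressed at this level maps naturally onto a requirements-based test. The listing below is purely illustrative: the EMS interface (ems_set_mode_idle(), ems_fuel_rate(), ems_coolant_temp()), the requirement identifier and the limits are all invented, and simple stubs stand in for the real subsystem so the sketch compiles as-is.

/* Minimal sketch of a requirements-based test for the "idling" scenario.
   The EMS interface, requirement identifier and limits are invented
   placeholders; a real project would call the actual subsystem. */
#include <math.h>
#include <stdio.h>

/* Stand-in stubs simulating the engine management subsystem (EMS). */
static void   ems_set_mode_idle(void) { }
static double ems_fuel_rate(void)    { return 1.50; }  /* litres/hour */
static double ems_coolant_temp(void) { return 88.0; }  /* deg Celsius */

/* Illustrative requirement EMS-REQ-042: while idling, the fuel rate shall
   stay within +/-2% of its initial value and coolant temperature shall
   remain between 70 and 105 deg Celsius. */
static int test_ems_req_042_idle(void)
{
    ems_set_mode_idle();
    double baseline = ems_fuel_rate();

    for (int sample = 0; sample < 100; ++sample) {
        double fuel = ems_fuel_rate();
        double temp = ems_coolant_temp();

        if (fabs(fuel - baseline) > 0.02 * baseline) {
            printf("EMS-REQ-042 FAIL: fuel rate drifted\n");
            return 1;
        }
        if (temp < 70.0 || temp > 105.0) {
            printf("EMS-REQ-042 FAIL: coolant temperature out of limits\n");
            return 1;
        }
    }
    printf("EMS-REQ-042 PASS\n");
    return 0;
}

int main(void)
{
    return test_ems_req_042_idle();
}

Because the test is named for and judged against the requirement rather than the code, its result feeds straight back into the traceability record for EMS-REQ-042.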

The end user must be fully involved in the development of scenarios and, as each is completed, a priority is assigned to enable the complete set of scenarios to be ranked. As illustrated in Figure 3 below, the project team needs a priority-ordered list of scenarios to plan each iteration and select which portion of the system will be implemented.

Figure 3. Extreme Programming Workflow

Once verification of an iteration is complete, progress metrics may be derived. The team needs to ask: “Were all selected scenarios implemented and tested? Was any regression seen in previously completed iterations?” Through regular analysis of progress, the calculated “Velocity” can be used to plan future iterations.
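
As a rough sketch of that bookkeeping, the listing below derives velocity from an iteration's scenario records; the scenario names and point values are invented for illustration rather than taken from any particular project or tool.

/* Minimal sketch: deriving an iteration's velocity from its scenario
   records. The scenarios and point values are invented for illustration. */
#include <stdio.h>

struct scenario {
    const char *name;
    int points;      /* estimated size of the scenario              */
    int completed;   /* 1 = implemented and verified this iteration */
};

int main(void)
{
    struct scenario iteration[] = {
        { "Idle: fuel constant, temperature in limits", 5, 1 },
        { "Fuel cut-out handled",                       8, 1 },
        { "Cold-start enrichment",                      5, 0 },  /* slipped */
    };
    int planned = 0, velocity = 0;

    for (size_t i = 0; i < sizeof iteration / sizeof iteration[0]; ++i) {
        planned += iteration[i].points;
        if (iteration[i].completed)
            velocity += iteration[i].points;
    }
    /* Velocity (13 points here, against 18 planned) is the realistic
       capacity to assume when selecting scenarios for the next iteration. */
    printf("Planned %d points, completed %d.\n", planned, velocity);
    return 0;
}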

Meanwhile, during development, the end user may have changed or updated their requirements, resulting in modifications to the priority-ordered list of scenarios. An iterative approach allows projects to remain flexible in the face of both the end user's changing demands and development progress.

Requirements Testing and Bidirectional Traceability
Testing is often the poor relation in software development. One of the major disadvantages of a “Waterfall” approach is that testing is the final phase of the lifecycle and inevitably becomes squeezed as earlier phases overrun, yet the delivery date is usually immovable.

As a result, it is fairly common to deliver software components which are known to be sub-standard simply because there hasn't been time to complete sufficient testing. However, if defects remain in delivered solutions, problems arise during service and can lead to expensive investigations, product recalls, damage to the reputation of the supplier, and the potential for loss of business.

It is a common misapprehension that tests are created to verify the software. In fact, tests should always verify and validate the requirements. It is the requirements that are agreed with the end user; therefore, acceptance of the delivered solution is judged on whether the requirements have been met. In Figure 1, Verification and Validation both reference Requirements Development to illustrate the goal that requirements drive the testing effort.

However requirements are captured, whether as traditional lists or more contemporary User Stories, a method for tracing them to implementation and test components is vital to ensure that:

(1) requirements drive the development effort
(2) features unrequested by the end user are not added to the solution by free-thinking engineers.

Agile techniques reveal a shortcoming at this point: how exactly do you create a trace relationship between a paper card and a file on a computer? Even if a manual system is put in place, perhaps involving a matrix drawn on a whiteboard, how is the history of the relationships maintained, and how is change controlled?

It is sufficient to develop the initial set of User Stories using Agile techniques. However, the move to a tool-based record of the requirements should be made shortly afterwards so that the project can exploit features that ensure efficient management and implementation.

For example, Requirements Development states that component requirements are developed from the end user's requirements; a requirements management tool supports the creation of trace relationships between these different layers of requirements and highlights where end user requirements have yet to be refined into component requirements.
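
At its heart, such a tool maintains a table of trace links. The sketch below shows the principle in miniature, with invented requirement identifiers and a check for end-user requirements that have no component-level refinement yet; it illustrates the idea, not any particular tool's data model.

/* Minimal sketch of trace links between end-user requirements and the
   component requirements refined from them, with a check for end-user
   requirements not yet refined. All identifiers are invented. */
#include <stdio.h>
#include <string.h>

struct trace_link {
    const char *user_req;        /* end-user requirement          */
    const char *component_req;   /* derived component requirement */
};

int main(void)
{
    const char *user_reqs[] = { "UR-1", "UR-2", "UR-3" };
    struct trace_link links[] = {
        { "UR-1", "EMS-REQ-042" },
        { "UR-1", "EMS-REQ-043" },
        { "UR-2", "EMS-REQ-050" },
        /* UR-3 has no component requirements yet. */
    };

    for (size_t u = 0; u < sizeof user_reqs / sizeof user_reqs[0]; ++u) {
        int refined = 0;
        for (size_t l = 0; l < sizeof links / sizeof links[0]; ++l)
            if (strcmp(links[l].user_req, user_reqs[u]) == 0)
                refined = 1;
        if (!refined)
            printf("%s has not been refined into component requirements\n",
                   user_reqs[u]);
    }
    return 0;
}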

With requirements stored in a management tool, it becomes possible to drive implementation and testing via integration with other tools and further contribute to the efficient running of the project.

Some tools import requirements from a number of sources, such as Telelogic DOORS or Microsoft Office documents, and enforce a workflow in which the requirement is always the principal focus. Specific component requirements may be defined as necessary. Ultimately, verification is requirements-focused, utilizing the capabilities of the appropriate companion tool suite to carry out each test.

Figure 4. Tracing from Requirements to Verification

Figure 4, above, illustrates very clearly that the allocation of requirements to engineers, as well as the verification efforts of each engineer, implicitly creates trace relationships and avoids the task of preparing a requirements traceability matrix as a separate, manual activity.

The matrix, along with various test overview reports, may be automatically generated on-demand. In addition, project metrics are always available to gauge progress or highlight issues.
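
The metrics behind such reports reduce to simple counting over the trace and test records. The sketch below, using invented requirement identifiers and results, shows one way the coverage figures might be derived; it is an illustration of the arithmetic, not of any tool's actual report.

/* Minimal sketch: a requirements-coverage metric derived from the latest
   test result recorded against each requirement. Data is invented. */
#include <stdio.h>

enum result { NOT_TESTED, FAILED, PASSED };

struct req_status {
    const char *req;
    enum result latest;
};

int main(void)
{
    struct req_status reqs[] = {
        { "EMS-REQ-042", PASSED },
        { "EMS-REQ-043", FAILED },
        { "EMS-REQ-050", NOT_TESTED },
    };
    size_t total = sizeof reqs / sizeof reqs[0], tested = 0, passed = 0;

    for (size_t i = 0; i < total; ++i) {
        if (reqs[i].latest != NOT_TESTED) ++tested;
        if (reqs[i].latest == PASSED)     ++passed;
    }
    printf("Requirements tested: %zu of %zu, passing: %zu of %zu\n",
           tested, total, passed, total);
    return 0;
}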

Conclusion
As companies strive to improve processes and reduce software defects, a standard such as CMMI lays a solid foundation into which a project team may integrate suitable methodologies and tools.

CMMI's engineering process category highlights requirements management as a major process area which drives implementation, verification and validation. With the right mix of methodologies and tools, project teams can satisfy this stipulation as well as improve their development efficiency and ensure higher quality in the delivered solution.

Brian Hooper is a Field Applications Engineer at LDRA Ltd. He graduated from Brunel University in 1989 before beginning a career in software development, first at GEC Marconi, then at several large defense contractors across Europe as well as at Rational Software. Brian has been with LDRA since 2007 and specializes in requirements management, requirements traceability, process and development best practices.
