
Paying attention to software processes to achieve higher-quality code

Ed Liversidge

January 07, 2009


Software development is inherently difficult, expensive and error-prone. Additional constraints are imposed when developing software for embedded systems, such as limited memory, processor speed and I/O devices, as well as the influence of the operating system, real-time requirements and complex communication protocols. Therefore, it is vital that the right approach is used as the software is developed.

In order to assist the software developer, great minds of the 20th Century collaborated to discuss and implement the Waterfall model and its mutated cousin, the V-Model. These models are still widely used in industry today, especially in the defence industry.

The problem is that you cannot 'use' the V-Model to create software, any more than you can 'use' an Airfix model to fly to Paris. In order for the V-Model to be 'used', it needs to be backed up with a structured software development process, which explicitly defines the outputs of each stage and how those outputs are produced.

The V-model of software development
Figure 1 below shows a typical diagram of the V-Model. The development life cycle follows a fixed path from Requirements Analysis to Operational Testing and subsequent delivery to the customer. Testing tasks are generated from knowledge gained in development tasks; for instance, the High Level Design will generate the Integration Tests. A project following this model moves through these tasks one at a time, proceeding to the next only when the current task is complete.

This model does have a number of good points such as:

It defines tangible phases of the process, and proposes a logical sequence in which these phases should be approached.

It defines logical relationships between the phases.

It demands that testing documentation is written as soon as possible: for example, the integration tests are written when the high-level design is finished, and the unit tests are written when the detailed specifications are finished.

It gives equal weight to development and testing.

It provides a simple and easy to follow map of the software development process.

However, there are a number of criticisms that have been levelled at the V-Model:

It is easily misinterpreted as a working model, when in fact it needs to be backed up with a good software development process.

It is rigid in its approach: it has no iteration, and it doesn't handle change.

It is rigid in its testing methodology.

Figure 1: The V-model stands for victory in achieving software quality

The most damaging aspect of the V-Model lies not in the model itself. Any model is an approximation, and this model does at least provide some value. The biggest problem arises from users' steadfast reliance on the model, and the assumption that the model is a 'tool' to be 'used'. The V-Model cannot be 'used'. Only a good software development process can be used to create good software.

In addition, the user can further distort the picture by thinking that iteration can be added by cobbling together multiple V-Models to form some sort of 'WWWWW' model. This is nonsense without an iterative process to back it up, and with such a process it is no longer a V-Model.

The third criticism levelled at the V-Model concerns its testing phases, and has been illustrated by Brian Marick. He explains that implementing unit tests and integration tests as single, separate phases results in a thoughtless approach to testing. For example, each unit test typically requires its own custom test harness.

For large projects with many units, building and maintaining all of these harnesses could prove costly and problematic. It might be better to test a unit once it is connected to the actual system, using the system itself to deliver test messages. The point is that no thought is being applied to the trade-offs involved in testing early, testing late, or testing as-you-go.
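
To make the cost concrete, here is a minimal sketch of what a standalone harness for a single unit might look like in C. The ring-buffer unit and its API are invented purely for illustration; the point is that every unit in a large project would need a similar dedicated driver.

/* A minimal, hypothetical per-unit test harness in C. The ring-buffer
 * unit and its API are invented for illustration only. */
#include <assert.h>
#include <stdio.h>

#define BUF_SIZE 8

typedef struct {
    int data[BUF_SIZE];
    int head;   /* index of oldest element */
    int count;  /* number of stored elements */
} ring_buffer;

static void rb_init(ring_buffer *rb) { rb->head = 0; rb->count = 0; }

static int rb_put(ring_buffer *rb, int value)
{
    if (rb->count == BUF_SIZE)
        return -1;  /* buffer full */
    rb->data[(rb->head + rb->count) % BUF_SIZE] = value;
    rb->count++;
    return 0;
}

static int rb_get(ring_buffer *rb, int *value)
{
    if (rb->count == 0)
        return -1;  /* buffer empty */
    *value = rb->data[rb->head];
    rb->head = (rb->head + 1) % BUF_SIZE;
    rb->count--;
    return 0;
}

/* The harness itself: a standalone main() that drives just this unit. */
int main(void)
{
    ring_buffer rb;
    int out;

    rb_init(&rb);
    assert(rb_get(&rb, &out) == -1);             /* empty buffer rejects reads */
    assert(rb_put(&rb, 42) == 0);                /* store one value */
    assert(rb_get(&rb, &out) == 0 && out == 42); /* and read it back */

    puts("ring_buffer unit tests passed");
    return 0;
}

Multiply this boilerplate by hundreds of units and the cost of the one-harness-per-unit approach becomes clear.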

All is not lost! There are many cases where the V-Model and the Waterfall model have been successfully used. In these cases, the requirements were correct before any development work was started (this requires more work up-front to make sure the requirements are accurate and unambiguous), and a software development process was used in conjunction with the model.

In other words, a clear process was used to define and enforce the boundaries of the different phases of the model. One example of this is the (now outdated) US DOD standard MIL-STD-498 (and its predecessor DOD-STD-2167A), which requires review documents such as Software Requirements Specifications and Software Test Plans to be produced after every development phase (Figure 2 below).

Figure 2: MIL-STD-498 software development process.


So, the message here is that with the proper procedures and fixed and clear requirements, you could use the V-Model. However, there are other, newer processes that are better at handling changing requirements.

Accommodating change
Let us take a look at the software development process for a moment. As stated by P.G. Armour in 'The Laws of the Software Process', software development can be viewed as a quest for knowledge. Like a climber planning a route over glaciers and up a mountain face, the destination is clear, but finding a safe route up the mountain and back down again is not easy.

The climbers may find impassable chasms, dangerous overhangs or unpredictable changes in weather. When planning an ascent, the climber is faced with two options: invest in high-resolution cameras, satellite images, remote camera drones and sophisticated weather-prediction mainframes to gain knowledge, spending years planning the safest route before setting foot on the mountain; or plan as much as possible with the resources at hand and then attempt the climb, knowing that the plan will most probably change along the way.

The software manager faces the same two choices, which boil down to: determine all your requirements accurately and unambiguously, and research the software system to be developed until you have a complete understanding of the problem space, before you do anything else; or start your project with the understanding that not enough knowledge has yet been acquired about the software system to be delivered, and that the requirements and the software development plan will most probably change.

These two choices can be further distilled: use a rigid model like the V-Model (backed up with a solid process and lots of up-front research), or use an Agile software development process.

The problem here is that if you follow the first option, you will never have a complete picture of the software system until it is delivered to a satisfied customer.

There will always be a risk that some vital piece of knowledge in the problem domain has yet to be uncovered. The customer(s) may not even know what is required from the software.

More upfront research can reduce this risk but never eliminate it. Is it not better to accept that the plan will change and to accommodate this in your software development process? Isn't it time you started to look at Agile Software Development?

There are many different processes that can fall under the label of Agile Software Development, such as Extreme Programming (XP), Crystal, DSDM and Scrum. There are also many big guns in the software world that use Agile methods, such as Google, IBM, HP, Microsoft, Sun Microsystems and Yahoo. And yet, agile software development is often misunderstood.

It does not involve a bunch of hackers getting together and starting a coding frenzy! It is a well-established (although not yet widely adopted) software methodology that emphasises frequent, rapid iterations through a software development process to deliver highly functional software that meets the customer's changing requirements. It encapsulates a number of related software development concepts, a small subset of which we describe here to give you a flavour of the Agile Software Development process.

Test-driven development
In test-driven development, you write your test code before you write your production code: first a small test that fails, then just enough production code to satisfy the new test case and make the failing test pass.

The amount of test code and production code should be small, say five lines of code each. Once the test passes, the cycle repeats: the test code is enhanced to cover a new test case, followed by more matching production code. In this way, the software always has matching test cases, which are constantly executed, ensuring that the behaviour of the system is thoroughly exercised.
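
The C sketch below walks through one turn of the cycle for a hypothetical checksum() routine; the function, its signature and its expected values are all invented for this example.

/* Step 1: write the test first, against a checksum() routine that does
 * not yet exist. The function, its signature and its expected values
 * are all hypothetical, invented for this sketch. */
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

uint8_t checksum(const uint8_t *buf, size_t len);  /* no code behind this yet */

static void test_checksum(void)
{
    const uint8_t msg[] = { 0x01, 0x02, 0x03 };
    assert(checksum(msg, 3) == 0x06);  /* 1 + 2 + 3 */
    assert(checksum(msg, 0) == 0x00);  /* empty message sums to zero */
}

/* Step 2: write just enough production code to make the failing test
 * pass: a few lines, matching the test cases above. */
uint8_t checksum(const uint8_t *buf, size_t len)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum = (uint8_t)(sum + buf[i]);
    return sum;
}

int main(void)
{
    test_checksum();
    return 0;
}

In a real project the failing step would be observed first (the test will not even link until checksum() exists), and only then would the production code be written.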

Continuous Integration
Once the code builds and passes its tests on your own workstation, you grab the integration token (a physical object that ensures only one integration happens at any one time), check in your code and then use a separate, dedicated integration workstation to get the changes, run the build and execute all the tests. If everything passes, the integration was a success and the rest of the team can get the changes and work with them. If not, go back to your workstation and find out why the build is broken.

Continuous Integration means that integration problems are found and resolved frequently, so that the code is always ready to ship, even if all the required functionality has not yet been developed.
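
As a sketch of how the integration machine might gate a check-in, the C test runner below (with hypothetical, placeholder test names) runs every unit's tests and returns a non-zero exit code on any failure, so the integration build script can reject the change.

/* A sketch of the single test-runner entry point the integration
 * machine might build and run. The test functions here are empty
 * placeholders standing in for real per-unit suites. */
#include <stddef.h>
#include <stdio.h>

static int test_ring_buffer(void) { return 0; }  /* 0 = pass (placeholder) */
static int test_checksum(void)    { return 0; }  /* 0 = pass (placeholder) */

typedef int (*test_fn)(void);

int main(void)
{
    const struct { const char *name; test_fn fn; } tests[] = {
        { "ring_buffer", test_ring_buffer },
        { "checksum",    test_checksum    },
    };
    int failures = 0;

    for (size_t i = 0; i < sizeof tests / sizeof tests[0]; i++) {
        int rc = tests[i].fn();
        printf("%-12s %s\n", tests[i].name, rc == 0 ? "PASS" : "FAIL");
        if (rc != 0)
            failures++;
    }
    return failures;  /* non-zero exit status fails the integration build */
}

Returning the failure count as the exit status is one simple way to let a build script distinguish a clean integration from a broken one.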

Pair programming
The idea of having a colleague looking over your shoulder pointing out syntax errors fills many computer programmers with dread, and rightly so. Fortunately, pair programming is far from this. It is in fact an extremely powerful and productive way to work. In essence, one person, the driver, writes the code; the other, the navigator, stays one step ahead of the driver, thinking about the tests required to develop the next piece of production code.

The two will swap roles frequently, and programmers will pair with different members of the team throughout the working week. This level of interaction can greatly increase the amount of brain power applied to the software development process.

Unfortunately, the Waterfall and V-Models are not process models; they are abstract models that do not define the process to be used to create software. Thus, they are open to misuse and misinterpretation, and can lull managers into a false sense of security, as they rely on the model when they should be relying on a process.

Even with a good process, the V-Model has no iteration and does not handle change. If the risk of change is low, you can get away with this, but in any project where the risk of change is high, it is best to look at Agile Software Development processes in order to embrace change.

Ed Liversidge is with Harmonic Software Systems.
