
Evolution And Reliable Systems

During most of my professional life as a technical editor and writer, parallels between two dominant information technologies of the last 100 years have fascinated me. One technology is computer-based and the other biological. The first is based on binary, two-state logic; the other on the four-state coding possibilities of the genetic biochemical building blocks adenine, cytosine, guanine, and thymine.

Often the connections I see between the two realms are interesting merely from an historical point of view. But occasionally, I spot something in one that might yield useful information about the other.

As net-centricity evolves in the computer industry, I see similarities between the way software is developed and how mechanisms that evolved in biological systems to ensure the survival of an organism also lead to its extinction when there is a rapid change in the environment.

In biological systems, geneticists have puzzled over why and how living organisms evolve particular features and capabilities, how these contribute to their success, and how they also play a role in their demise. When biologists look at organisms, they find that not all elements of the system are “built” to the same set of specifications and tolerances. Some portions of the organism are over-designed: too tough for the environment in which they are used and built to tolerances far beyond those of other parts. Over time, however, the tolerances of the organism's constituent parts even out to reflect the conditions in which the organism lives, essentially “dumbing down” to the lowest common denominator.

Biologists often use an automobile analogy to describe this mechanism. The “selfish gene” is like the executive of an automobile company who commissions a survey to determine the failure rate of its vehicles. The survey shows normal wear and tear, given the conditions in which most of the components, such as axles, brakes, pistons, tires, and spark plugs, must operate. But suppose there is one exception, maybe the tie rods, which have many years of use left in them. In nature, the selfish genes would conclude that the tie rods were too well made for the conditions under which most autos were driven. And like any good executives, the selfish genes would order that future tie rods be made to an inferior specification, more in line with the conditions in which the auto operates.

The decision the selfish genes are left with is similar to that of the auto executive: should he build an auto with parts that have the reliability and longevity of those hardy tie rods, or should he dumb down the tie rods to the specifications of the rest of the vehicle? In nature, the genes do not care about the survival of a particular vehicle, but gamble in favor of producing large numbers of vehicles optimized for the typical driving conditions. Eventually, in a population of organisms, the dominant strain would be the one whose components were most perfectly adapted to their environment. Other organisms would survive, but in much smaller niches.

In most computing environments, a similar kind of mechanism seems to operate with respect to the reliability and defect level of the software that is produced. Essentially, reliability seems to be dumbed down to the level that an application or market segment will accept. Whether this is conscious or unconscious on the part of the companies and engineers, I am not sure. Apart from occasional conversations with software developers who, in their weaker moments, have admitted that such decision making goes on, most of the evidence I have is indirect, implicit rather than explicit.

In the wonderful world of desktop computing, it seems to me that some such process is going on. In my experience, the reliability of the Windows 3.0 and Windows 95 operating systems was a joke. But for whatever reasons, it was acceptable to the market that then existed for desktop computers. Windows' main competitor, OS/2 from IBM, which had a record of reliable operation that matched that of many workstation and server OSes, was relegated to a small niche market.

There is much more focus on reliability in the embedded world, albeit varying with the application and the market. Even here, however, the existence of a similar evolutionary mechanism is indirectly indicated by the use of test coverage tools that allow developers to choose the degree of reliability that is acceptable.

At the grossest level is simple statement coverage, where every statement in a program is invoked at least once to determine whether it produces the right response. At the next level up is decision coverage, where every point of entry to and exit from the program is invoked and every decision, in every control statement and at every branch point, has taken on all possible outcomes at least once. Beyond that is modified condition/decision coverage, where the designer must additionally show the effect each individual condition within a decision has on the decision's outcome, and hence on the correct operation of the system.
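
To make the first two levels concrete, here is a minimal sketch in C; the clamp_reading function and its test values are hypothetical, invented purely for illustration. A single call with the condition true executes every statement, but decision coverage also demands a call in which the condition is false.

    #include <stdio.h>

    /* Hypothetical example: clamp a sensor reading to a maximum value. */
    static int clamp_reading(int reading, int max)
    {
        int result = reading;
        if (reading > max)      /* the decision under test */
            result = max;       /* executed only when the condition is true */
        return result;
    }

    int main(void)
    {
        /* Statement coverage: this call alone runs every statement,
           because the branch is taken and the assignment executes. */
        printf("%d\n", clamp_reading(150, 100));  /* condition true  -> 100 */

        /* Decision coverage additionally requires the false outcome: */
        printf("%d\n", clamp_reading(50, 100));   /* condition false -> 50  */

        return 0;
    }

Modified condition/decision coverage only becomes distinct once a decision combines several conditions, since each condition must then be shown to affect the outcome on its own.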

There is nothing wrong with such differentiation, as its success in nature and in business illustrates. But in nature, the functions and features an organism inherits, the very ones that determine how successfully it dominates its environment, also have a downside. If the environment changes too drastically or too quickly, the organism is dead meat. Evolutionary history is full of examples of this.

Right now the computer industry is in the throes of a fundamental change, as pervasive and swift as the changes that drove the dinosaurs into extinction. In the new net-centric computing environment, no computer is an island. Every computer system, whether on a desktop or embedded in a device, can be connected via the Internet to every other computer on a 24/7 basis. Now the concern is not only the reliability of each computing device, but also the reliability of the connections and protocols, and of the software that makes and manages those connections.

Who will survive in this new environment? Will the large dinosaurs, with their big all-in-one programs and lower levels of reliability but close adaptation to the desktop user, survive? Which of the embedded organisms will make it? Will they move into the dominant positions as some of the dinosaurs die off? What level of reliability will be needed: that of the average automobile, or that of a Rolls Royce? Will all-in-one programs, which do a lot of things sort of OK, remain dominant, or will specialized, narrowly functional programs and devices that do one or two things well move to the forefront?
