
Hardware/software design requirements planning – Part 2: Decomposition using structured analysis

Structured decomposition is a technique for breaking large, complex problems into a series of smaller, related problems. We seek to do this for the reasons discussed earlier.

We are interested in an organized or systematic approach for doing this because we wish to make sure we solve the right problem and solve it completely. We wish to avoid discovering, late in the development effort, that we failed to account for part of the problem, which would force us to spend additional time and money on corrections and would bring the validity of our current solution into question.

We wish to avoid avoidable errors because they cost so much in time, money, and credibility. This cost rises sharply the further into the development process we proceed before the problem is discovered.

The understanding is that the problems we seek to solve are very complex and that their solution will require many people, each specialized in a particular technical discipline.

Further, we understand that we must encourage these specialists to work together to attain a condition of synergism of their knowledge and skill and apply that to the solution of the complex problem.

This is not a field of play for rugged individuals except in the leadership of these bands of specialists. They need skilled and forceful leadership by a person possessed of great knowledge applicable to the development work and able to make sound decisions when offered the best evidence available.

During the development of several intercontinental ballistic missile (ICBM) systems by the U.S. Air Force, a very organized process called functional analysis came to be used as a means to thoroughly understand the problem, reach a solution that posed the maximum threat to our enemies consistent with the maximum safety for the United States, and make the best possible choices in the use of available resources.

We could argue whether this process was optimum and successful in terms of the money spent and public safety, but we would have difficulty arguing with the results following the demise of the Soviet Union as a threat to the future of the United States.

At the time these systems were in development, computer and software technology were themselves still in development. The government evolved a very organized method, involving computer capture and reporting, for accumulating the information upon which development decisions were made.

Specifications were prepared for these organized systems, and contractors were required to respect them when preparing this information. Generally, these systems were conceived by people with a paper mindset within an environment of immature computer capability.

Paper forms, based on the Hollerith 80-column card input-output format, were used for data entry. The completed forms went to a keypunch operator, and the computer generated poorly crafted reports. But this was a beginning, and it was very successful in its final results.

This process included at its heart a modeling technique called functional flow diagramming, discussed briefly above. The technique uses a simple graphical image created from blocks, lines, and combinatorial flow symbols to model or illustrate needed functionality.

The choice of a graphical approach was no accident. It has long been well known that we humans can gain far more understanding from pictures than from text; a picture truly is worth 10³ words. Imagine for a moment the amount of information we take in from a road map glanced at while driving in traffic, and the data rate involved.

All of the structured decomposition techniques employ graphical methods that encourage analytical completeness as well as minimizing the time required to achieve the end result.

While functional flow diagramming was an early technique useful in association with the most general kind of problem, computer software analysis has contributed many more recent variations better suited to the narrower characteristics of software.

U.S. Air Force ICBM programs required adherence to a system requirements analysis standard and delivery of data prepared in accordance with a data item description for functional flow diagramming. While functional flow diagramming is still a very effective technique for grand systems and hardware, it is not as effective for computer software analysis as other techniques developed specifically for it. So, our toolbox of analytical techniques should have several tools in it including functional flow diagramming and several others.

We should now return to a previous discussion. Encouragement has been offered for a single process, repeated each time a program is undertaken, in the interest of repetition and personnel capability improvement over time. Yet here is another opportunity for process divergence, adding to the different development environments offered earlier.

We have a choice about the degree of flexibility we will permit our programs. We can lock them down to the code of “functional flow diagramming or die” as a way of encouraging continual improvement.

Alternatively, we can permit them to select from a wider range of approved decomposition practices in the interest of program efficiency as a function of product type and team experience. Figure 1.3-3 below offers a diagram for decomposition practices with one possible preapproval marked up.

Figure 1.3-3 Sample decomposition tool approval matrix.
The reality is that in 1998 the system development process was not a static, unchanging entity, and this evolution continues today. Improvements continue to be made, and they will continue into the future until the Unified Modeling Language (UML) merges with SysML, which was derived from it, into a universal model. Until then, we must be alert to opportunities to plug in proven new techniques and integrate them into our toolbox and the skills base of our people.

As in the case of development environments, it is not necessary, or even desirable, that we use exactly the same decomposition method throughout the evolving system product entity structure.

In the beginning, we may choose to apply functional flow diagramming. As we begin allocating exposed functionality to things, we may allocate particular functionality to software. At that point the assigned team or team leader may choose to apply the Hatley-Pirbhai model, knowing that this will have to be developed as real-time software.

Another team may select enhanced functional flow block diagramming, knowing that they will use a system engineering tool called CORE, which is coordinated with that model. Some of these techniques are more appropriate to hardware and others to software, but the adherents of most of them will make a convincing argument why their preferred technique will work equally well for either product type.

Doing functional analysis
Figure 1.3-3 lists several techniques for decomposition of large problems, one of which, functional flow diagramming, has already been discussed. It is the author’s preferred approach for grand systems because of its simplicity and generality.

This process starts with the need as function F, which is expanded into a set of next-tier functions, which are all things that have to happen in a prescribed sequence (serial, parallel, or some combination) to result in function F being accomplished.

One draws a block for each lower-tier activity and links the blocks with directed line segments to show sequence. Logical OR and AND symbols are used on the connecting lines to indicate combinatorial possibilities that must be respected.
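
To make the bookkeeping concrete, a functional flow diagram can be captured in a very simple data structure of function blocks, directed links, and AND/OR combinator nodes. The sketch below is illustrative only; the identifiers and field names are invented for the example, not taken from any standard or tool.

```python
# Minimal sketch of a functional flow diagram held as data: function blocks,
# directed links showing sequence, and AND/OR combinator nodes. The
# identifiers and fields are invented for this example.
from dataclasses import dataclass, field


@dataclass
class Function:
    fid: str                                        # e.g. "F1"
    name: str                                       # action verb acting on a noun
    children: list = field(default_factory=list)   # lower-tier expansion


@dataclass
class FlowDiagram:
    functions: dict = field(default_factory=dict)    # fid -> Function
    links: list = field(default_factory=list)        # (from_fid, to_fid) sequence
    combinators: list = field(default_factory=list)  # ("AND" | "OR", [fids])

    def add(self, fn, parent=None):
        self.functions[fn.fid] = fn
        if parent is not None:
            self.functions[parent].children.append(fn.fid)


# Expand the need (function F) into a next-tier sequence with a parallel branch.
ffd = FlowDiagram()
ffd.add(Function("F", "Satisfy need"))
ffd.add(Function("F1", "Acquire system"), parent="F")
ffd.add(Function("F2", "Deploy system"), parent="F")
ffd.add(Function("F3", "Operate system"), parent="F")
ffd.links += [("F1", "F2"), ("F1", "F3")]
ffd.combinators.append(("AND", ["F2", "F3"]))  # F2 and F3 proceed in parallel
```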

This process continues to expand each function, represented by a block, into lower-tier functions. Figure 1.3-4 below sketches this overall process for discussion.

Figure 1.3-4 Structured decomposition for grand systems and hardware.
A function statement begins with an action verb that acts on a noun term. The functions exposed in this process are expanded into performance requirements statements that numerically define how well the function must be performed.

This step can be accomplished before or after allocation of the performance requirement or function statement to a thing in the product entity structure. But in the preferred case, the identification of the function obligates the analyst to write one or more performance requirements derived from the function and to allocate each of those performance requirements to an entity in the product entity structure.
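
As a rough illustration of that obligation, each exposed function can be traced to one or more quantified performance requirements, each allocated to a product entity. The record names, identifiers, and values below are hypothetical.

```python
# Minimal sketch, with invented field names, of tracing an exposed function
# to derived performance requirements and their allocations.
from dataclasses import dataclass


@dataclass
class PerformanceRequirement:
    req_id: str
    text: str           # quantified statement of how well the function must perform
    derived_from: str   # function identifier
    allocated_to: str   # product entity identifier


reqs = [
    PerformanceRequirement(
        req_id="R21-1",
        text="Transport at least 12 cubic meters of dirt per trip",  # illustrative value
        derived_from="F21",   # "Transport dirt"
        allocated_to="A11",   # earth mover
    ),
]


def uncovered(function_ids, reqs):
    # Completeness check: every exposed function should have at least one
    # derived performance requirement allocated to some entity.
    covered = {r.derived_from for r in reqs if r.allocated_to}
    return [f for f in function_ids if f not in covered]


print(uncovered(["F21", "F22"], reqs))   # -> ['F22'] still needs requirements
```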

This is the source of the power of all decomposition techniques. When done well by experienced practitioners, they are exhaustively complete, and we are less likely to have missed anything than with an ad hoc approach.

This process begins with the need and ends when the lowest tier of all items in the physical product entity structure in each branch satisfies one of these criteria:

(1) the item will be purchased from another company at that level

or

(2) the developing organization has confidence that the item will surrender to detailed design by a small team within the company, and, in either case, the corresponding problem is sufficiently understood that an adequate specification can be prepared.
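
Expressed as a sketch, the stopping rule amounts to a simple predicate applied to each branch of the recursive decomposition. The field names below are invented for the example.

```python
# Illustrative sketch of the stopping rule applied to each branch of the
# decomposition. Field names are invented for the example.
from dataclasses import dataclass, field


@dataclass
class Item:
    name: str
    purchased_at_this_level: bool = False   # criterion (1)
    small_team_can_design: bool = False     # criterion (2)
    spec_can_be_written: bool = False       # problem understood well enough
    children: list = field(default_factory=list)


def decomposition_complete(item):
    return item.purchased_at_this_level or (
        item.small_team_can_design and item.spec_can_be_written
    )


def decompose(item, expand):
    """Expand an item until every branch satisfies one of the criteria."""
    if decomposition_complete(item):
        return
    for child in expand(item):   # expose and allocate lower-tier functionality
        item.children.append(child)
        decompose(child, expand)
```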

There are two extreme theories on the pacing of the allocation process relative to the functional decomposition work. Some system engineers prefer to remain on the problem plane as long as possible to ensure a complete understanding of the problem.

This may result in a seven-tier functional flow diagram before anything is allocated. At the other extreme, the analyst expands a higher-tier function into a lower-tier diagram and immediately allocates the exposed functionality.

This selection is more a matter of art and experience than science, but the author believes a happy medium between the extremes noted is optimum.

If we accumulate too much functional information before allocation, we run the risk of a disconnect between lower-tier functions and the design concepts associated with the higher-order product entities that result.

If, for example, we allocate a master function for “Transport dirt” to an earth mover, we may have difficulty allocating lower-tier functionality related to moving the digging and loading device (which, in our high-level design concept, is integrated with the moving device). Allocation accomplished too rapidly can lead to instability in the design concept development process because of continuing changes in the higher-tier functional definition.

The ideal pacing involves progressive allocation. We accumulate exposed functionality to a depth that permits us to thoroughly analyze system performance at a high level, possibly even running simulations or models of system behavior under different situations, with different functionality combinations and performance figure-of-merit values, in search of the optimum configuration.

Allocation of high-order functionality prior to completing these studies is premature and will generally result in a less-than-optimum system and many changes that ripple from the analysis process to the architecture synthesis and concept development work.

Throughout this period we have to deal with functional requirements rather than raw function statements, so, when we do allocate this higher-order functionality, it will be as functional requirements rather than raw function names. Before allocating lower-tier functionality, we should allocate this higher-order functionality and validate the allocations with preliminary design concepts.

These design concepts should then be fed back into the lower-tier functional analysis to tune it to the current reality. Subsequent allocations can often be made using the raw functions, followed by expansion of them into performance requirements after allocation. Some purists would claim that this is a prescription for point designs in the lower tiers. There is some danger of that, and the team must be encouraged to think through its lower-tier design concepts for innovative alternatives to the status quo.

The big advantage, however, to progressive tuning of the functional flow diagram through concept feedback is that at the lower tiers the functional flow diagram takes on the characteristics of a process diagram where the blocks map very neatly to the physical situation that the logistics support people must continue to analyze.

This prevents the development of a separate logistics process diagram with possible undesirable differences from the functional flow diagram. Once again, we are maintaining process continuity.

The author believes the difference between functional flow and process diagrams is that the former is a sequence of things that must happen, whereas the latter is a model of physical reality. When we are applying the functional flow diagram to problem solving, we do not necessarily know what the physical situation is nor of what items the system shall consist. The blocks of the process diagram represent actual physical situations and resources.

The U.S. Air Force developed a variation of functional flow or process diagramming called the IDEF diagram. IDEF is a compound acronym meaning “ICAM Definition,” where ICAM stands for Integrated Computer Aided Manufacturing. In addition to the horizontal inputs and outputs that reflect sequence, these diagrams also have inputs at the top and bottom edges that reflect, respectively, controlling influences and the resources required for the steps.
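
A minimal way to picture an IDEF0 activity in data form is a record carrying the four edge roles. The field names and the example activity below are illustrative and are not drawn from the IDEF0 standard text.

```python
# Minimal sketch of an IDEF0 activity and its four edge roles: inputs (left),
# outputs (right), controls (top), and mechanisms or resources (bottom).
# Field names and the example activity are illustrative.
from dataclasses import dataclass, field


@dataclass
class Idef0Activity:
    number: str                                     # e.g. "A3"
    name: str                                       # action verb + noun
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    controls: list = field(default_factory=list)    # controlling influences
    mechanisms: list = field(default_factory=list)  # resources required


machine_part = Idef0Activity(
    number="A3",
    name="Machine part",
    inputs=["Raw stock"],
    outputs=["Finished part"],
    controls=["Engineering drawing", "Process specification"],
    mechanisms=["CNC mill", "Machinist"],
)
```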

This diagrammatic technique evolved from the earlier SADT (Structured Analysis and Design Technique) diagramming approach created for software analysis, and it was applied to contractor manufacturing process analysis. It does permit analysis of a more complex situation than process diagramming, but the resulting diagram runs the risk of totally confusing the user with its profusion of lines.

Many of the advantages claimed for IDEF can be realized through the use of a simpler functional or process flow diagram teamed with some tabular data in a dictionary. The diagram presents a simple view that the eye and mind can use to acquire an understanding of complex relationships, and the dictionary presents details related to the blocks that would confuse the eye if included in the diagram.

The IDEF technique has evolved into IDEF0 for function analysis, IDEF1X for relational data analysis, IDEF2 for dynamic analysis, and IDEF3 for process description.

Some system engineers, particularly in the avionics field, have found it useful to apply what can be called hierarchical functional analysis. In this technique, the analyst makes a list of the needed lower-tier functionality in support of a parent function.

These functions are thought of as subordinate functions in a hierarchy rather than a sequence of functions as in flow diagramming. They are allocated to things in the evolving architecture generally in a simple one-to-one relationship.
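
A hierarchical functional analysis can be as simple as a flat mapping from subordinate functions to architecture items. The avionics-flavored names below are invented for illustration.

```python
# Sketch of hierarchical functional analysis: a flat list of subordinate
# functions under a parent, each allocated one-to-one to an architecture
# item. No sequence information is recorded. Names are invented examples.
parent_function = "F4 Manage flight data"

allocation = {
    "F41 Acquire air data":         "A41 Air data computer",
    "F42 Compute navigation state": "A42 Navigation processor",
    "F43 Display flight data":      "A43 Multifunction display",
}

for func, entity in allocation.items():
    print(f"{func} -> {entity}")
```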

The concern with this approach is that it tends to support a leap to point design solutions familiar to the analyst. It can offer a very quick approach in a fairly standardized product line involving modular electronics equipment as a way to encourage completeness of the analysis.

This technique also does not support timeline analysis, as functional flow diagramming does, since there is no notion of sequence in the functional hierarchy.

Ascent Logic popularized another technique, called behavioral diagramming, that combines the functional flow diagram, arranged in a vertical orientation on paper, with a data or material flow arranged in the horizontal orientation.

The strength of this technique is that we are forced to evaluate needed functionality and data or material needs simultaneously rather than as two separate, and possibly disconnected, analyses.

The tool RDD-100, offered by Ascent Logic until its bankruptcy, used this analysis model, leading to the capability to simulate system operation and to output the system functionality in several different views, including functional flow, IDEF0, or n-square diagrams. Behavioral diagramming was actually a rebirth of the input-process-output (IPO) diagramming developed for mainframe computer software development. It included a vertical flow chart and a lateral commodity flow that could be used for data (as in the original IPO) or for material acted upon by the functional blocks.

Another system engineering tool company, Vitech, developed a tool called CORE that uses a similar diagramming treatment called enhanced functional flow block diagramming, with the functional flow on the horizontal axis and the data or material flow on the vertical axis.
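
One rough way to represent the two-axis idea in data form is to record, for each function in the flow, the data or material items it consumes and produces, so that the commodity flow can be checked against the function sequence. The step names, items, and check below are illustrative only and are not taken from any particular tool.

```python
# Rough sketch of the two-axis idea: the functional sequence on one axis,
# with the data or material each function consumes and produces on the
# other. Step names, items, and the check are illustrative only.
from dataclasses import dataclass, field


@dataclass
class Step:
    fid: str
    name: str
    consumes: list = field(default_factory=list)   # data or material in
    produces: list = field(default_factory=list)   # data or material out


sequence = [
    Step("F11", "Load dirt", consumes=["Loose dirt"], produces=["Loaded dirt"]),
    Step("F12", "Transport dirt", consumes=["Loaded dirt"], produces=["Delivered dirt"]),
]

# Consistency check: whatever a step consumes should have been produced by
# an earlier step or supplied from outside the sequence.
available = {"Loose dirt"}   # external supply
for step in sequence:
    missing = [c for c in step.consumes if c not in available]
    assert not missing, f"{step.fid} lacks {missing}"
    available.update(step.produces)
```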

These two techniques and IDEF0 are two-axis models that permit a much richer examination of the problem space than is possible with simple functional flow diagramming, but they all suffer from a visual complexity that makes it more difficult for the human to move the image and its meaning into his or her mind through vision.

Whatever techniques we use to expose the needed functionality, we have to collect the allocations of the performance requirements derived from that functionality into a hierarchical product entity block diagram reflecting the progressive decomposition of the problem into a synthesis of the preferred solution.

The peak of this hierarchy is the block titled system, which is the solution for the problem (function) identified as the need. Subordinate elements, identified through allocation of lower-tier functionality, form branches and tiers beneath the system block.

The product entity structure should be assembled recognizing several overlays, to ensure that everyone on the program is working from the same structure whether it is viewed as a work breakdown structure (finance), manufacturing breakdown structure (manufacturing assembly sequence), engineering drawing structure, specification tree, configuration or end item identification, make-buy map, or development responsibility matrix.
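
One way to keep those overlays aligned is to let each product entity record carry the identifiers used by each view, so that finance, manufacturing, and engineering are all pointing at the same node. The field names and values below are hypothetical.

```python
# Sketch of a product entity node carrying the overlay identifiers named in
# the text (work breakdown, manufacturing breakdown, drawing, specification,
# make/buy, responsible team). Field names and values are hypothetical.
from dataclasses import dataclass, field


@dataclass
class ProductEntity:
    entity_id: str                 # e.g. "A1"
    name: str
    wbs: str = ""                  # work breakdown structure number (finance)
    mbs: str = ""                  # manufacturing assembly sequence reference
    drawing: str = ""              # engineering drawing number
    spec: str = ""                 # specification tree identifier
    make_or_buy: str = "make"
    responsible_team: str = ""
    children: list = field(default_factory=list)


system = ProductEntity("A", "System", wbs="1000", spec="SS-001")
system.children.append(
    ProductEntity("A1", "Earth mover", wbs="1100", drawing="D-1100",
                  spec="PS-1100", make_or_buy="buy",
                  responsible_team="Mobility team")
)
```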

As the product entity structure is assembled, the needed interfaces between these items must be examined and defined as a prerequisite to defining their requirements. These interfaces will have been predetermined by the way we have allocated functionality to things, and modified as a function of how we have organized the things in the architecture and of the design concepts for those things. During the product entity synthesis and initial concept development work, the interfaces must be defined for the physical model using schematic block or n-square diagrams.
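
An n-square view, in particular, can be held as a simple matrix with the entities on the diagonal and off-diagonal cells naming the interface, if any, from the row entity to the column entity. The entities and interfaces below are invented for the example.

```python
# Sketch of an n-square (N2) interface view: entities on the diagonal,
# off-diagonal cells naming the interface, if any, from the row entity to
# the column entity. Entities and interfaces are invented for the example.
entities = ["A1 Earth mover", "A2 Operator", "A3 Maintenance facility"]

interfaces = {
    (1, 0): "Operator commands",
    (0, 1): "Status indications",
    (2, 0): "Service connections",
}

for i, row_entity in enumerate(entities):
    cells = []
    for j in range(len(entities)):
        cells.append(row_entity if i == j else interfaces.get((i, j), ""))
    print(" | ".join(f"{cell:24}" for cell in cells))
```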

To read Part 1, go to: Doing the groundwork.
Next in Part 3: Performance requirements analysis

Jeffrey O. Grady is president of JOG Systems Engineering, Inc., and an adjunct professor at the University of California, San Diego, CA. He was a founding member of the International Council on Systems Engineering (INCOSE).

Used with permission from Newnes, a division of Elsevier. Copyright 2011, from “System Verification” by Jeffrey O. Grady. For more information about this title and other similar books, please visit www.elsevierdirect.com.

This article was provided courtesy of Embedded.com and Embedded Systems Design Magazine.
