The practice of engineering is often described as an art form. I think it's the art of making scientific tradeoffs. As scientists with a practical, rather than academic or theoretical, focus, we're challenged to build things on the basis of information at or near the boundaries of human knowledge.
In virtually all our endeavors, there are unknowns, subtleties, and complexities over which we exercise limited control. The cost in engineering time and resources to fully comprehend everything about a system is in some cases unbounded; at the very least, such thorough analysis is often cost prohibitive. If the product works, we generally can't afford to do much more than ship it and move on to the next project.
Just as tradeoffs are made in the area of features or implementation techniques, so too are they made in the area of knowledge. It's rarely possible to build a saleable, profitable product while completely understanding all the implications of our numerous design and implementation decisions.
The designers of each system must decide how much time and money to spend investigating the dark corners. Those designing pacemakers and airplanes, for example, are responsible for shining the light of knowledge brightly into every corner of their designs, whereas the designers of consumer electronics can leave more to chance.
Some areas of engineering that are not profit driven face the same limits on thorough analysis. Manned missions to space are of this nature. Engineers working at NASA have made tremendous efforts to understand all the complexities and potential failure points of the space shuttles. Unfortunately, the amount of work to be done is effectively infinite; these systems have millions of individual components and operate in unforgiving and poorly understood environments. And there's only limited time to show results.
As the losses of the Challenger and Columbia have demonstrated, sometimes it's a part of a design once thought to be understood that's actually the most dangerous. In both cases, similar previous failures had been observed, documented, and discussed by engineers, yet the true danger those failures posed was not fully comprehended until after each catastrophe struck.
When individual components fail, they can take even the most carefully designed systems down with them. Such failures sometimes also take the lives of people. Catastrophes like this are, unfortunately, likely to occur more often as people rely increasingly on technological solutions to everyday problems.
I don't blame the engineers at NASA for the loss of either shuttle; in both cases, they knew there was a problem but had too many other, seemingly more important concerns. But all engineers everywhere should learn from such high-profile failures. Ask yourself these questions: What is the true source of the problem in your system? What danger does it pose? How can you overcome organizational challenges to see the proper solution through?