How embedded projects run into trouble: Jack’s Top Ten – Number Seven
Those who don’t learn from the past are doomed to repeat it, and for the last several weeks I’ve been itemizing the top ten ways I see embedded projects fail.
Pride is one of the seven deadly sins, and pride informs my number 7.
7 - Bad science
When a developer snorts that he knows all about the chemistry his system will monitor, that pride goes before the fall. For unless we really understand the science our products use, something bad will happen.
In the 1970s we built a system to measure protein in grains (like wheat) using IR light. Spinning filters created 800 distinct wavelengths which impinged on the sample. Lead sulfide detectors read the reflected signal which was digitized and handled by an 8008.
Did you know lead sulfide detectors are more sensitive to temperature variations than to light? We didn’t. The data was garbage, and for months we sought answers before learning this simple fact. More months were lost as we tried cooling them with Peltier plates… and they got too cold. The final solution was a combination of carefully controlled Peltiers and heating elements.
Then, at about the same time, we were building a machine to measure the oil on nylon as it was being manufactured. This involved putting a sample in a cup full of carbon tetrachloride to dissolve the oil (which was then measured with IR light). We didn’t know that carbon tet is quite toxic when inhaled; OSHA nailed us. Dry cleaners had switched to perchloroethylene years before, so we did the same. The data was now meaningless – we couldn’t measure any oil at all. Did you know most perc has some alcohol in it, and alcohol is somewhat opaque at near-IR frequencies? We knew neither of those things, and spent considerable effort before the customer casually mentioned it, as if it were a fact everyone knew.
Those were both examples of bad science.
Noise torments us. There’s a signal, for sure, but it’s masked by all kinds of mechanical, sample and electronic creaks and groans. Can you extract it from the noise? Not knowing is akin to bad science. Filters are swell but might distort or delay the signal to an unacceptable degree.
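A minimal sketch of that filtering trade-off, using made-up data (a noisy step and a simple moving average; none of this comes from the systems described above): the filter knocks down the noise, but a causal moving average of length N delays the signal by about (N-1)/2 samples.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200)
step = (t >= 100).astype(float)              # clean step edge at sample 100
noisy = step + rng.normal(0.0, 0.2, t.size)  # buried in noise

N = 21  # filter length
# Causal moving average: each output uses only current and past samples.
smoothed = np.convolve(noisy, np.ones(N) / N)[: t.size]

# The noise shrinks by roughly sqrt(N)...
print(np.std(noisy[40:80]), np.std(smoothed[40:80]))

# ...but the filtered edge crosses 0.5 roughly (N-1)/2 = 10 samples
# after the true edge. Whether that latency is acceptable depends on
# what the signal feeds: a display, maybe; a fast control loop, maybe not.
print(int(np.argmax(smoothed > 0.5)))
```

Longer filters buy more noise rejection at the cost of more delay; there is no free lunch, only a trade you must make consciously.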
We can build systems of extraordinary sensitivity. The LIGO gravity wave detector can measure variations in length to 1/10,000 the diameter of a proton. At a cost of $1.1B. Amazing for sure, but not a marketable product. In the real world we have sensitivity limits that must be carefully considered.
Due to NDAs I have to be vague, but many times I’ve seen teams stymied for years because the scientists are still tweaking the detector. Or the chemistry. Or one of a hundred other things. Then there are the groups that are writing code while simultaneously exploring the science, finding that motors don’t move properly due to poorly understood control laws, or that the jets clog because no one knew the fluid being dispensed would coagulate.
A surprisingly common problem occurs when many data points are read, and a result computed by evaluating a polynomial. That equation’s coefficients were determined by a calibration: measuring many known samples and then using a least-squares regression. That works! Sometimes. But with a narrow-enough range of expected outputs and a sufficiently high-order polynomial, you can correlate anything to anything, even when there is no innate correlation.
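Here’s a sketch of how that goes wrong, with invented data (a truly linear relationship plus noise; the degree-9 fit and the sample values are illustrative, not from any real calibration). The high-order polynomial threads every calibration point perfectly, then goes haywire just outside the calibrated range.

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 10)                     # 10 "known samples"
y = 2.0 * x + 0.5 + rng.normal(0.0, 0.05, x.size) # truly linear, plus noise

lin = np.polyfit(x, y, 1)   # honest model
poly = np.polyfit(x, y, 9)  # degree 9 through 10 points: threads them all

# In-sample, the overfit polynomial looks "better"...
lin_err = np.max(np.abs(np.polyval(lin, x) - y))
poly_err = np.max(np.abs(np.polyval(poly, x) - y))
print(lin_err, poly_err)    # poly_err is essentially zero

# ...but just past the calibration range it falls apart.
x_new = 1.1
true_new = 2.0 * x_new + 0.5
print(abs(np.polyval(lin, x_new) - true_new))   # small
print(abs(np.polyval(poly, x_new) - true_new))  # large
```

A perfect fit to the calibration set says nothing about how the curve behaves between or beyond the points you measured; it may only prove you memorized the noise.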
Or, there was the group building a system that mixed a number of chemicals to create a reagent. They didn’t know that in certain concentrations the mixture could be explosive.
The fire was costly, made worse as there were no off-site backups of the code.
Bad science is a mistake similar to my Top Ten Number 9: Jumping Into Coding Too Quickly. Poor requirements are a big component of both, but number 7 is, at its core, about hubris. We KNOW this stuff; there’s no need to do any research. How hard can it be?
Turns out, often pretty darn hard.