
The untapped tools market

Jack Ganssle - November 02, 2015

I recently did a survey of firmware development processes. The results are published in my newsletter here.

One interesting result was that many developers don’t use some of the tools that have been proven to be effective.

Consider Lint. It has been around for decades. Lint is a syntax checker on steroids – it will find things your compiler will never identify. Lint will work across modules to identify errors. 

Yet only a quarter of us use it on “most modules.” 62% rarely use it at all.

Can you spot the error in the following code? (This is from Gimpel’s old “Bug of the Month” series, and is used with permission).

#include <stdlib.h>
#include <stdio.h>
#include <string.h>

char *obtain_data()
    {
    char *buf = malloc( 100 );
    FILE *f;
    size_t length = 0;

    if( !buf ) return NULL;
    f = fopen( "football.dat", "r" );
    if( !f ) return NULL;
    *buf = '\0';
    while( fgets( buf+length, 100, f ) )
        {
        length = strlen(buf);
        realloc( buf, length + 100 );
        }
    (void)fclose( f );
    return buf;
    }

The answer is here.

Of course you can. So can Lint. The difference is that you took at least a few seconds. Or maybe minutes. Lint finds it in a millisecond. Lint examines tens of thousands of lines of code in seconds. How long does a similar analysis take you?
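For the record, here's one way to repair the routine. The central defect is that realloc()'s return value is discarded, so the grown buffer is lost (and the old pointer may be dangling); freeing the buffer when fopen() fails is an additional leak a good tool will also flag. This is a sketch of a fix, not necessarily the only one:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

char *obtain_data( void )
    {
    char *buf = malloc( 100 );
    char *tmp;
    FILE *f;
    size_t length = 0;

    if( !buf ) return NULL;
    f = fopen( "football.dat", "r" );
    if( !f )
        {
        free( buf );                      /* original leaked buf here */
        return NULL;
        }
    *buf = '\0';
    while( fgets( buf+length, 100, f ) )
        {
        length = strlen( buf );
        tmp = realloc( buf, length + 100 ); /* original discarded this pointer */
        if( !tmp )
            {
            free( buf );
            (void)fclose( f );
            return NULL;
            }
        buf = tmp;
        }
    (void)fclose( f );
    return buf;
    }
```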

Like any tool, Lint takes time to master. But if you’re in a sinking boat, it’s generally better to fix the leak before bailing. Otherwise you’ll never stop bailing.

Lint is cheap at $389.

Then there’s static analysis – the use of tools that will find bugs that normally would surface at runtime, like a buffer overflow. Of those who use these tools, 38% report they “provide real value.” Another 52% get “decent but not stellar results.” Only 10% say they are a waste of money. I suspect if one polled about the virtue of using a hammer to drive nails there would be 10% who consider the hammer a waste of money compared to using one’s forehead.
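A classic example of the kind of defect these tools catch before runtime is an out-of-bounds write that looks perfectly reasonable on a quick read. This snippet is hypothetical (the function name and buffer size are made up for illustration); the commented-out line is the sort of off-by-one a static analyzer flags at build time but that testing may never exercise:

```c
#include <string.h>

#define ID_LEN 8

void copy_id( char dst[ID_LEN], const char *src )
    {
    strncpy( dst, src, ID_LEN );    /* may fill all 8 bytes, no terminator */
    /* dst[ID_LEN] = '\0';             off-by-one: index 8 is out of bounds */
    dst[ID_LEN - 1] = '\0';         /* correct: last valid index is 7 */
    }
```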

Despite the success of these, 70% of us never use the tools. Another 12% only use them on some of the code.

Part of the reason may be expense; a static analyzer will run you about the cost of a Toyota. However, ever more of the embedded space is being regulated (medical, avionics, etc.) and as our systems control more of the world, more will be safety-critical. Often the regulations mandate static analysis.

Cyclomatic complexity tools aren’t used, either, at least by most of us. Measuring and controlling function complexity reduces risk, and gives a quantitative bound against which one can measure the efficacy of testing. But 75% of respondents don’t measure it at all.

One can spend any amount of money for a complexity tool. Or, there are zero-dollar solutions (for instance this one). So price certainly isn’t a barrier to adoption.
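For anyone who hasn't measured it: a function's cyclomatic complexity is just the number of binary decision points plus one. A toy illustration (the function and its limits are invented for the example):

```c
#include <stddef.h>

/* Three decision points -- the loop test and two ifs -- so the
   cyclomatic complexity is 3 + 1 = 4. A common rule of thumb is
   to keep functions at or below 10. */
int count_out_of_range( const int *samples, size_t n )
    {
    int bad = 0;
    size_t i;
    for( i = 0; i < n; i++ )        /* decision 1 */
        {
        if( samples[i] < 0 )        /* decision 2 */
            bad++;
        else if( samples[i] > 100 ) /* decision 3 */
            bad++;
        }
    return bad;
    }
```

Complexity also bounds testing: a function with v(G) = 4 needs at least four test cases to cover its independent paths.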

The non-use of these tools is similar to the use of metrics. About 60% don’t collect any metrics about the software engineering process. Of those that do, more than half never use the data. I think engineering is about numbers; we need to collect them to understand how we can improve. But what’s worse than not measuring anything is taking data and ignoring it.

The bottom line is that there is an enormous untapped market for tools. Vendors could be rolling in the dough if they were more effective at marketing their products.

Software tools like these are hard to justify to the boss. Most vendors do publish case studies, but absent controlled experiments it’s hard to prove an expensive purchase will save money. They sell to a tough crowd: Engineers want proof. All are skeptical of a salesperson’s vague hand-waving. It’s hard, though, to create a number of controlled experiments where a real-life (as opposed to a toy) system is built by similarly-skilled developers both with and without a particular tool.

That’s one reason for surveys. They are not tremendously accurate, since people read questions in different ways. But a survey is like an impressionistic painting: it conveys out-of-focus images that nonetheless reflect reality.

What’s your take on tools like these?


Jack G. Ganssle is a lecturer and consultant on embedded development issues. He conducts seminars on embedded systems and helps companies with their embedded challenges, and works as an expert witness on embedded issues. Contact him at jack@ganssle.com. His website is www.ganssle.com.
