Harry Potter doesn't code (but Murphy does) - Embedded.com


Computers have become so cheap and powerful that almost any system — mechanical, electrical, or chemical — probably needs to involve them, which means software has to be written.

Unfortunately, there's been a habit of inventing pretentious titles for programmers, particularly “software engineer.” Calling fiddling around with websites “Software Engineering” is not Millennial, it's delusional. (“Fiddling” is the least offensive term I could find for this activity; I was tempted to be much ruder.)

I firmly believe that the term “software engineer” should be reserved for people who work on compilers and other system software, or include software in real engineering projects (I suppose that, at a sufficient scale — Google, for example — websites might qualify).

The point is that a real engineer's mistakes potentially kill people or cause substantial damage. The problems of software development are quite well understood, but unfortunately mostly by people no longer working in the industry. That's why these problems keep being repeated. Most of them are caused by the limitations of human psychology, and the classic work on that, Gerald M. Weinberg's The Psychology of Computer Programming, dates back to 1971.

Readers of this article will probably know Arthur C. Clarke's aphorism: “Any sufficiently advanced technology is indistinguishable from magic.” Similarly, any sufficiently obscure problem will tend to generate “magic thinking” about its possible solution.

There's much current hand-waving about “coding” (mostly by people who wouldn't recognize code if it bit them on the ankle), but this is actually about the least important part of a project's effort. Code has to work, at a minimum, but generating working, maintainable, useful code in a reasonable timescale comes from good problem definition, organized development processes, and sensible management.

A fairly new practice that will help define project scope, improve specifications, and aid with quality assurance is called “test-driven development.” The idea is that, before writing a line of code, you should set out in as much detail as possible the cases the system will have to deal with, and what the results should be.

One thing that tends to get overlooked is the percentage of the effort that will have to be devoted to extreme conditions (e.g., errors), which is likely to outweigh the work associated with the normal cases.

At first, the system will fail every test, but as development proceeds, fewer and fewer tests will fail. When none of them do, the work is finished (for the original definition of “finished”). Ideally, automate the process.
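As a minimal sketch of the idea (the parsing function and its cases are hypothetical, invented here for illustration), Python's built-in unittest module lets the test list be written first and the whole suite automated; note how the error cases outnumber the normal one, just as the effort estimate above predicts.

```python
import unittest

def parse_reading(text):
    """Parse a sensor reading like '23.5C' into degrees Celsius.

    The error cases in the test suite below were specified before
    this body was written; a first, empty implementation fails
    every one of them.
    """
    if not text:
        raise ValueError("empty reading")
    if not text.endswith("C"):
        raise ValueError("missing unit suffix")
    try:
        value = float(text[:-1])
    except ValueError:
        raise ValueError("non-numeric reading") from None
    if not -273.15 <= value <= 1000.0:
        raise ValueError("reading out of physical range")
    return value

class TestParseReading(unittest.TestCase):
    # The one normal case.
    def test_valid_reading(self):
        self.assertEqual(parse_reading("23.5C"), 23.5)

    # Extreme conditions: most of the suite, as the article predicts.
    def test_empty_input(self):
        with self.assertRaises(ValueError):
            parse_reading("")

    def test_missing_unit(self):
        with self.assertRaises(ValueError):
            parse_reading("23.5")

    def test_non_numeric(self):
        with self.assertRaises(ValueError):
            parse_reading("xyzC")

    def test_below_absolute_zero(self):
        with self.assertRaises(ValueError):
            parse_reading("-300C")

if __name__ == "__main__":
    unittest.main()  # run the whole suite automatically
```

Run under any CI system, “finished” then has an objective meaning: zero failures against the agreed list of cases.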

Anyone who's been in the trade for a while knows that there are unlikely to be any silver bullets in the way of programming languages (except for specialized problems, where domain-specific languages have been developed, like R for statistics). For “silver bullets,” see Fred Brooks' The Mythical Man-Month.

The best language to use for most problems is one known by someone who understands the problem thoroughly (provided the language is not utterly obscure, which can lead to future problems with maintenance). Just beware of amateur development procedures.

Ever since the days when scientists swapped trigonometric functions on paper tape, the key to programming productivity has been doing as little of it as possible. Any processes that can be defined in a generic fashion should be stored in subroutine or module libraries to specialize the core language. The module library resource provided in CPAN is one of the reasons Perl can be so powerful, and this idea has spread to other languages as well.
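The principle can be sketched in Python, whose standard library plays the role CPAN plays for Perl (the routine names here are invented for illustration): a generic routine is written once, kept in a shared module, and specialized for each job rather than rewritten.

```python
from functools import partial
from statistics import mean, median

# A generic report routine, written once and kept in a shared module...
def summarize(readings, *, aggregate, label):
    """Reduce a list of numeric readings to one labelled figure."""
    return f"{label}: {aggregate(readings):.2f}"

# ...then specialized per use instead of being rewritten each time.
mean_report = partial(summarize, aggregate=mean, label="mean")
median_report = partial(summarize, aggregate=median, label="median")

data = [3.0, 1.0, 4.0, 1.0, 5.0]
print(mean_report(data))    # mean: 2.80
print(median_report(data))  # median: 3.00
```

The custom code per job shrinks to the two `partial` lines; everything else is reused, thoroughly tested library material.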

I once worked with a business software developer who coded in Assembler and who reckoned that he could produce software faster than if he used COBOL. He was able to generate systems quickly because he had the most important functions for standard tasks stored in thoroughly tested modules. The only custom code required was to combine and specialize these modules for each client. He had essentially developed his own language for system generation. It's great if this emerges as a by-product of other development, but resist any temptation to divert the project to creating a new language (academics are susceptible to this), unless your name is Brian Kernighan, of course.

For want of a better word, “architecture” is key to complex systems that work. Read Herbert Simon's classic The Sciences of the Artificial and you will probably start recognizing how many natural processes essentially function in layers. They are most easily assembled from simple components that do one job and talk to each other in a standard way.

Proverbs from other fields of engineering, like “Simplicate and add lightness,” are equally relevant to software. The cheapest, lightest, most reliable component in a system is the one that is not there. If a service is not running, it can't be compromised. Unfortunately, simplicity is easy to recognize and understand, but hard to achieve. Consider security at the start, and bake it into the design. Any attempt to bolt it on afterwards will produce an expensive and vulnerable kludge.

Rather than appreciate and preserve elegance, the software industry has a tendency to pile Pelion on Ossa, with ever more shininess to attract money and obscure function (object orientation is a fine example, as is the proliferation of web frameworks and IDEs).

A case study of how simple stuff accretes functions over time is the history of my favorite editor. Once upon a time, children, computer users worked at noisy terminals, which had keyboards and printed on continuous paper. That meant that everything that was entered or output in a session was visible.

A simple line editor called ex made life (barely) tolerable in that environment. Then “glass teletypes” came along, and everything scrolled off the top of the screen. A very clever man called Bill Joy wrote a “Visual Interface” (vi) to ex that made life with glass teletypes tolerable again. Bill wrote it in a bit of a hurry, so it was far from perfect, but it was good enough. Many people worked on vi (or clones with cute names like elvis), improving the code and adding features.

One of the developments was called vim (for “Vi Improved”). Over time, vim has evolved and acquired a full-featured scripting language called vimscript. It can now be used as a presentation tool, including simple animations. I'm not sure if it has fulfilled the prophecy that all software ultimately acquires the ability to read Net news. Fortunately, Bram Moolenaar, the original author, has exercised enough self-restraint that it's still a decent editor.

Alan Rocker started working as a trainee programmer on the LEO III, a machine produced by English-Electric-LEO-Marconi, which had the misfortune to be in direct competition with the IBM 360 range. (The remains of the computer business were taken over by ICL, which got eaten by Fujitsu.)

The route to riches in those days was projected to be the “software house,” which was much like today's startup scene. When Britain switched from pounds, shillings, and pence to a decimal currency, all the first and second generation systems had to be rewritten. This produced a boom in demand for programmers in 1970 (next seen in the years leading up to Y2K), but the subsequent recession was painful.

After some years working on mainframes and around software packages (the industry had realized that package sales scaled better than body shops), Alan switched to consulting. Around that time, minicomputers and then microcomputers began to be useful, especially when connected (even at 300 baud). Alan investigated them and began the transition to Unix.

Working as a system administrator in the 1990s, just as the Web began to change the landscape again, Alan ran across Perl. Teaching that and Unix/Linux has occupied most of his time since then.

An interest in development productivity, which started back in the 1970s, has led Alan to the conclusion that there are very few developments in the business, but there's a lot of hype and repackaging. That's mostly because management's behavior (as opposed to pronouncements) makes it clear that they don't actually care. Sadly, he says, the same is true of security.

3 thoughts on “Harry Potter doesn’t code (but Murphy does)”

  1. “As someone who writes firmware for certified safety critical devices, I applaud your statement regarding the broad and loose use of the term “Software Engineer”…”

  2. “In defence of the software bod I would like to add that often their first need of requirements uncovers the business's lack of a *documented* process that is *followed*. If you were to give what was asked for then it could not be used.”
