When computers were human

Jack Ganssle - August 03, 2010

Computers were not always digital. Nor were they always analog or mechanical. For far longer than electronic data processing has been around, the word "computer" described a human whose job was to make calculations.

A book by IEEE Computer columnist David Alan Grier titled When Computers Were Human (Princeton University Press, 2005) gives a marvelous history of the precomputer computer. Like Gaul, the book is divided into three parts:

  • 1682 to 1880, when nearly all computation was devoted to astronomy.
  • 1880 to 1930, when some machinery, like mechanical calculators, became available, and computation found acceptance in many other fields.
  • 1930 to 1964, when "computers" became an independent discipline with professional standards.

It starts off with the expected 1758 reappearance of Halley's Comet. No one was quite sure when it would show up, but scientists accepted that Newton's Laws were in the driver's seat, and therefore the comet's orbit should be predictable. The motion of the comet was a classic three-body problem, which is not amenable to analysis; numerical means were required to figure the contributions of the then-known outer planets to the comet's orbit around the sun. Three friends worked together at a kitchen table for almost two years, calculating the comet's position at a particular time, then advancing the body a few degrees in its orbit and recomputing where it would be next.
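That compute-advance-recompute loop is, at heart, stepwise numerical integration. As a minimal modern sketch of the same idea (not the trio's actual procedure; the normalized units, step size, and semi-implicit Euler update are illustrative assumptions), here is a body propagated under inverse-square gravity one small step at a time:

```python
import math

def step(pos, vel, mu, dt):
    """Advance one time step under inverse-square gravity toward the origin.

    Semi-implicit Euler: update velocity from the current acceleration,
    then update position with the new velocity. (Choices here are for
    illustration only, not the 18th-century method.)
    """
    x, y = pos
    r = math.hypot(x, y)
    ax, ay = -mu * x / r**3, -mu * y / r**3  # gravitational acceleration
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (x + vx * dt, y + vy * dt), (vx, vy)

# Circular orbit in normalized units: mu = 1, radius 1, orbital speed 1.
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(10_000):
    pos, vel = step(pos, vel, mu=1.0, dt=0.001)

# After many small steps the body should still be near radius 1.
print(math.hypot(*pos))
```

The point is the structure, not the physics: each iteration uses the state just computed to produce the next one, which is exactly why the work could be split among several human computers, each handling one piece of every step.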

In November 1758 their calculations were complete. The trio predicted that Halley's Comet would reach perihelion the following April 15, plus or minus 30 days. The actual result: March 13. Not bad, but later analysis showed they had benefited from a happy confluence of self-canceling errors. (Uranus and Neptune had not yet been discovered, which skewed the results as well.)

Probably the biggest contribution the three made was the division of mathematical labor. Each worked out one portion of the calculation for every iteration of the comet's position. That parceling of work, which the industrial revolution brought to manufacturing shortly thereafter, was the insight needed to handle the much more complex problems tackled by human computers later. And ironically, that is the very problem we haven't solved in the modern multicore era.

Over time other needs surfaced. As is all too common, wars fueled computers' work. Shell trajectories depended on numerous factors like the barrel's elevation, wind direction and speed, barometric pressure, and more. Soldiers under fire have neither the skill nor the time to do advanced math, so armies of computers created ballistics tables. ENIAC, arguably the first electronic digital computer, was created to compute these tables, as new versions were always needed to match advances in gunnery.
