The really early days of computing - Embedded.com

# The really early days of computing

Let's start at the very beginning
This column is based on an earlier article that appeared in Micro Cornucopia magazine, ca. 1986. The editor had asked us to offer stories of how things were in “the early days of computing.” I expect he meant “the early days of microcomputing,” but I elected to delve back even further in time. This is an updated and greatly expanded version of that article.

I don't know how to tell you this — some readers may find it disturbing — but we haven't always had PCs. We haven't even always had microcomputers. Heck, there was a time when we didn't have computers at all! If anyone needed to compute something, we did it the old-fashioned way: with pencil and paper.

An elegant mathematical proof can be a thing of beauty forever, but when it comes down to the important stuff, like the position of a farmer's property lines, the amount of wine in a barrel, or the trajectory of a moon rocket, scientists need numbers. Computing those numbers is the craft of the applied mathematician. The computations may be based on the most complex of mathematical analyses, but in the end, they boil down to simple arithmetic. We don't usually need the 15-digit sort of accuracy we computer types have come to expect — Kepler would have killed for those — but we do need at least four or five digits, or else the result can get lost in the numerical noise.

Adding a column of five-digit numbers is easy enough — my father could do those in his head — but multiplying them is quite another matter. If you have more than a few products to compute, the process can be painful in the extreme and fraught with error. Early astronomers often spent months — even years — calculating the orbit of one comet.

So how do you multiply lots of five-digit numbers? Applied mathematicians have known the secret since the 1600s. In my high school math classes, they taught us the secret: logarithms. A goodly portion of my algebra class was spent teaching us how to read and interpolate a table of logarithms, and how to manipulate them to get an answer.

The first step in understanding logarithms is to recognize that, when we raise a number to a power, that power need not be an integer. We all know that:

10^1 = 10  and  10^2 = 100    (1)

But somewhere between that 1 and the 2, say, there must be a different power such that:

10^p = 50    (2)

That number happens to be:

p = 1.69897…    (3)

For any positive and nonzero number x , there's a value p such that:

x = 10^p    (4)

This value is called the logarithm (log, for short) of x , and we write:

p = log(x)    (5)

The logs of various numbers are hard to compute by hand, but the computation only needs to be done once for each number (more precisely, once for each of a great many closely spaced numbers) and then tabulated for the rest of us. We've had tables of logarithms since the early 1600s. Between the tabulated points, we interpolate (remember proportional parts?).
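To see how proportional parts worked, here's a sketch in modern Python. The four-entry table below is a toy excerpt I computed with `math.log10` standing in for the hand computation; a real book of tables ran to thousands of entries.

```python
import math

# A toy four-entry excerpt from a base-10 log table, tabulated at
# closely spaced points (real tables had thousands of entries).
log_table = {
    2.00: 0.30103,
    2.01: 0.30320,
    2.02: 0.30535,
    2.03: 0.30750,
}

def table_log(x, table=log_table):
    """Look up log10(x) by linear interpolation ("proportional
    parts") between the two nearest tabulated points."""
    keys = sorted(table)
    lo = max(k for k in keys if k <= x)
    hi = min(k for k in keys if k >= x)
    if lo == hi:
        return table[lo]          # x falls exactly on a table entry
    frac = (x - lo) / (hi - lo)   # the "proportional part"
    return table[lo] + frac * (table[hi] - table[lo])

print(table_log(2.015))   # interpolated value
print(math.log10(2.015))  # exact value, for comparison
```

Over a span this narrow, the straight-line interpolation agrees with the true logarithm to five places, which is why the method was good enough for real work.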

The second key lies in the relationship:

x · y = 10^p · 10^q = 10^(p+q)    (6)

In terms of the log functions:

log(xy) = log(x) + log(y) = p + q    (7)

Given two numbers x and y , we can get their product by adding their logarithms. It may seem a roundabout way of doing things. We must access the log table three times, once to get p , once to get q , and a third time to get the inverse log (antilog) of the product. Even so, applied mathematicians preferred this approach because adding is an easier and safer operation than multiplying.
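For readers who want to see the bookkeeping, here is the three-lookup procedure in Python, with `math.log10` and the power operator standing in for the printed log and antilog tables (the two five-digit inputs are invented for illustration):

```python
import math

# Multiplying two five-digit numbers the table-and-logs way:
# three table lookups and one addition.
x, y = 31416, 27183
p = math.log10(x)          # first lookup: p = log(x)
q = math.log10(y)          # second lookup: q = log(y)
product = 10 ** (p + q)    # third lookup: the antilog of p + q
print(product)             # agrees with the exact 853,981,128 to rounding
```

The only arithmetic done "by hand" is the single addition `p + q`; everything else is looking things up, which is exactly why the applied mathematicians liked it.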

Log(x) is not logarithm base 10
High-order computing languages like Fortran, C, and C++ all have a function called log(x). Unfortunately, confusingly, and most perversely, this is NOT the logarithm base 10, but the natural log, which mathematicians call ln(x). The natural log uses base e = 2.718282… Yet another example where compiler writers got things wrong. Check your own environment to be sure what the function is giving you.
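A quick check in Python, whose `math` module follows the same convention as the C library:

```python
import math

# log() is the NATURAL log (base e), not the base-10 "table" log.
print(math.log(100))    # 4.605..., NOT 2 -- this is ln(100)
print(math.log10(100))  # 2.0 -- the base-10 logarithm of the tables
print(math.e)           # 2.718281828..., the base of the natural log
```

If your environment offers only the natural log, divide by ln(10) to recover the base-10 value: log10(x) = ln(x)/ln(10).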

My trig book had tables that gave not only the values of the trig functions, but of their logarithms as well. The logs were used most, because trig functions tend to multiply things. The tables were typically given to the nearest tenth of a degree, which certainly seemed to be enough for anybody.

Needless to say, solving a relatively simple trig problem was not a simple matter … it took lots of time and patience, and errors were easy to make. Doing something like calculating the points in a single 3-D drawing would have been an overwhelming task.

We had another approach to problems like that: graphics. A drafting class was always considered required for anyone planning a technical career. It was there that we learned how to handle T-squares and triangles, and how to keep our pencils needle sharp without breaking them (you use sandpaper).

Higher education
College was more of the same. I'll never forget that first day in the college bookstore, where I was outfitted for a career in engineering. I watched, bug-eyed, as the clerk stacked up my standard-issue equipment: a set of drafting instruments, a drawing board, T-square, triangles and French curves, and … (wait for it) that most wonderful of all calculating instruments … the slide rule .

Figure 1 The slide rule.

Editor's note: Picture from Highlights from The Computer Museum Report, Volume 18 — Winter 1987, the Computer History Museum of Boston, now in Mountain View, California. This picture and many more are posted on Ed Thelen's website.

About a week after inventing logarithms, Napier realized that he could mechanize the process of adding them. You can do the same thing with ordinary numbers. Figure 2 shows how. Take two rulers, and place them one above the other as shown. Find the point on the lower ruler that corresponds to a number — call it x . Slide the upper ruler so its zero mark lines up with that point.

Now look along the upper ruler until you find the second number, y . Look below it on the lower ruler, and there's your sum.

The slide rule works the same way, only the things you're adding are logarithms. The scales are inscribed, not with the logs, but the numbers associated with them. Think “log scale” on an Excel chart, and you'll see what I mean. Other scales included the trig functions and log tables, thereby rendering the books of tables (almost) obsolete. The K & E slide rule had all that inscribed in 10 inches of porcelained bamboo. To multiply, divide, take square roots (or any other root or power, for that matter), and solve trig problems, all you had to do was to manipulate the slider and hairline “cursor” on that magic instrument.

My first technical class in college was a course in how to use the slide rule. The slide rule never left our collective sides, housed as it was in a scabbard hanging from our belts like prehistoric light sabers.

The one operation the slide rule couldn't do was to add/subtract. For that we still had to do things by hand, although in graduate school I finally got a neat little pocket adding machine (based on a design by Pascal), that helped immensely. As a sidelight, I entered a sports car rally as a navigator (the “sports car” was a custom '41 Chevy). We won, thanks to the invincible computing power of my slide rule and adding machine.

As the years progressed, so did our proficiency with the slide rule. Our performances and grades in our classes depended upon it, and we studied it earnestly. Before a quiz, we would carefully adjust it like a soldier cleans his rifle before a big battle. Those three strips of bamboo had to be spaced just right, to slide freely but still stay where they were put. We actually lubricated them with talcum powder for maximum speed without overheating (!). You could always tell the guys who were serious about their grades, by the talcum powder stains on their shirts.

Our skills in graphics were sharpened, too. In those days, the worth of an engineer depended just as much on his ability to draw a straight line or to plot a graph, as on his “book-larnin.” Many problems that we now do by computer were solved in those days with graphs and nomograms. One of my favorite courses was Descriptive Geometry. There, we learned to do all kinds of magical things with a pencil and T-square.

Example 1: Suppose you're given two points in three-space and need to know how far apart they are. You can apply the Pythagorean theorem twice, to get:

d = sqrt((x2 - x1)^2 + (y2 - y1)^2 + (z2 - z1)^2)    (8)

Or, you can draw the line in front, top, and side views. Now project it in the right direction, and you're looking at it side-on. So just measure it.
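The "apply Pythagoras twice" computation in Eq. (8) looks like this in Python, with `math.hypot` doing each two-dimensional step (the sample points are my own):

```python
import math

def distance(p1, p2):
    """3-D distance by applying the Pythagorean theorem twice:
    once in the horizontal plane, once in the vertical one."""
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    horiz = math.hypot(x2 - x1, y2 - y1)  # first application
    return math.hypot(horiz, z2 - z1)     # second application

print(distance((0, 0, 0), (3, 4, 12)))    # prints 13.0
```

The graphical method replaced both square roots with a single measurement off the drawing, which is precisely its appeal.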

Using descriptive geometry, we solved real-life problems, like estimating the volumes of cuts and fills on a highway or the distance that a power cable would miss a hillside.

One day, our prof gave us four sets of coordinates defining two straight lines skewed in 3-space. The problem was to find the miss distance between them. You did this by generating projections of the lines onto various planes until one of them appeared end-on, as a point.

While I was working diligently on the problem, Francis Pugh, who was smarter and faster than the rest of us, announced that the distance was zero; the lines intersected. “That's impossible,” shouted the prof. “I picked those points at random. Do you realize what the odds are that I would randomly pick a pair of lines that intersect? The distance may be small, but it can't be zero. Go back and do it again.”

Now Francis was smart, but he wasn't a politician. He said, “I don't care what the odds are. I've done it right the first time, and the lines intersect !”

As the conversation got more and more heated, it was clear that Francis was in deep trouble. He was winning the argument but losing the war, and the prof's face was getting redder and redder. As the two elevated the argument to a shouting match, they were too engrossed to notice that, all over the classroom, the rest of us were quietly erasing the lines that we, too, had by this time found to intersect. One of the nice features of solving things graphically was that you could always warp the lines a bit when it seemed prudent.
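Today you could settle the argument analytically: the miss distance between two lines (through point a1 along direction d1, and through a2 along d2) is the projection of a2 - a1 onto their common perpendicular, d1 × d2. A sketch, with helper functions and sample points invented for illustration (it does not handle the parallel case, where the cross product vanishes):

```python
def sub(u, v): return tuple(a - b for a, b in zip(u, v))
def dot(u, v): return sum(a * b for a, b in zip(u, v))
def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def miss_distance(a1, d1, a2, d2):
    """Distance between two non-parallel lines in 3-space."""
    n = cross(d1, d2)                      # the common perpendicular
    return abs(dot(sub(a2, a1), n)) / dot(n, n) ** 0.5

# Two coplanar lines that really do intersect: distance is zero.
print(miss_distance((0, 0, 0), (1, 1, 0), (2, 2, 0), (1, -1, 0)))
```

No amount of warping the drawing fools the cross product, which is perhaps why Francis would have preferred it.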

Into the space age!
After college, I went to work for the space agency, NASA. I was going to help put men on the moon (which I did). My first day, I received the two tools of my trade: an 18-inch government-issue slide rule and a book of five-place trig tables.

See, NASA figured that the three-digit accuracy of the standard 10-inch slide rule just wouldn't cut it for space travel. In general, to get one more digit of accuracy you need a slide rule 10 times longer. But it just happened that the 10″ rule could almost get four digits (it could, over part of its range), so that increasing the length to 18″ was just enough to get that precious extra digit.

Even more exciting, NASA had real desktop calculators! Electro-mechanical ones, which were sort of adding machines on steroids. There were a number of brands around. Ours were Fridens. The Friden was a huge machine by today's standards — as big as an old standard typewriter and much, much heavier. Inside were hundreds of little gears and levers that would drive a Swiss watchmaker into paroxysms of ecstasy.

Photo from Old Computer Museum.

The Friden worked much like the adding machines used for businesses, except it would multiply and divide, as well as add and subtract. There was a keyboard having 10 columns of 10 digit keys, and a carriage like a typewriter. On the carriage were numbered wheels that spun. You typed a number in by punching (that's the right word — no electronics or power-assist here) one key in each column, and then punched the “go” key. To the accompaniment of the noise of a threshing machine, the carriage slewed, the wheels spun, and in a matter of decaseconds, there was your answer. Division was quite a sight to behold, and on those few machines that could do square roots, the noise level rose alarmingly as more and more wheels got into the act.

But most of us didn't have access to the square root machines, and a measure of your proficiency with a Friden was how quickly you could find a square root on a non-square-root Friden. There was a neat algorithm for it that I've never forgotten (no, it's not Newton's method, it's an exact, noniterative algorithm). I also learned the famous “Friden March,” a calculation that caused the carriage to chunk along in a neat, “rah, rah, rah-rah-rah” rhythm.
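One classic exact, noniterative square-root method from the desk-calculator era is digit-by-digit subtraction of odd numbers (whether this is the very trick meant above is an assumption on my part; the sketch shows the flavor of it, not the Friden keystrokes):

```python
def isqrt_odd(n):
    """Integer square root by successive subtraction of odd numbers,
    working through n two digits at a time. Exact and noniterative:
    no starting guess, no convergence test."""
    # Split n into base-100 "digit pairs", most significant first.
    pairs = []
    while n:
        pairs.append(n % 100)
        n //= 100
    pairs.reverse()

    root, remainder = 0, 0
    for pair in pairs:
        remainder = remainder * 100 + pair
        # Subtract the odd numbers 20*root+1, 20*root+3, ... ;
        # the count of subtractions is the next digit of the root.
        odd = 20 * root + 1
        digit = 0
        while remainder >= odd:
            remainder -= odd
            odd += 2
            digit += 1
        root = 10 * root + digit
    return root

print(isqrt_odd(54756))   # prints 234, since 234 * 234 = 54756
```

Every step is an addition or subtraction, which is exactly the kind of work a non-square-root Friden could do all day.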

Despite the horrible noises the Fridens emitted — sounds reminiscent of the clashing of a nonsynchromesh truck transmission — they always gave reliable answers. Over a period of five or six years, I never once saw a Friden give a wrong answer.

The big advantage of the Friden, other than its tendency to get the correct answer, was that we could calculate to as many as 10 digits of accuracy — unheard of until then. But most of our calculations were done to only five digits or so, because that's as many as were in the trig tables. Later I managed to get a book of six-place tables. It was a big book.

Looking back upon the space race and all the high-tech things involved in it, it helps to remember that, at least through projects Mercury and Gemini, the work was mostly done with slide rules and Fridens.

As in college, a lot of our output in those days was graphs, and again a large part of our skill was our ability to plot microscopic dots at the right places on a piece of log-log paper, and then fair a smooth curve through them. Back in college, we tended to plot graphs with anywhere from three to seven points on them, but NASA needed much more accuracy. So a lot of our time went into calculating the data for the many points to be plotted. And that's where I learned about spreadsheets .

These were the original spreadsheets, of course — real sheets of 14″ x 17″ paper, ruled into rows and columns. What we did was to organize the “input” data into one or more columns. Each subsequent column involved operations on previous columns. If you've ever used Excel, I don't have to explain further.
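In modern terms, such a sheet is just a column-by-column pipeline. A minimal sketch in Python, with invented columns (polar inputs in the first two, each later column an operation on earlier ones):

```python
import math

# A paper spreadsheet in miniature: rows of input data, and each
# subsequent "column" computed from previous columns.
rows = [(1.0, 30.0), (2.0, 45.0), (3.0, 60.0)]  # cols 1-2: r, theta (deg)

sheet = []
for r, theta_deg in rows:
    theta = math.radians(theta_deg)  # col 3: degrees to radians
    x = r * math.cos(theta)          # col 4: from cols 1 and 3
    y = r * math.sin(theta)          # col 5: from cols 1 and 3
    sheet.append((r, theta_deg, theta, x, y))

for row in sheet:                    # the finished, ruled-paper "sheet"
    print("  ".join(f"{v:8.4f}" for v in row))
```

On the 14″ x 17″ paper version, each of those intermediate columns was written down by hand, one cell at a time.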

For simple problems with not too many entries, and for problems not needing great accuracy, we would use the good ol' 18″ slide rule. For more complex problems, we would use the book of trig functions and the Friden.

For really long problems, we used Donna.

Donna was the department secretary, who doubled as a calculator operator. Donna supported some seven engineers, and had the patience of Job. She would sit there all day, day after day, crunching out those numbers, which we would then plot up and analyze. Donna prided herself on using 10-digit accuracy for everything, even if the input data was only good to three digits. If any errors were going to be introduced, it wasn't going to be at her end.

One day I got a really big problem — one that seemed too big even for Donna to deal with, considering her other duties. I asked a colleague, “What do you do with problems that are too big for Donna?” He said, very matter-of-factly, “Oh, you take them to the Computer Room.”

You should have seen my eyes light up. I had been reading all about the “Giant Brains” — had even learned to program one in college, though I never saw it (the school didn't actually own one). I couldn't wait to see how the folks in the Computer Room dealt with my problem. Eagerly I got directions from my colleague, prepared my data and rushed over to the building he described. Following his directions, I walked down the hall until I arrived at a set of double doors, with a large sign proclaiming, sure enough, “Computer Room.” From the other side of the door came a satisfying clattering of high-tech machinery, hard at work.

Holding my breath, I eased open the door.

Inside was a huge room. There must have been 300 desks, all arranged neatly in rows. At each desk sat a woman, and on each desk was a Friden. The women were the computers! I kid you not (sorry, no men were there).

I found out later that their official job description was “GS-2, Computer.”

In this JPL photo, notice that the “computers” ran the Fridens with their left hands and handled the paperwork with their right. The lady in the center of the photo is filling out a geen-you-wine spreadsheet. To view the full-size image, this link takes you to the photo and article on JPL and NASA’s web site.

Once I had gotten over the shock, I approached the “head computer.” She explained to me how things worked. You used the same spreadsheet format we used — and still use today — except that each column only involved a single math operation. If, for example, the first two columns were the inputs, x and y , then the header for column three might read:

x · y    (9)

After all the calculations were defined, you turned things over to the computers who filled the numbers in. The foreperson assigned different parts of the job to different women, depending on the load. For a really big job, she would keep several parts running in parallel. The first multitasking, multiprocessing computer system, I suppose.

It all actually went quite smoothly. The computers rarely made a mistake, and they used redundant calculation to catch any errors. They would even plot the results up for me, although I rarely used that service. My boss grumbled that they used dull pencils and didn't know how to interpolate. He and I both found that we could plot more accurately.

Progress is wonderful
Well, things didn't stay that way for long. We eventually did get real computers (the nonhuman kind): the IBM 701, followed by the 702, 704, 709, 7094, etc. By this time I was at a different job, I had learned how to program in FORTRAN and some other very peculiar languages, and I was building a bit of a reputation as a computer expert. I had built some pretty slick simulation programs, and we were flying imaginary spacecraft all around a simulated moon. The number of computations we performed in a day would have taken those human computers their whole lifetimes, multiprocessing or not. It was an exciting time. (For the record, I once calculated that I could have generated every one of the trajectories I'd done in four years, 10,000 times over, in the time it takes a modern PC to boot. We'll discuss why it takes a modern PC so long to boot, another day.)

In those days, I still had my slide rule (the 10″ one — NASA made me give back the 18″ one and the six-place trig tables when I left). I even had a 6″ rule for my shirt pocket, and a 1 1/2″ one as a tie clasp, for emergencies. But the slide rule got used less and less as better ways came along.

One day a colleague whom I'll call John came to see me. He said, “Jack, I've come up with a neat computer program that I'd like you to take a look at.”

“OK, John,” I said. “What does it do?”

“Well,” he replied, “Remember back in the good old days when we had to do computing by hand? Remember the way we used to make up those spreadsheets and turn them over to the computer ladies?”

I acknowledged that I had. We spent a little time congratulating ourselves for our progress, at having gotten away from such primitive methods.

John said, “Well, I've developed a computer program that works the same way. All you have to do is to define the formulas for each column of the spreadsheet and give the data. The computer does the calculations just like the computer ladies used to do and gives you a printout that looks just like a spreadsheet. I think it'll be just the ticket for those people who don't know how to program in FORTRAN. It will open up the use of computers to lots more people.”

I thought about it for all of 30 seconds, and said, “John, that's the dumbest idea I've ever heard.”

There was a moment of silence as John absorbed what I had just said. The sparkle in his eyes dimmed a bit. Crestfallen, he whispered, “Why?”

Now, in my defense you have to understand: in those days we were taught that computer time was precious — $600 per hour, at a time when $600 would buy more than a ticket to a rock concert. It was important, we were told, to keep the CPU busy doing productive work at all times. It was considered far more cost-effective to waste engineers' time than computer time.

So I explained, “John, now that we have electronic computers, we have to learn to do things their way. Anybody who plans to be an engineer in the '60s is going to have to learn to speak to computers in their language. You and I have learned to program so we can do that. What you're trying to do is to ask the computer to make up for the deficiencies of the engineer. You're forcing the computer to do extra work, just because the engineer is too lazy or too dumb to learn the computer's language. You're never going to sell an idea that uses a computer so inefficiently!”

As I spoke, you could see John slowly fall apart. His jaw fell slack, his shoulders slumped, and he actually seemed to age by years, right before my eyes. Finally he turned and left, a beaten and broken man.

I never saw John again. He sent me an example of the output of his program (I recall that it could do automatic graphing of its results, which was quite an innovation at the time). I promptly filed it under “dumb ideas.” I heard through the grapevine that John kept trying for awhile, halfheartedly, to interest someone in his spreadsheet program, but as I had predicted he was never able to do so, and he faded into obscurity, along with his program.

And that's why you had to wait 15 more years for VisiCalc, Lotus 1-2-3, and Excel.

Jack Crenshaw is a systems engineer and the author of Math Toolkit for Real-Time Programming. He holds a PhD in physics from Auburn University.
