
Remembering the memories

Jack Ganssle - January 31, 2010

September's column was a walkabout through the history of memory devices. I highlighted a few favorite examples and promised more in October.

I forgot. Dropped a bit somewhere. Maybe a cosmic ray induced a neuronal parity error with no ECC. There's a lesson there in the importance of building reliable memory... or in the fallibility of middle-aged brains.
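The joke rests on a real mechanism, of course: a parity bit can detect a single flipped bit, though unlike ECC it can't correct it. A minimal sketch in C, with made-up values, of how an even-parity check catches a one-bit upset:

    #include <stdint.h>
    #include <stdio.h>

    /* Even parity: returns 1 if the byte has an odd number of 1 bits,
       so storing the result alongside the data makes the total even. */
    static uint8_t even_parity(uint8_t byte)
    {
        uint8_t parity = 0;
        while (byte) {
            parity ^= (byte & 1u);
            byte >>= 1;
        }
        return parity;
    }

    int main(void)
    {
        uint8_t stored   = 0x5A;            /* value written to memory        */
        uint8_t check    = even_parity(stored);
        uint8_t readback = stored ^ 0x08;   /* pretend a cosmic ray flipped bit 3 */

        if (even_parity(readback) != check)
            printf("parity error: wrote 0x%02X, read 0x%02X\n", stored, readback);

        return 0;
    }

Detection only, no correction--which is why my October column stayed forgotten.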

Memory is endlessly fascinating. The development of language meant human experience could be passed between the generations in the form of oral traditions. From safety-critical information that taught youngsters how to avoid the poisonous berries evolved stories, some true, some not, and some just family history. The biblical begats probably closely mirror everyman's yearning to live forever, if only in familial memory.

In Guns, Germs, and Steel (W.W. Norton & Co., 1999), Jared Diamond explains how the invention of agriculture created wealth in primitive society; wealth that meant not everyone had to scramble for a living. Petty rulers emerged, kings who could live off the labor of their subjects. Other bureaucrats emerged, too, and probably some of these folks, free of the endless demands of the farm, learned to inscribe symbols on various surfaces. The invention of writing changed, well, maybe not a whole lot for the average Joe. It wasn't till the 17th century that large percentages of the Western world were able to read and write. Here in Carroll County, Maryland, the local historical society found that in a "sizeable percentage" of marriage licenses issued between 1910 and 1915, the betrothed signed with an X, as they were unable to form their own signatures.

Memory is what makes a computer different from digital logic circuits. Because of it, we can build one relatively simple circuit--the CPU--that can implement virtually any sort of functionality. The notion of a stored-program computer emerged from Alan Turing's work in the 1930s and was independently invented by Konrad Zuse in Germany at about the same time. The great computer engineers J. Presper Eckert and John Mauchly (who developed ENIAC and so much more) also came up with the idea, but John von Neumann got credit for the invention when his paper describing the EDVAC was circulated in an early form sans acknowledgement of the machine's two designers.

Zuse, a largely unheralded genius till recent years, went on to build the first programmable digital computer in 1941. The Z3 didn't use FPGAs, transistors, or tubes; it employed 2,400 relays instead. According to http://ed-thelen.org/comp-hist/Zuse_Z1_and_Z3.pdf, about 1,800 of those implemented the memory bank of 64 twenty-two-bit words. That's puzzling, as 64 x 22 is only 1,408, but imagine building memory out of relays! Needless to say, the machine, running at around a 5-Hz clock rate, wouldn't give even a digital watch a run for its money.

There's a relay-based computer running today in Oregon. Harry Porter built one that used relays for register storage, though he employed SRAM for main memory. See http://web.cecs.pdx.edu/~harry/Relay/. A cheat, perhaps, but it illustrates the maxim that the memory is always much bigger than the processor.
