From light bulbs to computers

Jack Ganssle, September 29, 2011

From Patent 307,031 to a computer laden with 100,000 vacuum tubes, these milestones in the first 70 years of electronics made the MCU possible.

Where a calculator on the ENIAC is equipped with 18,000 vacuum tubes and weighs 30 tons, computers in the future may have only 1,000 vacuum tubes and perhaps weigh 1-1/2 tons.
    —Popular Mechanics, 1949

November marks the 40th anniversary of the microprocessor, the circuit element that truly revolutionized the world and gave birth to the field of embedded systems. This is the second installment about this historic development. You can find the first at www.eetimes.com/4219412 (Jack Ganssle, "The microprocessor at 40—The birth of electronics," September 2011, p. 39).

Thomas Edison raced other inventors to develop the first practical electric light bulb, a rather bold undertaking considering there were neither power plants nor electrical wiring to support lighting. In the early 1880s his bulbs glowed, but the glass quickly blackened. Trying to understand the effect, he inserted a third element and found that current flowed in the space between the filament and the electrode. It stopped when he reversed the polarity. Although he was clueless about what was going on--it wasn't until 1897 that J. J. Thomson discovered the electron--Edison filed for a patent and set the idea aside. Patent 307,031 was for the first electronic device in the United States. Edison had invented the diode.

Which lay dormant for decades. True, Ambrose Fleming did revive the idea and found applications for it, but no market appeared.

In the first decade of the new century, Lee De Forest inserted a grid between the anode and cathode, creating what he called an Audion. With this new control element a circuit could amplify, oscillate, and switch--the basic operations of electronics. Now engineers could create radios of fantastic sensitivity, send voices over tens of thousands of miles of cable, and switch ones and zeroes in microseconds.

The vacuum tube was the first active element, and its invention was the beginning of electronics. Active elements are the core technology of every electronic product. The tube, the transistor, and, I believe, now the microprocessor are the active elements that transformed the world over the last century.

Even though the tube was a stunning achievement, it was useless in isolation. De Forest did create amplifiers and other circuits using tubes. But the brilliant Edwin Armstrong was probably the most seminal early inventor of electronic circuits. Although many of his patents were challenged and credit was often given to others, Armstrong was the most prolific of the early radio designers. His inventions included both the regenerative and super-regenerative receivers, the superheterodyne (a truly innovative approach used to this day), and FM.

As radio was yet another communications technology, not unlike SMS today, demand soared as it always does for these killer apps. Western Electric made the VT1, one of the first commercial tubes. In 2011 dollars, they were a hundred bucks a pop. But war is good for technology. In the four years of World War I, Western Electric alone produced a half million tubes for the U.S. Army. By 1918 over a million a year were being made in the U.S., more than 50 times the pre-conflict numbers; prices quickly fell. Just as cheaper semiconductors always open new markets, falling tube prices meant radios became practical consumer devices.

Radio
Start an Internet publication and no one will read it until there's "content." This is hardly a new concept; radio had little appeal to consumers unless there were radio shows. The first regularly-scheduled broadcasts started in 1919. There were few listeners, but with the growth of broadcasters, demand soared. RCA sold the earliest consumer superheterodyne radio in 1924; 148,000 flew off the shelves in the very first year. By the crash in 1929, radios were common fixtures in American households and were often the center of evening life for the family, rather like TV is today.

Nearly until the start of World War II, radios were about the most complex pieces of electronics available. An example is RCA's superb RBC-1 single-conversion receiver, which had all of 19 tubes. But tubes wore out, they could break when subjected to a little physical stress, and they ran hot. It was felt that a system with more than a few dozen would be impractically unreliable.

One hundred tubes and counting
In the 1930s, it became apparent that global conflict was inevitable. Governments drove research into war needs, resulting in what I believe is one of the most important contributions to electronic digital computers, and a natural extension of radio technology: RADAR (radio detection and ranging). The U.S. Army fielded its first RADAR apparatus in 1940. The SCR-268 had 110 tubes… and it worked. At the time tubes had a lifetime of a year or so, so one would fail every few days in each RADAR set. ("Set" is perhaps the wrong word for a system that weighed 40,000 kg and required six operators.) Over 3,000 SCR-268s were produced.
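That failure rate follows directly from the numbers. A back-of-the-envelope sketch (the constant-rate, independent-failure model here is my simplifying assumption, not the article's):

```python
# Back-of-the-envelope reliability arithmetic for the SCR-268.
# Assumes independent tube failures at a constant rate.

TUBE_LIFETIME_DAYS = 365   # "a lifetime of a year or so"
NUM_TUBES = 110            # tubes in one SCR-268 set

# With N tubes each failing at rate 1/lifetime, the whole set sees
# a failure, on average, every lifetime/N days.
days_between_failures = TUBE_LIFETIME_DAYS / NUM_TUBES
print(f"One tube failure roughly every {days_between_failures:.1f} days")
# → about 3.3 days, matching "one would fail every few days"
```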

Ironically, Sir Henry Tizard arrived in the U.S. from Britain with the first useful cavity magnetron the same year the SCR-268 went into production. That tube revolutionized RADAR. By war's end, the 10-cm wavelength SCR-584 was in production (1,700 were manufactured) using 400 vacuum tubes. The engineers at MIT's Rad Lab had shown that large electronic circuits were not only practical, they could be manufactured in quantity and survive combat conditions.

Happy Birthday, 4004
Jack Ganssle's series in honor of the 40th anniversary of the 4004 microprocessor.

Part 1: The microprocessor at 40--The birth of electronics
The 4004 spawned the age of ubiquitous and cheap computing.

Part 2: From light bulbs to computers 
From Patent 307,031 to a computer laden with 100,000 vacuum tubes, these milestones in the first 70 years of electronics made the MCU possible.

Part 3: The semiconductor revolution
In part 3 of Jack's series honoring the 40th anniversary of the microprocessor, the minis create a new niche—the embedded system.


Like all major inventions, computers had many fathers--and some mothers. Rooms packed with men manually performed calculations in lockstep to produce ballistics tables and the like; these gentlemen were known as "computers." But WWII pushed most of the men into uniform, so women were recruited to perform the calculations. Many mechanical machines were created by all sorts of inventive people like Charles Babbage and Konrad Zuse. But about the same time the Rad Lab was doing its magnetron magic, what was probably the first electronic digital computer was built. The Atanasoff-Berry computer was fired up in 1942, used about 300 tubes, was not programmable, and though it did work, was quickly discarded.

Meanwhile the Germans were sinking ships faster than the Allies could build replacements; in June of 1942 alone they sent 800,000 tons to the sea floor. Britain was starving and looked doomed. The Allies were intercepting much of the Wehrmacht's signal traffic, but it was encrypted using a variety of cryptography machines, the Enigma being the most famous. The story of the breaking of these codes is long and fascinating, and owes much to pre-war work done by Polish mathematicians, as well as captured secret material from two U-boats. The British set up a code-breaking operation at Bletchley Park, where they built a variety of machines to aid their efforts. An electro-mechanical machine called the Heath Robinson (named after a cartoonist who drew very complex devices meant to accomplish simple tasks, à la Rube Goldberg) helped break the "Tunny" code produced by the German Lorenz ciphering machine. But the Heath Robinson was slow and cranky.

Sixteen hundred tubes
Tommy Flowers realized that a fully electronic machine would be both faster and more reliable. He figured the machine would need between 1,000 and 2,000 tubes, and despite the advances being made in large electronic RADAR systems, few thought such a massive machine could work. But Flowers realized that a big cause of failures was the thermal shock tubes endured on power cycles, so he planned to leave his machine on all of the time. The result was Colossus, a 1,600-tube behemoth that immediately doubled the code breakers' speed. It was delivered in January of 1944. Those who had been hostile to Flowers's idea were so astonished they ordered four more in March. A month later they were demanding a dozen.

Colossus didn't break the code; instead it compared the encrypted message with another data stream to find likely settings of the encoding machine. It was probably the first programmable electronic computer. Programmable by patch cables and switches, it didn't bear much resemblance to today's stored program machines. Unlike the Atanasoff-Berry machine, the Colossi were useful and essential to the winning of the war.
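The principle--score candidate settings statistically rather than decrypt and read the result--is easy to illustrate. Here is a deliberately simplified toy: a hypothetical single-byte XOR "cipher" of my own invention, nothing like the actual Lorenz machine or the Tunny attack. Every candidate key is tried against the intercepted stream, and the one that makes the most common plaintext symbol (the space) reappear wins.

```python
# Toy version of Colossus's statistical approach: don't decrypt and
# inspect the text; instead, try every candidate setting and count
# how often a known-frequent plaintext symbol shows up.
# (Hypothetical 1-byte XOR cipher -- far simpler than Lorenz/Tunny.)

plain = b"the quick brown fox jumps over the lazy dog " * 20
key = 0x5A                              # the secret "machine setting"
cipher = bytes(b ^ key for b in plain)  # the intercepted traffic

# Spaces are the most common byte in English text, so the candidate
# key that maximizes the space count is almost certainly correct.
best = max(range(256),
           key=lambda k: bytes(b ^ k for b in cipher).count(ord(" ")))
print(f"Most likely setting: {best:#04x}")   # → 0x5a
```

The point, as with Colossus, is that no human ever reads a trial decryption; a simple count over the stream reveals the setting.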

Churchill strove to keep the Colossus secret and ordered that all be broken into pieces no bigger than a man's hand, so nearly 30 years slipped by before its story came out. Despite a dearth of drawings, though, a working replica has been constructed and is on display at the National Museum of Computing at Bletchley Park, a site on the "must visit" list for any engineer. (But it's almost impossible for Americans to find; you'll wind up dizzy from the succession of roundabouts one must navigate.) A rope barrier isolates visitors from the machine's internals, but it's not hard to chat up the staff and get invited to walk inside the machine. That's right—inside. These old systems were huge.

Eighteen thousand tubes
Meanwhile, here in the colonies John Mauchly and J. Presper Eckert were building the ENIAC, a general-purpose monster of a machine containing nearly 18,000 vacuum tubes. It weighed 30 tons, consumed 150 kW of electricity, and had five million hand-soldered joints. ENIAC's purpose was to compute artillery firing tables, which it accelerated by three orders of magnitude over other contemporary approaches. ENIAC didn't come on line until the year after the war, but due to the secrecy surrounding Colossus, ENIAC long held the title of the first programmable electronic computer. It, too, used patch panels rather than a stored program, although later improvements gave it a ROM-like store. One source complained it could take "as long as three weeks to reprogram and debug a program." Those were the good old days. Despite the vast number of tubes, according to Eckert the machine suffered a failure only once every two days. That's about the reliability of my Windows machine.
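Eckert's figure implies remarkably long-lived tubes. A quick sketch (assuming independent tube failures at a constant rate, which is my simplification, not a claim from the article):

```python
# What ENIAC's quoted reliability implies about its tubes, assuming
# independent failures at a constant rate (a simplifying assumption).

NUM_TUBES = 18_000       # "nearly 18,000 vacuum tubes"
SET_MTBF_DAYS = 2        # "a failure only once every two days"

# If 18,000 tubes together produce one failure every two days, each
# individual tube must survive, on average, 18,000 times that long.
tube_mtbf_years = SET_MTBF_DAYS * NUM_TUBES / 365
print(f"Implied per-tube MTBF: about {tube_mtbf_years:.0f} years")
# → about 99 years, far beyond the year-or-so lifetime quoted for
#   wartime RADAR tubes
```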

During construction of the ENIAC, Mauchly and Eckert proposed a more advanced machine, the EDVAC. It had a stored-program architecture, now called the von Neumann architecture because John von Neumann, a consultant to the Moore School of Electrical Engineering (University of Pennsylvania) where the ENIAC was built, had written a report summarizing EDVAC's design--without bothering to credit Mauchly or Eckert for the idea. Whether this omission was deliberate or a mistake (the report was never completed and may have been circulated without von Neumann's knowledge) remains unknown, although much bitterness resulted.

(In an eerily parallel case, the ENIAC was the source of patent bitterness. Mauchly and Eckert had filed for a patent for the machine in 1947, but in the late 1960s Honeywell sued over its validity. John Atanasoff testified that Mauchly had appropriated ideas from the Atanasoff-Berry machine. Ultimately the court ruled that the patent was invalid. Computer historians still debate the verdict.)

Meanwhile British engineers Freddie Williams and Tom Kilburn developed a cathode ray tube that could store data in charge wells on the tube's glass. A metal pickup detected the presence of ones and zeroes. The Williams tube they built was the first random access digital memory device. But how does one test such a product? The answer: build a computer around it. In 1948 the Small Scale Experimental Machine, nicknamed "The Baby," went into operation. It used three Williams tubes, one being the main store (32 words of 32 bits each) and two for registers. Though not meant as a production machine, The Baby was the first stored-program electronic digital computer. It is sometimes called the Mark 1 Prototype, as the ideas were quickly folded into the Manchester Mark 1, the first practical stored-program machine. That morphed into the Ferranti Mark 1, which was the first commercial digital computer.

I'd argue that the circa-1951 Whirlwind computer was the next critical development. Whirlwind was a parallel machine in a day when most computers operated in bit-serial mode to reduce the number of active elements. Although it originally used the slow Williams tubes, Whirlwind was converted to core memory--the first time core memory was incorporated into a computer. Core dominated the memory industry until large semiconductor devices became available in the 1970s. Whirlwind's other important legacy is that it was a real-time machine, and it demonstrated that a computer could handle RADAR data. Whirlwind's tests convinced the Air Force that computers could be used to track and intercept Cold War enemy bombers. The government, never loath to start huge projects, contracted with IBM and MIT to build the Semi-Automatic Ground Environment (SAGE), based on the 32-bit AN/FSQ-7 computer.

One hundred thousand

SAGE was the largest computer ever constructed, each installation using over 100,000 vacuum tubes and a half acre of floor space. Twenty-six such systems were built, and unlike so many huge programs, SAGE was delivered and used until 1983. The irony is that by the time SAGE came on-line in 1963, the Soviets' new ICBM fleet made the system mostly useless.

For billions of years, Mother Nature plied her electrical wiles. A couple of thousand years ago, the Greeks developed theories about electricity, most of which were wrong. With the Enlightenment, natural philosophers generated solid reasoning, backed up by experimental results, that exposed the true nature of electrons and their behavior. In only the last flicker of human existence has that knowledge been translated into the electronics revolution, possibly the defining characteristic of the 20th century.

Next month: The Semiconductor Revolution.

Jack Ganssle (jack@ganssle.com) is a lecturer and consultant specializing in embedded systems development. He has been a columnist with Embedded Systems Design and Embedded.com for over 20 years.

This article provided courtesy of Embedded.com and Embedded Systems Design magazine.
This material was first printed in Embedded Systems Design magazine.
Copyright © 2011 UBM--All rights reserved.