The transistor: sixty years old and still switching
Sixty years ago this month, scientists at Bell Labs demonstrated the most important invention of the 20th century: the first real transistor.
It's hard to say when the electronics age started, but William Sturgeon's 1825 development of the electromagnet sowed the seeds of Joseph Henry's crude telegraph in 1830, the first electrical system used to communicate over long distances (about a mile). Just 14 years later, Samuel Morse sent a message by telegraph over a 40-mile link he had strung between Washington DC and Baltimore.
Considering the primitive nature of telegraphy at the time, it's astonishing just how quickly the demand grew. By 1851 Western Union was in business, and in the same decade Cyrus Field had connected the Old and New Worlds via a fragile cable that failed a mere three weeks after the first message was sent. But later attempts succeeded. Instantaneous transatlantic communication quickly lost its novelty.
Although Alexander Graham Bell's 1875 invention of the telephone is universally lauded today, it was a less-than-practical device until Thomas Edison came up with the carbon microphone two years later. The speaker's voice compressed a pack of carbon granules, changing the circuit's resistance and thus modulating the signal sent to the receiver.
A number of inventors soon came up with the idea of wireless transmission, codified by Guglielmo Marconi's 1896 patent and subsequent demonstrations. Like the telephone and the telegraph, early radios used neither CPUs, transistors, nor vacuum tubes. Marconi, drawing on the work of others, particularly Nikola Tesla, used a high voltage and a spark gap to induce electromagnetic waves in a coil and an antenna. The signals, impossibly noisy by today's standards, radiated all over the spectrum . . . but they worked. In fact, Titanic's famous SOS was broadcast using a 5 kW spark-gap set manufactured by the Marconi Wireless Telegraph Company.
The circuits were electrical, not electronic.
Telephone signals, though, degraded quickly over distance, while radio remained crude and of limited range. The world desperately needed devices that could control the flow of the newly discovered electron. About this time Ambrose Fleming realized that the strange flow of electricity in a vacuum that Edison had stumbled on could rectify an alternating current, which has the happy benefit of detecting radio waves. He invented the first simple vacuum-tube diode. But it didn't find much commercial success, due to high costs and the current needed by the filament.
In the first decade of the new century, Lee de Forest inserted a grid in the tube between the anode and cathode. With this new control element, a circuit could amplify, oscillate, and switch. Those are the basic operations of any bit of electronics. With the tube, engineers learned they could create radios of fantastic sensitivity, send voices over tens of thousands of miles of cable, and switch ones and zeroes in microseconds. During the four years of World War I, Western Electric alone produced a half million tubes for the U.S. Army. By 1918, over a million a year were being made in the U.S., more than fifty times the pre-war numbers.
Electronics was born.
Electronics is defined as "the science dealing with the development and application of devices and systems involving the flow of electrons in a vacuum, in gaseous media, and in semiconductors," and the word came into being at nearly the same time the tube was created. But that's a lousy definition. I think the difference between electrical and electronic circuits is that the latter use "active" elements: components that rectify, switch, or amplify. The very first active devices may have been cat's-whisker crystals, a bit of springy wire touching a raw hunk of galena that works as a primitive diode. I can't find much about their origins, but it seems these crystals first appeared shortly before Fleming did his pioneering vacuum tube research. It's rather ironic that this first active element, which predated the tube, was a semiconductor, yet nearly another half century passed before semiconductors were "discovered."
Radios originally sported just a few tubes but soon high-end units used dozens. In the late 1960s, I had a military-surplus 1940-era RBC radio receiver that had 19 tubes. Reputedly it cost $2,400 in 1940 (over $33k today). The $600 toilet seat is nothing new.
Increasing capability led then, as it still does today, to ever-escalating cries for more features, speed, and functionality. The invention of radar in World War II created heavier demands for active electronics; some sets used hundreds of tubes. Perhaps the crowning achievement of vacuum-tube technology was the 1946 ENIAC, which employed some 18,000 of them. The machine failed every two days. Clearly, the advent of digital technology had pushed tubes to their very limits. A new sort of active element was needed, something that produced less waste heat, used vastly fewer watts, and was reliable.
Serendipitously, the very next year Walter Brattain and John Bardeen (who, with William Shockley, won the 1956 Nobel Prize for this and related semiconductor work) invented the transistor. Though some claim this was the first "practical" such semiconductor, the Bell Labs scientists had actually constructed a point-contact transistor, a difficult-to-manufacture device that never saw widespread use and has long since been abandoned.