The information theorist

While recently reading The Information: A History, a Theory, a Flood (Harper Collins, 2013) by James Gleick, I was reminded of something that I had almost forgotten: April 30 would have been the 99th birthday of Claude Shannon (he was born in 1916), the father of modern information theory. Given his contributions, it is a day that deserves to be more widely noted, and I had promised myself I would not forget it.

James Gleick's book chronicles the emergence of information theory and the role Claude Shannon and others have had in its development. (Source: Harper Collins)

I went back to Gleick's book as an antidote to all the recent blah-blah in the media about yet another just-graduated college kid being called a genius because he had one bright idea and the good luck to be noticed and funded by venture capitalists.

Reading about Shannon's life reminds me of what true genius is all about. With one exception – building a training computer to be sold to businessmen who wanted to learn more about computers – Shannon never founded any companies and never made the obscene amounts of money some of today's "one trick ponies" have made. But he was rich in ideas that have fundamentally affected not only computers and communications but also our understanding of how the world in general works, in fields such as molecular genetics.

Claude Shannon (Source: http://history-computer.com/)

Much of Shannon's fame in electronics and communications rests on his seminal 1948 paper "A Mathematical Theory of Communication." It dealt with the fundamental limits on signal processing as they relate to compressing data and reliably storing and communicating it. But Shannon's impact was much broader. Indeed, almost every aspect of modern computing and communications has felt the influence of his ideas.
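To make that compression limit concrete, here is a minimal C sketch of my own (not from the paper or the article) that computes the Shannon entropy of a symbol distribution; the source coding theorem says no lossless code can do better, on average, than this many bits per symbol. The probabilities used are hypothetical.

/* Shannon entropy of a discrete source: H = -sum(p * log2(p)).
 * H is the lower bound, in bits per symbol, on any lossless
 * compression of that source. Illustrative sketch only. */
#include <math.h>
#include <stdio.h>

static double shannon_entropy(const double *p, size_t n)
{
    double h = 0.0;
    for (size_t i = 0; i < n; i++) {
        if (p[i] > 0.0)              /* 0 * log2(0) is treated as 0 */
            h -= p[i] * log2(p[i]);
    }
    return h;
}

int main(void)
{
    /* Hypothetical, heavily skewed four-symbol alphabet. */
    const double p[] = { 0.5, 0.25, 0.125, 0.125 };
    printf("Entropy: %.3f bits/symbol\n",
           shannon_entropy(p, sizeof p / sizeof p[0]));
    return 0;
}

For this distribution the program prints 1.750 bits/symbol, so a two-bit fixed-length code wastes a quarter of a bit per symbol.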


Claude Shannon demonstrates machine learning. http://techchannel.att.com/play-video.cfm/2010/3/16/In-Their-Own-Words-Claude-Shannon-Demonstrates-Machine-Learning  (Source: AT&T Archives and History Center, Warren, NJ)

I first became familiar with Shannon and his work in undergraduate classes in electrical engineering and computer science, at about the time the Intel 4040 was introduced, while I was working at the California Institute of Technology. Back then, most of our attention went to trying to get our heads around how to use his noisy-channel coding theorem to figure out the best possible error-correcting methods and the level of compression to apply to a signal before the information in it became indistinguishable from noise.
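The capacity formula that grew out of that theorem, C = B * log2(1 + S/N), still sets the ceiling on those trade-offs. The short C sketch below (mine, not from the article) evaluates it for a hypothetical voice-grade link; the bandwidth and signal-to-noise figures are assumptions chosen for illustration.

/* Shannon-Hartley channel capacity: C = B * log2(1 + S/N).
 * C is the maximum rate (bits/s) at which data can cross an
 * additive-white-Gaussian-noise channel of bandwidth B and
 * signal-to-noise ratio S/N with arbitrarily low error probability. */
#include <math.h>
#include <stdio.h>

static double capacity_bps(double bandwidth_hz, double snr_db)
{
    double snr_linear = pow(10.0, snr_db / 10.0);   /* dB -> linear */
    return bandwidth_hz * log2(1.0 + snr_linear);
}

int main(void)
{
    double b   = 3000.0;   /* hypothetical 3 kHz voice-grade channel */
    double snr = 30.0;     /* hypothetical 30 dB signal-to-noise ratio */
    printf("Capacity: %.0f bits/s\n", capacity_bps(b, snr));
    return 0;
}

With those assumed numbers the limit works out to roughly 30 kbits/s: push the data rate past that, or compress away the redundancy the error-correcting code needs, and the received signal is no longer distinguishable from noise.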

But until I read Gleick's book I did not realize how widespread the impact of information theory has been, or what a major role Shannon played in the development of natural language processing, cryptography, neurobiology, molecular genetics, quantum computing, and pattern detection, to name a few. Shannon's ideas did not just trigger others to look at the impact of information theory on these new areas. He was there in the middle of the fray, generating key papers in all of them. One key paper he wrote, for example, explained the role information theory plays in how DNA replicates without too many mistakes in its coding of gene sequences.

Even before he started his life's work on information theory in the late '40s and early '50s, his graduate-school studies in the 1930s foreshadowed much of what modern computing is all about. His work then has served as the foundation of today's practical digital circuit design techniques. A paper he wrote at the age of 23, based on his master's thesis, dealt with the use of Boolean logic in switching circuits for computers and communications and was awarded the American Institute of Electrical Engineers Award in 1939.
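As a toy illustration of that idea (my own, not drawn from Shannon's thesis or the article), the C fragment below builds a one-bit half adder purely from Boolean operations. The mapping it relies on, from Boolean expressions to switching hardware, is exactly what his thesis established.

/* A one-bit half adder expressed in Boolean logic:
 * sum = a XOR b, carry = a AND b. Shannon showed that relay and
 * switching circuits implement Boolean algebra, which is why
 * expressions like these translate directly into digital hardware. */
#include <stdio.h>

static void half_adder(int a, int b, int *sum, int *carry)
{
    *sum   = a ^ b;   /* XOR gate */
    *carry = a & b;   /* AND gate */
}

int main(void)
{
    for (int a = 0; a <= 1; a++) {
        for (int b = 0; b <= 1; b++) {
            int s, c;
            half_adder(a, b, &s, &c);
            printf("%d + %d -> sum=%d carry=%d\n", a, b, s, c);
        }
    }
    return 0;
}

Chaining such adders, with the carry of one stage feeding the next, is the same compositional step that takes a handful of relays to a full arithmetic unit.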

There was also much in his life that would make him a hero to today's DIYers. As a teenager, when the telegraph was still a primary means of long-distance communication, he built radio-controlled model airplanes and even a wireless telegraph link to a friend's house, at a time when wired communication was the norm. Later in life, for his own amusement, he built such things as a rocket-powered flying disc, a motorized pogo stick, a flame-throwing trumpet, and a calculator that did arithmetic in Roman numerals (I, II, …V…X…), which he called THROBAC I.
