Claude Elwood Shannon 

Born: April 30, 1916, Gaylord, Michigan, USA
Died: February 24, 2001, Medford, Massachusetts, USA

Principal papers

1938: "A Symbolic Analysis of Relay and Switching Circuits"

1940: MIT master's thesis on the use of Boole's algebra to analyse and optimise relay switching circuits

1948: "A Mathematical Theory of Communication"

1949: "Communication Theory of Secrecy Systems"

1950: "Programming a Computer for Playing Chess"


Related Subjects
cryptology

 

Achievement

Noted as a founder of information theory, Claude Shannon combined mathematical theory with engineering principles to set the stage for the development of the digital computer. The term 'bit', now used for the individual units of information a computer processes, emerged from Shannon's research in the 1940s.

He published A Mathematical Theory of Communication in the Bell System Technical Journal (1948). This work founded the subject of information theory, and in it he proposed a linear schematic model of a communications system. He also gave a method of analysing a sequence of error terms in a signal to find their inherent variety, matching them to the designed variety of the control system.

In 1952 he devised an experiment illustrating the capabilities of telephone relays.

 

Biography

 

A Midwesterner, Claude Shannon was born in Gaylord, Michigan in 1916. From an early age he showed an affinity for both engineering and mathematics, and he graduated from the University of Michigan with degrees in both disciplines. For his advanced degrees he chose to attend the Massachusetts Institute of Technology, where he wrote a thesis on the use of Boole's algebra to analyse and optimise relay switching circuits. He joined Bell Telephone Laboratories in 1941 as a research mathematician and remained there until 1972.

At the time, MIT was one of a number of prestigious institutions conducting research that would eventually form the basis for what is now known as the information sciences. Its faculty included mathematician Norbert Wiener, who would later coin the term cybernetics to describe the work in information theory that he, Shannon and other leading American mathematicians were conducting, and Vannevar Bush, MIT's dean of engineering, who in the early 1930s had built an analog computer called the Differential Analyzer.

The Differential Analyzer was developed to calculate complex equations that tabulators and calculators of the day were unable to address. It was a mechanical computer, using a series of gears and shafts to engage cogs until the equation was solved. Once it completed its cycle, the answer to the equation was obtained by measuring the changes in position of its various machine parts. Its only electrical parts were the motors used to drive the gears.

With its crude rods, gears and axles, the analyzer looked like a child's erector set. Setting it up to work one equation could take two to three days; solving the same equation could take just as long, if not longer. In order to work a new problem, the entire machine, which took up several hundred square feet of floor space, had to be torn apart and reset to a new mechanical configuration.

While at MIT, Shannon studied with both Wiener and Bush. Noted as a ‘tinkerer,’ he was ideally suited to working on the Differential Analyzer, and would set it up to run equations for other scientists. At Bush’s suggestion, Shannon also studied the operation of the analyzer’s relay circuits for his master’s thesis. This analysis formed the basis for Shannon’s influential 1938 paper "A Symbolic Analysis of Relay and Switching Circuits," in which he put forth his developing theories on the relationship of symbolic logic to relay circuits. This paper, and the theories it contained, would have a seminal impact on the development of information processing machines and systems in the years to come.

Shannon's paper provided a glimpse into the future of information processing. While studying the relay switches on the Differential Analyzer as it worked through an equation, Shannon noted that the switches were always either open or closed: on or off. This led him to think about a mathematical way to describe the open and closed states, and he recalled the logical theories of mathematician George Boole, who in the mid-1800s advanced what he called the logic of thought, in which all statements were reduced to a binary system of zeros and ones.

Boole's theory, which formed the basis for Boolean algebra, held that a logical statement carries the value one if it is true and zero if it is false. Shannon theorized that a switch in the on position would equate to a Boolean one; in the off position, to a zero.

By reducing information to a series of ones and zeros, Shannon wrote, information could be processed using on-off switches. He also suggested that these switches could be connected in such a way as to perform more complex operations, going beyond simple 'yes' and 'no' statements to 'and', 'or' and 'not' operations.
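
To make the correspondence concrete, here is a small illustrative sketch in Python (not Shannon's own notation; the circuit functions are invented for this example). A closed switch is treated as 1 (True) and an open switch as 0 (False): switches wired in series behave as a logical 'and', switches in parallel as an 'or', and a normally-closed contact as a 'not'.

    # Illustrative sketch: relay switches as Boolean values.
    # Closed switch = 1 (True), open switch = 0 (False).

    def series(a, b):
        # Two switches in series conduct only if both are closed: AND.
        return a and b

    def parallel(a, b):
        # Two switches in parallel conduct if either is closed: OR.
        return a or b

    def inverted(a):
        # A normally-closed contact conducts when the switch is off: NOT.
        return not a

    # Truth table for the series (AND) circuit:
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", int(series(a, b)))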

Shannon graduated from MIT in 1940 with a master's degree in electrical engineering and a doctorate in mathematics. After graduation, he spent a year as a National Research Fellow at the Institute for Advanced Study in Princeton, where he worked with mathematician and physicist Hermann Weyl. In 1941, Shannon joined Bell Telephone Laboratories, where he became a member of a group of scientists charged with developing more efficient methods of transmitting information and improving the reliability of long-distance telephone and telegraph lines.

Shannon believed that information was no different from any other quantity and could therefore be manipulated by a machine. He applied his earlier research to the problem at hand, again using Boolean logic to develop a model that reduced information to its simplest form: a binary system of yes/no choices, which could be represented by a 1/0 code. By applying suitable codes to information as it was transmitted, the noise picked up during transmission could be minimized, thereby improving the quality of the transmission.

In the late 1940s, Shannon's research was presented in The Mathematical Theory of Communication, which he co-authored with mathematician Warren Weaver. It was in this work that Shannon first introduced the word 'bit', formed from the first two letters and the last letter of 'binary digit' and coined by his colleague John W. Tukey, to describe the yes-no decision that lay at the core of his theories.
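
As a rough modern restatement of the quantity Shannon's theory measures in bits (this sketch is illustrative and not drawn from the book itself), the average information of a source is H = -sum of p*log2(p) over the probabilities of its symbols: a fair yes/no choice carries exactly one bit, a biased one carries less.

    import math

    def entropy_bits(probabilities):
        # Shannon entropy: H = -sum(p * log2(p)), in bits per symbol.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(entropy_bits([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy_bits([0.9, 0.1]))   # biased coin: about 0.47 bits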

In the 1950s, Shannon turned his efforts to developing what were then called 'intelligent machines': mechanisms that emulated the operations of the human mind to solve problems. Of his inventions during that time, the best known was a maze-solving mouse called Theseus, which used magnetic relays to learn how to maneuver through a metal maze.
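
A minimal sketch of the idea behind Theseus, with an invented maze and purely illustrative code (the real device was electromechanical): explore by trial and error, and let a small memory, standing in for the relay bank, record the move that worked at each position so that later runs can follow the learned route directly.

    MAZE = [            # '#' walls, '.' open, 'G' goal
        "#####",
        "#..G#",
        "#.#.#",
        "#...#",
        "#####",
    ]
    MOVES = {"N": (-1, 0), "S": (1, 0), "E": (0, 1), "W": (0, -1)}

    def explore(pos, visited, memory):
        # Depth-first trial and error; remember the direction that led to the goal.
        r, c = pos
        if MAZE[r][c] == "G":
            return True
        visited.add(pos)
        for name, (dr, dc) in MOVES.items():
            nr, nc = r + dr, c + dc
            if MAZE[nr][nc] != "#" and (nr, nc) not in visited:
                if explore((nr, nc), visited, memory):
                    memory[pos] = name   # the "relay" for this square latches the winning move
                    return True
        return False

    memory = {}
    explore((3, 1), set(), memory)
    print(memory)   # learned direction for each square on the successful path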

Shannon’s information theories eventually saw application in a number of disciplines in which language is a factor, including linguistics, phonetics, psychology and cryptography, which was an early love of Shannon’s. His theories also became a cornerstone of the developing field of artificial intelligence, and in 1956 he was instrumental in convening a conference at Dartmouth College that was the first major effort in organizing artificial intelligence research.

He was afflicted by Alzheimer's disease, and he spent his last few years in a Massachusetts nursing home.

He is survived by his wife, Mary Elizabeth Moore Shannon; a son, Andrew Moore Shannon; a daughter, Margarita Shannon; a sister, Catherine S. Kay; and two granddaughters.

 

Chronology

 

1916

Claude Shannon was born in Gaylord, Michigan, USA

1938

Shannon published his paper "A Symbolic Analysis of Relay and Switching Circuits".

1940

Received his master's degree in electrical engineering and his PhD in mathematics from the Massachusetts Institute of Technology, where he wrote a thesis on the use of Boole's algebra to analyse and optimise relay switching circuits.

1941 - 1972

Research mathematician at Bell Telephone Laboratories.

1948

Published "A Mathematical Theory of Communication" in the Bell System Technical Journal.

1949

On 27 March Shannon married Mary Elizabeth Moore.

1949

Published "Communication Theory of Secrecy Systems".

1950

Devised a chess-playing program and an electronic mouse that could solve maze problems. The chess-playing program was described in the paper "Programming a Computer for Playing Chess".

1952

Devised an experiment illustrating the capabilities of telephone relays.

1956

Was instrumental in convening a conference at Dartmouth College that was the first major effort in organizing artificial intelligence research

Visiting professor of communication sciences and mathematics at the Massachusetts Institute of Technology

Shannon's paper led to the first chess game played by the Los Alamos MANIAC computer

Published a paper showing that a universal Turing machine may be constructed with only two states.

1957 - 1977

Appointed to the Faculty of Mathematics at MIT

1958

Donner Professor of Science at MIT

1977

Retired

2001

Died of Alzheimer's disease on February 24, at the age of 84.

 

 

Honors and awards


Shannon's honors included the American Institute of Electrical Engineers Award in 1940, the National Medal of Science in 1966, the Audio Engineering Society Gold Medal in 1985, and the Kyoto Prize in 1985.

 

 


 
