Claude Shannon

Claude Shannon (1916-2001) was a theoretical mathematician and electrical engineer who is generally considered one of the foundational researchers in computer and communications design. He studied at M.I.T., spent much of his career at Bell Laboratories, and later returned to M.I.T. as a professor. Recognized as a premier voice in the engineering community from the 1940s onward, Shannon had become a figure of some public and popular acclaim by the time of his retirement. An enormous number of resources about him exist on the web, both freely available and behind paywalls. In his twilight years, Shannon suffered from Alzheimer's disease.

Switching Algebra: Application of Boolean algebra to logic gate design (1938)

Shannon took a critical step toward the hardware design of computers in his 1938 MIT master's thesis[1], in which he applied Boolean algebra, a mathematical system introduced by George Boole in the mid-nineteenth century, to the design of logic gates in digital hardware[2]. In the context of digital hardware design, Shannon called Boolean algebra "switching algebra".
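
The correspondence Shannon drew can be illustrated with a small, anachronistic sketch (in Python, which of course postdates the thesis; the function names are ours, not Shannon's): Boolean operators stand in for relay contacts and gates, and algebraic identities justify replacing one circuit with a simpler equivalent.

 # Boolean operators modelling switching circuits (illustrative only; not from Shannon's thesis).
 def AND(a, b):   # two switches in series: current flows only if both are closed
     return a and b
 
 def OR(a, b):    # two switches in parallel: current flows if either is closed
     return a or b
 
 def NOT(a):      # a normally-closed relay contact
     return not a
 
 # The absorption law, a OR (a AND b) = a, means the two-gate circuit can be
 # replaced by a single switch. Checking every switch position confirms it.
 for a in (False, True):
     for b in (False, True):
         assert OR(a, AND(a, b)) == a
 print("absorption law verified: a OR (a AND b) == a")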

Founder of Information Theory

The field of information theory was launched in 1948 by Shannon's ground-breaking, two-part paper "A Mathematical Theory of Communication".[3] It was followed shortly by a book, The Mathematical Theory of Communication (University of Illinois Press), which has since been reprinted many times. Information theory treats messages and signals using techniques drawn from mathematical probability, linking discrete and continuous mathematics in ways that later proved helpful not only in communications and computing but also in thinking about biological processes and linguistics. Shannon was also a pioneer in developing methods for computers to play chess.
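
The central quantity of the 1948 paper is the entropy of a message source, H = -Σ p_i log2 p_i, measured in bits. A minimal sketch (in Python; the function name is ours, not Shannon's) shows how it is computed for a simple source:

 from math import log2
 
 def entropy(probabilities):
     # Shannon entropy H = -sum(p * log2(p)) in bits; zero-probability symbols contribute nothing.
     return -sum(p * log2(p) for p in probabilities if p > 0)
 
 print(entropy([0.5, 0.5]))   # a fair coin carries 1.0 bit per toss
 print(entropy([0.9, 0.1]))   # a heavily biased coin carries only about 0.47 bits per toss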

Pioneer of Cryptography

During World War II, Shannon performed classified research on cryptography for the U.S. government. His 1949 paper "Communication Theory of Secrecy Systems"[4] became the seminal work establishing cryptography as an academic discipline.

Shannon's publications

See our list of Shannon's publications.

Notes

1. "A Symbolic Analysis of Relay and Switching Circuits", MIT master's thesis, published in Transactions of the American Institute of Electrical Engineers, Vol. 57 (1938), pp. 713-723. http://www.research.att.com/~njas/doc/shannonbib.html
2. "Claude Shannon", from Professor Ray C. Dougherty's course notes (V61.0003), Communication: Men, Minds, and Machines, New York University, Fall 1996. http://www.nyu.edu/pages/linguistics/courses/v610003/shan.html
3. Claude Shannon, "A Mathematical Theory of Communication", Bell System Technical Journal, July and October 1948. http://plan9.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf
4. Claude Shannon, "Communication Theory of Secrecy Systems", 1949, pp. 656-715. http://netlab.cs.ucla.edu/wiki/files/shannon1949.pdf