Claude Shannon
Claude Shannon’s 1948 paper A Mathematical Theory of Communication founded the field of information theory by defining information as the reduction of uncertainty and giving it a quantitative measure: entropy, in bits.
¶Core ideas
- Channel capacity: the maximum rate at which information can be transmitted reliably over a noisy channel; rates below capacity are achievable with arbitrarily low error
- Entropy as information: bits as a measure of surprise/uncertainty in a message source
- Source-channel separation: compress the source first, then add error correction for the channel; the two stages can be designed independently without loss of optimality
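The first two ideas can be sketched in a few lines of Python. The function names `entropy_bits` and `bsc_capacity` are illustrative, not from the paper; the second computes the well-known capacity of a binary symmetric channel, C = 1 − H(p), which ties entropy and capacity together.

```python
import math

def entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.
    Zero-probability outcomes contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel that flips
    each bit with probability p: C = 1 - H(p) bits per use."""
    return 1 - entropy_bits([p, 1 - p])

# A fair coin is maximally uncertain: 1 bit per flip.
print(entropy_bits([0.5, 0.5]))   # → 1.0
# A biased coin is more predictable, so it carries less information.
print(entropy_bits([0.9, 0.1]))   # < 1 bit
# A noiseless channel (p = 0) carries a full bit per use;
# a channel that flips half its bits carries nothing.
print(bsc_capacity(0.0))          # → 1.0
print(bsc_capacity(0.5))          # → 0.0
```

The guard `if p > 0` reflects the convention 0 · log 0 = 0, so degenerate distributions are handled without a domain error.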
¶Key works
- A Mathematical Theory of Communication (1948)
- The Mathematical Theory of Communication (with Weaver, 1949)