# Fifty (+17) Years of Shannon Theory

Today I came across a very interesting article by Sergio Verdú, published in the IEEE Transactions on Information Theory in 1998 to celebrate the 50th anniversary of Claude E. Shannon’s Magna Carta of the information era. Verdú gives a brief chronicle of the historical milestones in building a theory of the fundamental limits of both data compression and data transmission. The tentacles of Shannon’s paper have reached, beyond communication engineering, many other fields such as mathematics, physics, cryptology, economics, biology and even linguistics.

Before 1948, the major communication systems of the day already provided some of the crucial ingredients that would enable information theory as we know it today: Morse code (1830) encodes information efficiently in the durations of the signals that make up each symbol, and spread-spectrum techniques (1940) showed that bandwidth is another key parameter in reliable communication. Attempts to quantify the amount of transmitted information, rate, or capacity were made by Nyquist (1924), Küpfmüller (1924), Hartley (1928), and Kotel’nikov (1933). The following letter was indeed already used to denote the amount of information associated with a number of states: H.
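As a gloss of my own (the notation below follows standard accounts of Hartley's 1928 paper, and is not a formula quoted from Verdú's article): for a message of $n$ symbols, each drawn from an alphabet of $s$ distinguishable states, Hartley measured the information as

```latex
H = n \log s
```

so information grows linearly with message length and logarithmically with alphabet size; the logarithm is precisely what makes information additive over independent symbol choices.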

What Nyquist and Hartley missed was a probabilistic or statistical treatment of the sources of information. In fact, probabilistic modeling of information sources has a long history rooted in cryptography: as early as 1380 and 1658, tables of frequencies of letters and pairs of letters had been compiled for the purpose of decrypting secret messages! Precisely at the conclusion of his WWII work on cryptography, Shannon prepared a *classified* report in which he included several of the key notions, including entropy and the very words "information theory". After graduating from MIT, a twenty-two-year-old Shannon came up with a ground-breaking abstraction of the communication process, which he could develop upon joining Bell Labs in 1941.
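Those letter-frequency tables connect directly to the entropy Shannon would later define: the average information per symbol of a source, computed from exactly such empirical frequencies. A minimal sketch in Python (the function name and sample strings are my own illustrative choices, not anything from Verdú's article):

```python
from collections import Counter
from math import log2

def entropy_bits(text: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    letter distribution of `text` (non-letters are ignored)."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A uniform four-letter alphabet carries log2(4) = 2 bits per symbol,
# while a single repeated letter carries none:
print(entropy_bits("abcd"))  # 2.0
print(entropy_bits("aaaa"))  # 0.0
```

The same calculation applied to a long stretch of English text yields roughly 4 bits per letter for single-letter frequencies, which is why those old cryptanalytic tables were so effective: English is far from a uniform source.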

By 1948, the need for a theory of communication describing the fundamental limits of now-familiar concepts such as bandwidth, signal-to-noise ratio, and data rate was evident and recognized by the scientific community. In only a few months, several theories were put on the table with the hope of success (by Clavier, Earp, Goldman, Laplume, Shannon, Tuller, and Wiener). **One of those theories would prove to be everlasting.**
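The result that ties bandwidth and signal-to-noise ratio together is Shannon's capacity formula for the band-limited Gaussian channel, C = W log2(1 + S/N). A quick illustrative computation (the telephone-channel numbers below are a common textbook example, not figures taken from Verdú's article):

```python
from math import log2

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of a band-limited AWGN channel, in bits/s.
    `snr_linear` is the signal-to-noise ratio as a plain ratio (not dB)."""
    return bandwidth_hz * log2(1 + snr_linear)

# A 3 kHz telephone channel at 30 dB SNR (linear SNR = 1000)
# supports on the order of 30 kbit/s:
print(capacity_bps(3000, 1000))
```

No scheme can beat this rate with arbitrarily low error probability, and, remarkably, Shannon showed that rates arbitrarily close to it are achievable; that is the sense in which his theory gives *fundamental limits* rather than rules of thumb.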

Read the full article here.