# Monthly Archives: May 2015

I write this post along with the sad news of the car accident that ended the life of one of the most brilliant mathematicians of the 20th century, John Forbes Nash. While battling his mental illness, he managed to solve some of the most important theoretical problems, for which he received both the Nobel Memorial Prize in Economic Sciences in 1994 and the Abel Prize in mathematics in 2015.

In my opinion, what made Nash a beautiful mind was his unique way of approaching problems. Probably because he worked on his own and had a wide-ranging view of mathematical research (in fact, unlike many mathematicians, he was not a specialist), he could tackle some of the most famous open problems in mathematics, such as an elliptic partial differential equations problem suggested by Louis Nirenberg. Nirenberg himself gives the following answer to the question of whether there is a mathematician he would consider a genius: “I can think of one, and that’s John Nash (…) He had a remarkable mind. He thought about things differently from other people”.

During this semester, I have been attending an Information Theory course at the Barcelona Graduate School of Mathematics, and the last part of the course has been devoted to Quantum Information Theory, at the hands of ICREA Professor Andreas Winter. The science of the very small, where interactions between matter and energy involve a discrete treatment of quantities, has fostered the development of a mathematical approach to the information held in the states of a quantum system. Due to the Heisenberg uncertainty principle, it is impossible to fully measure the state of a quantum system, and hence to express it in terms of classical information, i.e., bits.

One of the fundamental tools in quantum information theory is Verschränkung, or entanglement, a physical phenomenon implying that the quantum state of a pair of entangled or correlated particles cannot be described separately. Entanglement was the topic of a 1935 paper by Albert Einstein, Boris Podolsky, and Nathan Rosen, whose argument came to be known as the EPR paradox: the physical reality described by quantum mechanics is incomplete. An example is the following quantum system, in bra-ket notation:

$|\Phi^+\rangle=\frac{1}{\sqrt{2}}\left(|00\rangle+|11\rangle\right)$.
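The correlations of this state can be illustrated with a tiny simulation (a sketch of my own, not from the course): representing $|\Phi^+\rangle$ as its four amplitudes and sampling measurement outcomes in the computational basis via the Born rule.

```python
import numpy as np

# Amplitudes of |Phi+> = (|00> + |11>)/sqrt(2), in the order |00>,|01>,|10>,|11>
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Born rule: the probability of each joint outcome is the squared amplitude
probs = np.abs(phi_plus) ** 2

rng = np.random.default_rng(0)
outcomes = rng.choice(["00", "01", "10", "11"], size=10_000, p=probs)

# The two qubits are perfectly correlated: only "00" and "11" ever occur,
# each about half of the time, yet neither qubit has a state of its own.
assert set(outcomes) <= {"00", "11"}
```

Each qubit on its own looks like a fair coin, but the outcomes always agree: this is exactly the correlation that cannot be described by treating the particles separately.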

The term entanglement was in fact coined by Erwin Schrödinger in the same year, when he thought about the problem of interpreting quantum superposition as applied to a cat’s life. The cat, viewed as a quantum system, can be simultaneously in two states: alive and dead. The experiment goes as follows: a cat, a flask of poison, and a radioactive source are placed in a sealed box. If an internal monitor detects radioactivity (i.e., a single atom decaying), the flask is shattered, releasing the poison that kills the cat. The Copenhagen interpretation of quantum mechanics implies that after a while, the cat is simultaneously alive and dead. Yet, when one looks in the box, one sees the cat either alive or dead, not both alive and dead. This poses the question of when exactly quantum superposition ends and reality collapses into one possibility or the other.

Today I came across a very interesting article by Sergio Verdú, published in the IEEE Transactions on Information Theory in 1998 to celebrate the 50th anniversary of Claude E. Shannon’s Magna Carta of the information era. Verdú gives a brief chronicle of the historical milestones in building a theory of the fundamental limits behind both data compression and data transmission. The tentacles of Shannon’s paper have reached, besides communication engineering, many other fields such as mathematics, physics, cryptology, economics, biology and even linguistics.

Before 1948, the major communication systems of the time already provided some of the crucial ingredients that would enable information theory as we know it today: Morse code (1830) is indeed an efficient way to encode information in the duration of the signals contained in each symbol, and spread spectrum techniques (1940) showed that bandwidth is another important parameter in reliable communication. Attempts to quantify the amount of transmitted information, rate, or capacity were made by Nyquist (1924), Küpfmüller (1924), Hartley (1928), and Kotel’nikov (1933). The following letter was indeed used to denote the amount of information associated with a number of states:

$H=\log(K)$
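The measure is simple enough to compute directly (a small illustration of my own; taking the logarithm in base 2, so the answer comes out in bits):

```python
from math import log2

def hartley(K: int) -> float:
    """Hartley's measure: a system with K equally likely states
    carries H = log2(K) bits of information."""
    return log2(K)

# One of 8 equiprobable symbols carries 3 bits; a fair coin carries 1 bit.
print(hartley(8))  # -> 3.0
print(hartley(2))  # -> 1.0
```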

What Nyquist and Hartley missed was a probabilistic or statistical treatment of the sources of information. In fact, probabilistic modeling of information sources had a very long history in cryptography. As early as 1380, and again in 1658, tables of frequencies of letters and pairs of letters had been compiled for the purpose of decrypting secret messages! Precisely at the conclusion of his WWII work on cryptography, Shannon prepared a classified report in which he included several of these notions, including entropy and the words information theory. After graduating from MIT, a twenty-two-year-old Shannon came up with a ground-breaking abstraction of the communication process, which he could develop upon joining Bell Labs in 1941.
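That probabilistic view can be sketched in a few lines (my own illustration, not from Verdú's article): estimating the entropy of a text from its empirical letter frequencies, which is precisely the kind of statistics those old cryptographic tables recorded.

```python
from collections import Counter
from math import log2

def empirical_entropy(text: str) -> float:
    """Shannon entropy, in bits per symbol, of the empirical
    letter distribution of `text`."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform 4-letter alphabet carries exactly 2 bits per letter,
# recovering Hartley's log(K); skewed frequencies carry less.
print(empirical_entropy("abcd"))  # -> 2.0
```

When all K symbols are equiprobable, this reduces to Hartley's $\log(K)$; real languages, with their uneven letter frequencies, have strictly lower entropy, which is what makes both code-breaking and compression possible.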

By 1948, the need for a theory of communication describing the fundamental limits of concepts that are trivial nowadays, such as bandwidth, signal-to-noise ratio, or data rate, was evident and recognized by the scientific community. In only a few months, several theories were put on the table with the hope of success, by Clavier, Earp, Goldman, Laplume, Shannon, Tuller and Wiener. One of those theories would prove to be everlasting.