Tutorial on Quantum Information
»Slides of the first lecture
»Slides of the second lecture
Information processing, information transmission, and information security are everyday notions in modern society. But what exactly is information? This turns out to be quite a hard question; an analogous complication arises in the physical sciences when asking what exactly energy is. A naive approach is to define information as a message contained in a string of symbols, but naturally enough, a similar question about the meaning of "message" arises. In this presentation all societal and qualitative connotations of information are stripped away, and we restrict ourselves to the quantitative, mathematical aspects of information.
The 20th century witnessed the birth of quantum mechanics, a novel theory establishing a unified way to treat two apparently distinct aspects of microsystems: the wave-like (undulatory) and the particle-like (corpuscular). Quantum mechanics brought not only unification but also severe philosophical problems concerning the nature of reality.
Bell Inequalities
Even without a definition of information it is possible to give a relative description of quantum information as information represented in quantum systems: physical systems so small that they are governed by quantum mechanics rather than classical mechanics. This description is already sufficient for learning what might become different when entering the realm of quantum information: a brief but very famous observation [9] published in 1982 shows that, in general, quantum information cannot be cloned – a task taken for granted for classical information.
Much stranger behaviour was discovered by Albert Einstein et al. in an attempt to establish the incompleteness of quantum mechanics as a physical theory [4]. The so-called EPR paradox raised a long-lasting debate between Niels Bohr and Albert Einstein, but it was resolved in a satisfactory manner only decades later [1]. Quantum systems can be shown to violate Bell inequalities, which demonstrates that quantum systems carrying information differ from classical ones in a very fundamental sense.
Von Neumann and Shannon Entropy
The problem of defining information is essentially equivalent to that of defining entropy. Intuitively, entropy is a measure of disorder: it is small if the system under investigation is (in some sense) ordered, and large if the system is disordered. A notable early approach to quantifying entropy was taken by Ludwig Boltzmann [3] in 1872.
John von Neumann was an ingenious mathematician whose contribution to the mathematical structures of quantum mechanics is beyond comparison. He devised a thought experiment (Gedankenexperiment) [7] to find the formula
S(ρ) = – Tr ρ log ρ,   (1)
for the entropy of a quantum state described by a density matrix ρ. The von Neumann entropy formula is perfectly consistent with the notion presented later by Claude Shannon [8]: the von Neumann entropy of a mixed quantum state coincides with the Shannon entropy
H(X) = – (p₁ log p₁ + … + pₙ log pₙ)   (2)
of a random variable X corresponding to the quantum state in a very natural way. Equation (2) is worth a closer look, as it clearly measures the uncertainty of the random variable X: the minimum entropy 0 is obtained with a deterministic distribution, pᵢ = 1 for some i, and the maximum entropy log n with the uniform distribution. The unit of entropy depends on the choice of the logarithm base: choosing base 2 corresponds to measuring entropy in bits.
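As a quick sanity check of these extremes, here is a minimal sketch of equation (2) in Python (the function name is illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum p_i log2 p_i, with 0 log 0 = 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A deterministic outcome carries no uncertainty:
print(shannon_entropy([1.0, 0.0, 0.0]))   # 0.0
# The uniform distribution on four outcomes maximises the entropy:
print(shannon_entropy([0.25] * 4))        # 2.0 = log2(4)
```

With base-2 logarithms a fair coin carries exactly 1 bit of entropy, matching the intuition behind the unit.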
The conditional entropy of X, given that another random variable Y takes the value yⱼ, is defined via conditional probabilities:

H(X | Y = yⱼ) = – Σᵢ P(X = xᵢ | Y = yⱼ) log P(X = xᵢ | Y = yⱼ),

and the conditional entropy of X when Y is known is defined as the expected value

H(X | Y) = Σⱼ P(Y = yⱼ) H(X | Y = yⱼ).

Conditional entropy thus measures the remaining uncertainty of X when another random variable Y is known. Now the information about X gained by knowing Y is defined as

I(X : Y) = H(X) – H(X | Y),

which makes perfect sense: the information that Y provides shows up as a difference of uncertainties.
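These definitions can be computed directly from a joint distribution; the sketch below (with an arbitrary toy distribution, chosen only for illustration) evaluates H(X | Y) and I(X : Y):

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Toy joint distribution P(X = x, Y = y), keyed by (x, y).
joint = {(0, 0): 0.5, (1, 0): 0.25, (1, 1): 0.25}

# Marginal distributions of X and Y.
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# H(X | Y) = sum_j P(Y = y_j) * H(X | Y = y_j)
H_X_given_Y = sum(
    py * H([joint.get((x, y), 0.0) / py for x in p_x])
    for y, py in p_y.items() if py > 0
)

# I(X : Y) = H(X) - H(X | Y)
I_XY = H(p_x.values()) - H_X_given_Y
print(round(I_XY, 4))   # 0.3113
```

Here H(X) = 1 bit, yet knowing Y removes only part of the uncertainty, so the mutual information stays strictly below 1 bit.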
An important part of quantum information theory concentrates on cases where X and Y are obtained from a quantum system via measurement, and a famous result governing the information gained in this way, the so-called Holevo bound [5], must be mentioned:

I(X : Y) ≤ S(ρ) – (p₁ S(ρ₁) + … + pₙ S(ρₙ)),

where X is a random variable with potential values {1, …, n}, the system is prepared in state ρᵢ with probability pᵢ (so that ρ = p₁ρ₁ + … + pₙρₙ), and Y is the outcome of measuring an observable.
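To make the bound concrete, one can evaluate the right-hand side (the Holevo quantity) for a small hypothetical ensemble. The sketch below, assuming NumPy is available, uses two non-orthogonal pure qubit states, for which the bound falls strictly below H(X) = 1 bit:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]              # convention: 0 log 0 = 0
    return float(np.sum(-evals * np.log2(evals)))

# Hypothetical ensemble: |0> and |+> prepared with probability 1/2 each.
p = [0.5, 0.5]
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])    # |0><0|
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)
rho1 = np.outer(plus, plus)                  # |+><+|
rho = p[0] * rho0 + p[1] * rho1              # average state

# Holevo quantity: S(rho) - sum_i p_i S(rho_i); pure states have S = 0.
chi = von_neumann_entropy(rho) - sum(
    pi * von_neumann_entropy(ri) for pi, ri in zip(p, [rho0, rho1]))
print(round(chi, 4))   # about 0.6009, strictly below H(X) = 1 bit
```

No measurement of Y can therefore extract the full bit encoded in the choice between these two non-orthogonal states, in line with the no-cloning discussion above.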
Quantum Cryptography
In 1984, Charles Bennett and Gilles Brassard presented a quantum-information-based scheme, BB84, for secure communication [2]. The BB84 protocol is actually a key exchange protocol; once a secret key has been agreed upon, secure communication can be established with one-time pad encryption. However, establishing the absolute security of the BB84 protocol turned out to be quite a difficult task, and Mayers [6] was apparently the first to provide an unconditional security proof.
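The one-time pad step itself is entirely classical. A minimal sketch (here the key is generated locally for illustration, whereas BB84 would establish it over a quantum channel):

```python
import secrets

def one_time_pad(data: bytes, key: bytes) -> bytes:
    """XOR every message byte with the corresponding key byte.
    The same operation both encrypts and decrypts."""
    assert len(key) >= len(data), "the pad must be at least as long as the message"
    return bytes(b ^ k for b, k in zip(data, key))

message = b"quantum"
# In BB84 this shared secret key would be the output of the key exchange.
key = secrets.token_bytes(len(message))
ciphertext = one_time_pad(message, key)
assert one_time_pad(ciphertext, key) == message   # XOR is an involution
```

Because each key byte is used only once and is uniformly random, the ciphertext reveals nothing about the message; the entire security burden rests on the key exchange, which is exactly what BB84 addresses.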
[1] John S. Bell: On the Einstein-Podolsky-Rosen paradox. Physics 1, 195–200 (1964).
[2] Charles H. Bennett and Gilles Brassard: Quantum cryptography: public key distribution and coin tossing, Proceedings of IEEE conference on Computers, Systems, and Signal processing. Bangalore (India), 175–179 (1984).
[3] Ludwig Boltzmann: Weitere Studien über das Wärmegleichgewicht unter Gasmolekülen. Wiener Berichte 66, 275–370 (1872).
[4] Albert Einstein, Boris Podolsky, and Nathan Rosen: Can quantum-mechanical description of physical reality be considered complete? Physical Review 47, 777–780 (1935).
[5] Alexander S. Holevo: Statistical Problems in Quantum Physics, Proceedings of the Second Japan-USSR Symposium on Probability Theory, Eds.: G. Maruyama and J.V. Prokhorov, Springer, 104–109 (1973).
[6] Dominic Mayers: Unconditional security in quantum cryptography. Journal of the ACM 48:3, 351–406 (2001).
[7] John von Neumann: Thermodynamik quantenmechanischer Gesamtheiten. Nachrichten von der Gesellschaft der Wissenschaften zu Göttingen 1, 273–291 (1927).
[8] Claude E. Shannon: A Mathematical Theory of Communication. Bell System Technical Journal 27, 379–423 and 623–656 (1948).
[9] William K. Wootters and Wojciech H. Zurek: A single quantum cannot be cloned. Nature 299, 802–803 (1982).