Posted on 26-Mar-2018
Information Theory of Quantum Entanglement
Shengqiao Luo
UC Davis Physics Department
sqluo@ucdavis.edu

Abstract
Applying Shannon theory to quantum systems shows how Shannon information differs between classical and quantum measurements. This helps in understanding how to extend information theory to quantum entanglement, and in searching for unified information laws covering both the classical and quantum cases. We also work through the GHZ and W classes of three-qubit states with in-depth statistics and analysis.
Introduction:
1 Motivation for the topic
2 Why it's interesting
3 Synopsis of project and results

Motivation: I am very interested in quantum entanglement. I want to learn more about quantum measures by comparing classical and quantum information, looking at the information diagram both classically and quantum mechanically.

Synopsis: By comparing different cases of the information diagram applied quantum mechanically, we see how Shannon information is inadequate as a quantum measure: the definitions of mutual information and the other quantities basically break down from the classical perspective. Next, I try to understand the Shannon information quantities quantum mechanically. My guess is that these information-theoretic quantities tell us something about quantum entanglement.

Background: Introduction to the information diagram. In information theory, an information diagram is a type of Venn diagram. It shows the relationships among Shannon's measures of information: entropy, joint entropy, conditional entropy, and mutual information.
Information Measures
One random variable: X ∼ Pr(x)
Entropy: H[X]
Two random variables: X ∼ Pr(x), Y ∼ Pr(y), (X, Y) ∼ Pr(x, y)
Information theory quantities:
Joint entropy: H[X, Y]
Conditional entropies: H[X|Y], H[Y|X]
Mutual information: I[X;Y]
Three random variables: X ∼ Pr(x), Y ∼ Pr(y), Z ∼ Pr(z), (X, Y, Z) ∼ Pr(x, y, z)
Information theory quantities
Entropy: H[X], H[Y], H[Z]
Joint entropy: H[X, Y, Z]
Conditional entropies: H[X|Y, Z], H[Y|X, Z], H[Z|X, Y], H[X, Y|Z], H[X, Z|Y], H[Y, Z|X]
Conditional mutual information: I[X;Y|Z]
(Information diagram figure [2], showing regions such as H[X].)
Mutual information: I[X;Y;Z]

Dynamical system:
1 Describe the particular system you've selected.
2 Give the equations of motion.
3 Describe how the terms model various aspects of the system.

The system here is a quantum mechanical system (a Hilbert space). We introduce the quantum mechanical system and its equations of motion, and discuss quantum entanglement (the inner connections between the subsystems of a quantum mechanical system).
Methods: Compare the information diagram in quantum mechanics with the classical information diagram by applying Shannon theory to calculate the quantities in both cases, for several example states.
Classical information diagram with two variables
Example: P(11) = P(00) = 1/2
Assume there are two subsystems, each of which can be either 0 or 1. For this specific example we constrain both subsystems to be in the same state (the two variables take the same value), so there are only two options, and P(11) = P(00) = 1/2.
Calculations (two variables, classical):
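These classical quantities can be checked numerically. A minimal Python sketch (the helper names `entropy` and `marginal` are mine, not from the original):

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axis):
    """Marginalize a joint pmf {(x, y): p} onto one coordinate."""
    out = {}
    for outcome, p in joint.items():
        out[outcome[axis]] = out.get(outcome[axis], 0.0) + p
    return out

# Two perfectly correlated bits: P(00) = P(11) = 1/2
joint = {(0, 0): 0.5, (1, 1): 0.5}

H_XY = entropy(joint)                 # joint entropy H[X, Y]
H_X = entropy(marginal(joint, 0))     # H[X]
H_Y = entropy(marginal(joint, 1))     # H[Y]
H_X_given_Y = H_XY - H_Y              # conditional entropy H[X|Y]
I_XY = H_X + H_Y - H_XY               # mutual information I[X;Y]
print(H_X, H_Y, H_XY, H_X_given_Y, I_XY)  # 1.0 1.0 1.0 0.0 1.0
```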
Information theory quantities:
Entropy: H[X] = H[Y] = 1
Joint entropy: H[X, Y] = 1
Conditional entropies: H[X|Y] = 0, H[Y|X] = 0
Mutual information: I[X;Y] = 1

Quantum information diagram with two variables
Example:
Assume there are two subsystems, each of which can be either 0 or 1. For this specific example there are only two states available to the whole system, |00> and |11>, and the state of the whole system is the Bell state (|00> + |11>)/√2.
Calculations:
If ρ is written in terms of its eigenvectors |1〉, |2〉, |3〉, ... as
ρ = Σ_j η_j |j〉〈j| ,
the von Neumann entropy is
S(ρ) = −Σ_j η_j log2 η_j .
We choose log base 2 here, so entropy is measured in bits.
Information theory quantities (calculated by Shannon theory):
Entropy: H[X] = H[Y] = 1
Joint entropy: H[X, Y] = 0
Conditional entropies: H[X|Y] = -1, H[Y|X] = -1
Mutual information: I[X;Y] = 2
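These quantum values can be reproduced by computing von Neumann entropies of the reduced density matrices. A sketch with NumPy (the `einsum` partial-trace bookkeeping is my own, not from the original):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits: S(rho) = -Tr(rho log2 rho)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 log 0 = 0 by convention
    return float(-np.sum(evals * np.log2(evals)))

# Bell state (|00> + |11>)/sqrt(2)
psi = np.zeros(4)
psi[0] = psi[3] = 1 / np.sqrt(2)
rho = np.outer(psi, psi.conj())           # pure-state density matrix

# Reshape to indices (x, y, x', y') and trace out one subsystem
rho4 = rho.reshape(2, 2, 2, 2)
rho_X = np.einsum('xyzy->xz', rho4)       # trace out Y
rho_Y = np.einsum('xyxw->yw', rho4)       # trace out X

H_X, H_Y, H_XY = vn_entropy(rho_X), vn_entropy(rho_Y), vn_entropy(rho)
H_X_given_Y = H_XY - H_Y                  # -1: a negative conditional entropy
I_XY = H_X + H_Y - H_XY                   # 2
print(H_X, H_Y, H_XY, H_X_given_Y, I_XY)
```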
Another example with three variables:
Classical information diagram with three variables
Example: P(111) = P(000) = 1/2
Assume there are three subsystems, each of which can be either 0 or 1. For this specific example all three subsystems must take the same value, so there are only two options, and P(111) = P(000) = 1/2.
Calculation:
Information theory quantities
Entropy: H[X] = H[Y] = H[Z] = 1
Joint entropy: H[X, Y, Z] = 1
Conditional entropies: H[X|Y, Z] = H[Y|X, Z] = H[Z|X, Y] = H[X, Y|Z] = H[X, Z|Y] = H[Y, Z|X] = 0
Conditional mutual information: I[X;Y|Z] = 0 (given Z, both X and Y are determined)
Mutual information: I[X;Y;Z] = 1
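The classical three-variable calculation can be sketched the same way. Note that conditioning on Z fixes both X and Y, so the conditional mutual information vanishes (the helper names below are mine, not from the original):

```python
from math import log2

def entropy(pmf):
    """Shannon entropy in bits of {outcome: probability}."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

def marginal(joint, axes):
    """Marginalize a joint pmf {(x, y, z): p} onto the given coordinates."""
    out = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        out[key] = out.get(key, 0.0) + p
    return out

# Three perfectly correlated bits: P(000) = P(111) = 1/2
joint = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}
H = {axes: entropy(marginal(joint, axes))
     for axes in [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]}

# I[X;Y|Z] = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
I_XY_given_Z = H[(0, 2)] + H[(1, 2)] - H[(2,)] - H[(0, 1, 2)]
# I[X;Y;Z] = I[X;Y] - I[X;Y|Z]
I_XY = H[(0,)] + H[(1,)] - H[(0, 1)]
I_XYZ = I_XY - I_XY_given_Z
print(I_XY_given_Z, I_XYZ)  # 0.0 1.0
```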
Quantum Information diagram with three variables
Assume there are three subsystems, each of which can be either 0 or 1. For this specific example there are only two states available to the whole system, |000> and |111>, and the state of the whole system is
|GHZ> = (|000> + |111>)/√2
Calculation:
Information theory quantities (calculated by Shannon theory)
Entropy: H[X] = H[Y] = H[Z] = 1
Joint entropy: H[X, Y, Z] = 0
Conditional entropies: H[X|Y, Z] = H[Y|X, Z] = H[Z|X, Y] = H[X, Y|Z] = H[X, Z|Y] = H[Y, Z|X] = -1
Conditional mutual information: I[X;Y|Z] = 1
Mutual information: I[X;Y;Z] = 0
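These GHZ values follow from replacing Shannon entropies by von Neumann entropies of the reduced states. A numerical sketch (the `reduced` partial-trace helper is my own, not from the original):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced(psi, keep, n=3):
    """Reduced density matrix of an n-qubit pure state, keeping qubits in `keep`."""
    psi = psi.reshape((2,) * n)
    trace_out = [i for i in range(n) if i not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(trace_out, trace_out))
    d = 2 ** len(keep)
    return rho.reshape(d, d)

# |GHZ> = (|000> + |111>)/sqrt(2)
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

H = {keep: vn_entropy(reduced(ghz, keep))
     for keep in [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2), (0, 1, 2)]}

H_X_given_YZ = H[(0, 1, 2)] - H[(1, 2)]                        # -1
I_XY_given_Z = H[(0, 2)] + H[(1, 2)] - H[(2,)] - H[(0, 1, 2)]  # 1
I_XYZ = (H[(0,)] + H[(1,)] + H[(2,)]
         - H[(0, 1)] - H[(0, 2)] - H[(1, 2)] + H[(0, 1, 2)])   # 0
print(H_X_given_YZ, I_XY_given_Z, I_XYZ)
```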
Interesting observation: If we cut out one subsystem (the green one in the diagram), we are left with a completely classical two-subsystem state. For the GHZ state, discarding any one subsystem leaves the remaining subsystems completely unentangled with each other, which means they are completely classical.
Results: Conclusions from comparing the two cases. For three variables, only the three-way mutual information can be negative in the classical case, but in the quantum case the conditional entropies can be negative as well. So, for instance, in the chain rule H[C] + H[A, B|C] = H[A, B, C], the quantum conditional term H[A, B|C] can be negative.
Classical: when we measure more subsystems, the joint uncertainty can only grow or stay the same (H[A, B] ≥ H[A]). Quantum: when we measure more subsystems, the joint uncertainty can grow or shrink; for a pure entangled state, H[A, B] = 0 < H[A].
Some more general statistics and analysis: statistics of the mutual information for GHZ class states.
- What are GHZ class states:
The W class and GHZ class are the two classes of three-qubit states that are not separable under any grouping of the subsystems, and states in one class cannot be converted into the other by LOCC (local operations and classical communication). The defining property of the GHZ class is that these states carry the strongest tripartite entanglement, but if one of the three subsystems is discarded, there is no entanglement left between the remaining two.
- What we care about for these states:
Conditional mutual information: I[X;Y|Z], I[Y;Z|X], I[X;Z|Y]
Mutual information of pairs: I[A;B] = H(A) + H(B) - H(A, B)
Mutual information of ABC: I[A;B;C] = H(A) + H(B) + H(C) - H(A, B) - H(A, C) - H(B, C) + H(A, B, C)
- Method
Look at GHZ class states with parameters a and b attached to each term in the state:
a|000> + b|111>
with a and b normalized so that a² + b² = 1. I look at how the pairwise mutual informations I[A;B], I[B;C], I[A;C] and the mutual information I[A;B;C] change with a.
- Result
Mutual information: since the ABC system as a whole is always in a pure state, the mutual information from the equation above is always zero, because H(A) = H(B, C), H(B) = H(A, C), H(C) = H(A, B), and H(A, B, C) = 0.
So I[A;B;C] = H(A) + H(B) + H(C) - H(A, B) - H(A, C) - H(B, C) + H(A, B, C) = 0.
Entropies H(A), H(B), H(C):
1) Loop over a from 0.0 to 1.0 and observe H(A) = H(B, C). Because the state a|000> + b|111> is symmetric among the three subsystems (discarding any one of the three leaves the same reduced state), we also have H(A) = H(B) = H(C) = H(A, B) = H(B, C) = H(A, C) for every value of a.
Conditional mutual information
1) Scatter plot over random values of a from -1.0 to 1.0 and observe I[A;B] = H(A) + H(B) - H(A, B). Because of the symmetry above, H(A) = H(B) = H(C) = H(A, B) = H(B, C) = H(A, C), so I[A;B] = H(A) + H(B) - H(A, B) = H(A).
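These relations can be checked numerically across the GHZ class a|000> + b|111>. A sketch (the `reduced` and `vn_entropy` helpers are my own, not from the original):

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced(psi, keep, n=3):
    """Partial trace of an n-qubit pure state down to the qubits in `keep`."""
    psi = psi.reshape((2,) * n)
    trace_out = [i for i in range(n) if i not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(trace_out, trace_out))
    return rho.reshape(2 ** len(keep), 2 ** len(keep))

for a in [0.3, 1 / np.sqrt(2), 0.95]:
    b = np.sqrt(1 - a ** 2)
    psi = np.zeros(8)
    psi[0], psi[7] = a, b                      # a|000> + b|111>
    H_A = vn_entropy(reduced(psi, (0,)))
    H_BC = vn_entropy(reduced(psi, (1, 2)))
    I_AB = (H_A + vn_entropy(reduced(psi, (1,)))
            - vn_entropy(reduced(psi, (0, 1))))
    # H(A) = H(B,C) = I[A;B] = h(a^2), maximized at a = b = 1/sqrt(2)
    print(f"a={a:.3f}  H(A)={H_A:.4f}  H(B,C)={H_BC:.4f}  I[A;B]={I_AB:.4f}")
```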
- Conclusion
For three-qubit GHZ class states, the mutual informations I[A;B] = I[B;C] = I[A;C] = H(A) = H(B) = H(C) = H(A, B) = H(B, C) = H(A, C) (equal by symmetry) are largest when the normalized parameters are equal, a = b = 1/√2. For a three-qubit system in a pure state, I[A;B;C] is always zero.
Statistics of the mutual information for W class states.
- What are W class states:
The W class and GHZ class are the two classes of three-qubit states that are not separable under any grouping of the subsystems, and states in one class cannot be converted into the other by LOCC (local operations and classical communication). The defining property of the W class is that these states retain the strongest two-subsystem entanglement when one of the three subsystems is discarded.
- What we care about for these states:
Conditional mutual information: I[X;Y|Z], I[Y;Z|X], I[X;Z|Y]
Mutual information of pairs: I[A;B] = H(A) + H(B) - H(A, B)
Mutual information of ABC: I[A;B;C] = H(A) + H(B) + H(C) - H(A, B) - H(A, C) - H(B, C) + H(A, B, C)
- Method
Look at W class states with parameters a, b, and c attached to each term in the state:
a|100> + b|010> + c|001>
with a, b, and c normalized so that a² + b² + c² = 1. I look at how the pairwise mutual informations I[A;B], I[B;C], I[A;C] and the mutual information I[A;B;C] change with a.
- Result
Mutual information: since the ABC system as a whole is always in a pure state, the mutual information from the equation above is always zero, because H(A) = H(B, C), H(B) = H(A, C), H(C) = H(A, B), and H(A, B, C) = 0. So I[A;B;C] = H(A) + H(B) + H(C) - H(A, B) - H(A, C) - H(B, C) + H(A, B, C) = 0.
Entropies H(A), H(B), H(C):
1) Scatter plot over random values of a from -1.0 to 1.0 and observe H(A) = H(B, C), H(B) = H(A, C), H(C) = H(A, B).
First part: statistics over a smaller set of states.
Conditional mutual information
1) Scatter plot over random values of a from -1.0 to 1.0 and observe
I[A;B] = H(A) + H(B) - H(A, B).
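The dependence of the pairwise mutual information on a can be sketched numerically for the W class state a|100> + b|010> + c|001>. For the sweep below I split the remaining weight equally between b and c, which is my own assumption, and the helper functions are mine, not from the original:

```python
import numpy as np

def vn_entropy(rho):
    """von Neumann entropy in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log2(evals)))

def reduced(psi, keep, n=3):
    """Partial trace of an n-qubit pure state down to the qubits in `keep`."""
    psi = psi.reshape((2,) * n)
    trace_out = [i for i in range(n) if i not in keep]
    rho = np.tensordot(psi, psi.conj(), axes=(trace_out, trace_out))
    return rho.reshape(2 ** len(keep), 2 ** len(keep))

# Sweep a; b = c = sqrt((1 - a^2)/2) is an assumption for this sketch
for a in [0.0, 0.3, 0.7, 0.95]:
    b = c = np.sqrt((1 - a ** 2) / 2)
    psi = np.zeros(8)
    psi[4], psi[2], psi[1] = a, b, c          # |100>, |010>, |001>
    I_BC = (vn_entropy(reduced(psi, (1,))) + vn_entropy(reduced(psi, (2,)))
            - vn_entropy(reduced(psi, (1, 2))))
    print(f"a={a:.2f}  I[B;C]={I_BC:.4f}")    # grows toward 2 as a -> 0
```

At a = 0 the state reduces to |0>_A ⊗ (|10> + |01>)/√2, so B and C hold a maximally entangled pair and I[B;C] = 2.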
- Conclusion
If we discard any one of the three subsystems, the resulting state of the remaining pair depends on which one was discarded. The parameter a is specifically associated with |100>. If a is very small, then b and c are large, so the state approaches the "Bell" pair (the maximally entangled two-qubit state) on subsystems B and C. Consequently I[B;C], which contains both classical and quantum mutual information, is larger: the two subsystems are more strongly correlated when a is smaller. The same conclusion holds when b or c is made small instead. This is consistent with the defining property of the W class: if one of the three subsystems is discarded, the remaining two are still strongly entangled with each other, i.e. the entanglement is robust. (Although mutual information is not exactly the same thing as quantum entanglement, it contains the quantum correlations, so the conclusion parallels the property of W class states.)