Quantum communication Flashcards
Derivative of log_a(x)
1/(x ln(a))
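As a check, this follows from the change-of-base rule:

```latex
\frac{d}{dx}\log_a x = \frac{d}{dx}\,\frac{\ln x}{\ln a} = \frac{1}{x\ln a}
```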
Shannon’s noiseless coding theorem
H(X) is the minimum average number of bits needed to store one of Alice's messages, i.e. a message x_i can be compressed to an average of H(X) bits.
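A minimal sketch of the bound: computing H(X) = -Σ p_i log₂(p_i) for an example four-symbol source (the distribution is illustrative, not from the cards).

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum p_i log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Four-symbol source with probabilities 1/2, 1/4, 1/8, 1/8:
H = shannon_entropy([0.5, 0.25, 0.125, 0.125])
print(H)  # 1.75 bits per symbol on average
```

So an optimal code compresses this source to 1.75 bits per symbol on average, rather than the 2 bits a fixed-length code would need.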
How are 0 and 1 distributed for optimally encoded messages?
They have equal frequency
Classical joint entropy =
The information we would get if we observed X and Y at the same time.
Classical conditional entropy =
The entropy of X conditional on knowing Y - tells us how uncertain Bob is about X after measuring Y.
Classical mutual information =
The amount of information obtained about X by observing Y and vice versa
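The three quantities above can be computed from a joint distribution p(x, y); the perfectly correlated example below is an assumption for illustration.

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y): two perfectly correlated bits.
joint = {('0', '0'): 0.5, ('1', '1'): 0.5}

# Marginal distributions p(x) and p(y)
p_x, p_y = {}, {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0) + p
    p_y[y] = p_y.get(y, 0) + p

H_xy = H(joint.values())            # joint entropy H(X,Y)
H_x, H_y = H(p_x.values()), H(p_y.values())
H_x_given_y = H_xy - H_y            # conditional entropy H(X|Y)
I_xy = H_x + H_y - H_xy             # mutual information I(X;Y)
print(H_xy, H_x_given_y, I_xy)      # 1.0 0.0 1.0
```

For perfectly correlated bits, H(X|Y) = 0 (Bob has no remaining uncertainty about X after measuring Y) and I(X;Y) = 1 bit.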
Channel capacity =
The amount of information that can be transmitted in one use of the channel.
Noiseless channel =
A channel where X is perfectly correlated with Y
Shannon’s noisy channel theorem?
Channel capacity is the max value of the mutual information, where the max is taken over all probability distributions of X.
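A standard worked example (an assumption here, not from the cards): for a binary symmetric channel with flip probability f, the maximum of I(X;Y) is achieved by a uniform input and gives C = 1 - H₂(f).

```python
import math

def h2(p):
    """Binary entropy H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(flip_prob):
    """Capacity of a binary symmetric channel: C = max I(X;Y) = 1 - H2(f)."""
    return 1.0 - h2(flip_prob)

print(bsc_capacity(0.0))  # 1.0 — noiseless channel, X perfectly correlated with Y
print(bsc_capacity(0.5))  # 0.0 — output independent of input, nothing transmitted
```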
Von Neumann entropy
S(ρ) = -Tr(ρ log_2(ρ))
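A sketch of the computation: diagonalise ρ and sum -λ log₂ λ over its eigenvalues, which equals -Tr(ρ log₂ ρ).

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # 0 log 0 = 0 by convention
    return float(-np.sum(eigvals * np.log2(eigvals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state |0><0|: S = 0
mixed = np.eye(2) / 2                      # maximally mixed qubit: S = 1
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))  # 0.0 1.0
```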
Schumacher’s quantum noiseless channel coding theorem =
A state ρ in a d-dimensional Hilbert space can be reliably compressed and decompressed to a quantum state in a Hilbert space of dimension 2^S(ρ), i.e. it can be represented by S(ρ) qubits.
How does a quantum channel transform p
Draw a diagram of how momentum entanglement is generated
A photon can only go down path a if the other goes down path b, and vice versa, due to phase-matching.
Draw a diagram of how polarisation entanglement is generated
How does quantum dense coding work?
- Alice and Bob share an entangled state eg a Bell state
- Alice encodes a message in her qubit by applying a local operation to it. This transforms the shared state to one of four different Bell states, so 2 bits of information are stored.
- Alice sends her qubit to Bob via the quantum channel.
- Bob measures both qubits in the Bell basis, recovering the 2 bits.
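The steps above can be sketched numerically with state vectors, assuming the standard Pauli encoding (I, X, Z, XZ on Alice's qubit):

```python
import numpy as np

# Pauli operators and the shared Bell state |Phi+> = (|00> + |11>)/sqrt(2)
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
phi_plus = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

# Alice's local operation encodes 2 classical bits
encodings = {(0, 0): I2, (0, 1): X, (1, 0): Z, (1, 1): X @ Z}

# The four Bell states Bob distinguishes in his measurement
bell_basis = {
    (0, 0): np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2),   # |Phi+>
    (0, 1): np.array([0, 1, 1, 0], dtype=complex) / np.sqrt(2),   # |Psi+>
    (1, 0): np.array([1, 0, 0, -1], dtype=complex) / np.sqrt(2),  # |Phi->
    (1, 1): np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2),  # |Psi->
}

def dense_code(bits):
    """Alice applies her Pauli to the first qubit; Bob Bell-measures both."""
    state = np.kron(encodings[bits], I2) @ phi_plus
    for b, vec in bell_basis.items():
        if abs(np.vdot(vec, state)) > 0.99:  # unit overlap up to global phase
            return b

for bits in encodings:
    assert dense_code(bits) == bits  # Bob recovers both classical bits
```

One quantum channel use transmits 2 classical bits, but only because the entangled pair was shared in advance.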