T1: 4. Information Theory Flashcards

1
Q

Define Shannon entropy (words)

A

A measure of how uncertain we are about the value of X; equivalently, how much information we gain, on average, when we learn the value of X.

2
Q

Define Shannon entropy (formula)

A

H(X) = -SUM_x[p(x) log p(x)]

where x takes values in a finite set
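
A minimal numerical sketch of this formula (the coin distributions below are arbitrary illustrative choices; logs are base 2):

```python
import math

def shannon_entropy(p):
    # 0*log(0) is taken to be 0, so zero-probability outcomes are skipped
    return -sum(px * math.log2(px) for px in p if px > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
```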

3
Q

What is 0log(0) in this case?

A

0
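
This is the limiting value: p log p -> 0 as p -> 0+, so outcomes with p(x) = 0 contribute nothing to H(X).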

4
Q

Define Shannon’s Noiseless Coding Theorem

A

The Shannon entropy H(X) is a lower bound on the average number of bits per letter needed to encode a message, and this rate is asymptotically achievable.

5
Q

What probability p(x) corresponds to specifying an outcome with N yes/no questions?

A

p(x) = (1/2)^N, so that -log2(p(x)) = N bits.

6
Q

What distribution of probabilities maximises H(X)?

A

p(x_i) = 1/n for each of the n possible outcomes.

Equal probability for each outcome.
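
Worked check: with n equally likely outcomes, H(X) = -SUM_{i=1..n} (1/n) log(1/n) = log n, which is the largest value H(X) can take over n outcomes.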

7
Q

Give the formula for joint entropy

A

H(X,Y) = -SUM_x,y[p(x,y) log p(x,y)]

where x and y each take values in finite sets

8
Q

How does joint entropy relate to individual entropies of two events X and Y?

Under what condition does equality hold?

A

H(X,Y) <= H(X) + H(Y)

Equality holds if and only if X and Y are independent, i.e. p(x,y) = p(x)p(y).
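
A minimal numerical check of the inequality (the joint distribution is an arbitrary illustrative choice; logs are base 2):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability array; 0*log(0) treated as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Correlated joint distribution p(x, y) (rows: x, columns: y) -- example values only.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

print(H(pxy), "<=", H(px) + H(py))                      # 1.72... <= 2.0
print(np.isclose(H(np.outer(px, py)), H(px) + H(py)))   # equality for independent X, Y
```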

9
Q

Where is relative entropy useful?

A

When we have two random variables that take the same set of values but with different probability distributions.

10
Q

State the relative entropy formula H(p(x)||q(x))

A
H(p(x)||q(x)) = SUM_x p(x) log[p(x)/q(x)] = -H(X) - SUM_x p(x) log q(x)
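
A minimal sketch of this formula in Python (base-2 logs and the example distributions are illustrative choices; it assumes q(x) > 0 wherever p(x) > 0):

```python
import numpy as np

def relative_entropy(p, q):
    """H(p||q) = SUM_x p(x) log2(p(x)/q(x)); terms with p(x) = 0 contribute 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(relative_entropy(p, q))   # positive, since p != q
print(relative_entropy(p, p))   # 0.0 when the two distributions coincide
```
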
11
Q

Under what condition is the relative entropy equal to 0?

A

p(x) = q(x)

12
Q

Define the von Neumann entropy

A

S(ρ) = -Tr(ρ log ρ) = -SUM_i e_i log e_i

where e_i are the eigenvalues of ρ
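
A minimal sketch of this definition (numpy, base-2 logs; the example density matrices are illustrative choices):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    e = np.linalg.eigvalsh(rho)     # rho is Hermitian
    e = e[e > 1e-12]                # 0*log(0) treated as 0
    return float(-np.sum(e * np.log2(e)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = 1
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```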

13
Q

Define the relative von Neumann entropy S(ρ_1||ρ_2)

A

S(ρ_1||ρ_2) = Tr(ρ_1 log ρ_1) - Tr(ρ_1 log ρ_2) >= 0

14
Q

Define conditional uncertainty H(X|Y)

A

This is the remaining uncertainty in X given that we know the value of Y.

15
Q

State the conditional uncertainty formula H(X|Y)

A

H(X|Y) = H(X,Y) - H(Y)

16
Q

Define mutual information of X and Y: H(X:Y)=H(Y:X)

A

H(X:Y) = H(X) + H(Y) - H(X,Y)
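
A minimal numerical sketch of this identity, together with the conditional entropy H(X|Y) = H(X,Y) - H(Y) from the previous card (the joint distribution is an arbitrary illustrative choice):

```python
import numpy as np

def H(p):
    """Shannon entropy in bits; 0*log(0) treated as 0."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Example joint distribution p(x, y); values chosen only for illustration.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
px, py = pxy.sum(axis=1), pxy.sum(axis=0)

H_cond = H(pxy) - H(py)            # H(X|Y) = H(X,Y) - H(Y)
I_xy   = H(px) + H(py) - H(pxy)    # H(X:Y) = H(X) + H(Y) - H(X,Y)
print(H_cond, I_xy)                # about 0.72 and 0.28 bits
```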

17
Q

Given a bipartite system with density matrices ρ_a, ρ_b and ρ for the combined system, state each VN entropy

A

S(A) = S(ρ_a)
S(B) = S(ρ_b)
S(A,B) = S(ρ)

18
Q

Why can conditional VN entropy be negative?

A

If ρ is a pure state, S(A,B) = 0, but this does not imply S(B) = 0, so S(A|B) = S(A,B) - S(B) can be negative. If Bob measures his part of the pure state, then unless he classically communicates the result to Alice, she is left holding a mixed state.
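
A minimal sketch of this for the Bell state (|00> + |11>)/sqrt(2): the joint state is pure, so S(A,B) = 0, while the reduced state on B is maximally mixed, so S(B) = 1 and S(A|B) = S(A,B) - S(B) = -1.

```python
import numpy as np

def S(rho):
    """von Neumann entropy in bits, from the eigenvalues of rho."""
    e = np.linalg.eigvalsh(rho)
    e = e[e > 1e-12]
    return float(-np.sum(e * np.log2(e)))

# Bell state |phi+> = (|00> + |11>)/sqrt(2) on qubits A (first) and B (second).
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho_ab = np.outer(phi, phi)                                       # pure joint state
rho_b = np.trace(rho_ab.reshape(2, 2, 2, 2), axis1=0, axis2=2)    # partial trace over A

print(S(rho_ab))             # S(A,B) = 0
print(S(rho_b))              # S(B)   = 1
print(S(rho_ab) - S(rho_b))  # S(A|B) = -1
```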