Information Theory 2 Flashcards
1
Q
What is the relationship between messages X and Y when noise equals zero?
A
if noise = 0, then X = Y: the received message is identical to the transmitted one
2
Q
What is the equation for mutual information when there is noise between X and Y?
A
mutual information of X and Y = entropy of Y - noise entropy
I(X;Y) = H(Y) - H(Y|X)
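A minimal numeric sketch of this formula (Python; the joint pmf, the entropy helper, and the function names are illustrative assumptions, not from the source):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (0 log 0 = 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(p_xy):
    """I(X;Y) = H(Y) - H(Y|X) for a joint pmf given as a 2-D array."""
    p_x = p_xy.sum(axis=1)   # marginal over Y -> p(x)
    p_y = p_xy.sum(axis=0)   # marginal over X -> p(y)
    # Noise entropy H(Y|X) = sum_x p(x) * H(Y | X = x)
    h_y_given_x = sum(
        p_x[i] * entropy(p_xy[i] / p_x[i])
        for i in range(len(p_x)) if p_x[i] > 0
    )
    return entropy(p_y) - h_y_given_x

# Noiseless case (card 1): Y = X, so the noise entropy is 0 and
# I(X;Y) = H(Y) = H(X) = 1 bit for a fair input bit.
p_noiseless = np.array([[0.5, 0.0],
                        [0.0, 0.5]])
print(mutual_information(p_noiseless))   # ~1.0
```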
3
Q
How do you calculate the noise entropy of X->Y?
A
noise entropy = H(Y|X)
the conditional entropy of Y given X, i.e. the average uncertainty remaining about the received message Y once the transmitted message X is known
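As a worked check (a sketch assuming a binary symmetric channel with crossover probability p = 0.1 and a uniform input, both illustrative choices), H(Y|X) reduces to the binary entropy of p:

```python
import numpy as np

def entropy(v):
    """Shannon entropy in bits of a probability vector (0 log 0 = 0)."""
    v = v[v > 0]
    return -np.sum(v * np.log2(v))

p = 0.1                      # assumed crossover probability (illustrative)
p_x = np.array([0.5, 0.5])   # assumed uniform input distribution

# Rows of the channel matrix are p(y | x) for x = 0, 1.
p_y_given_x = np.array([[1 - p, p],
                        [p, 1 - p]])

# Noise entropy H(Y|X) = sum_x p(x) * H(Y | X = x).
h_noise = sum(p_x[i] * entropy(p_y_given_x[i]) for i in range(2))
print(h_noise)   # ~0.469 bits, the binary entropy of 0.1
```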