Information Theory 2 Flashcards

1
Q

What is the relationship between messages X and Y when noise equals zero?

A

If noise = 0, then X = Y: the received message Y is an exact copy of the transmitted message X.

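As a quick check (an added step, not part of the original card): when Y is a deterministic copy of X, the noise entropy H(Y|X) is zero, so the mutual-information formula from the next card reduces to the full source entropy.

```latex
% Zero-noise special case: Y = X deterministically, so H(Y|X) = 0 and
I(X;Y) = H(Y) - H(Y \mid X) = H(Y) - 0 = H(Y) = H(X)
```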
2
Q

What is the equation for mutual information when there is noise between X and Y?

A

Mutual information of X and Y = entropy of Y - noise entropy:

I(X;Y) = H(Y) - H(Y|X)

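To make the formula concrete, here is a minimal Python sketch (added for illustration, not part of the original deck) that evaluates I(X;Y) = H(Y) - H(Y|X) for a hypothetical binary symmetric channel; the names h2, mutual_information_bsc, p_x, and eps are all assumptions of this sketch.

```python
# A minimal sketch (not from the original deck): I(X;Y) = H(Y) - H(Y|X)
# for a hypothetical binary symmetric channel. p_x is the input bias and
# eps is the flip probability; both names are illustrative assumptions.
import math

def h2(p):
    """Binary entropy in bits: H(p) = -p log2 p - (1-p) log2 (1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(p_x, eps):
    """I(X;Y) for a binary symmetric channel.

    p_x: probability that the input bit X is 1.
    eps: probability that the channel flips the bit (the noise).
    """
    p_y1 = p_x * (1 - eps) + (1 - p_x) * eps  # P(Y=1) by total probability
    h_y = h2(p_y1)            # output entropy H(Y)
    h_y_given_x = h2(eps)     # noise entropy H(Y|X): h2(eps) for either input
    return h_y - h_y_given_x  # I(X;Y) = H(Y) - H(Y|X)

# Noiseless channel: eps = 0 gives H(Y|X) = 0, so I(X;Y) = H(Y) = H(X).
print(mutual_information_bsc(0.5, 0.0))   # 1.0 bit
print(mutual_information_bsc(0.5, 0.1))   # ~0.531 bits
```

With eps = 0 the noise entropy vanishes and the sketch reproduces card 1: a fair-coin input yields the full 1 bit of mutual information.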
3
Q

How do you calculate the noise entropy of a channel X -> Y?

A

Noise entropy = H(Y|X), the entropy of Y given X: the average uncertainty that remains about the received message Y once the transmitted message X is known.

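Spelled out (an added expansion, not in the original card), the conditional entropy averages the per-input output entropies over the input distribution:

```latex
% Conditional (noise) entropy, in bits:
H(Y \mid X) = \sum_{x} p(x)\, H(Y \mid X = x)
            = -\sum_{x,y} p(x)\, p(y \mid x) \log_2 p(y \mid x)
```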