Entropy Flashcards

1
Q

What is the entropy of a random variable X?

A

H(X) = -Σ_x P(x) log2(P(x))
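
A minimal Python sketch of this definition (the function name and example values are illustrative, not from the card):

```python
import math

def entropy(probs):
    # Shannon entropy in bits: H(X) = -sum over x of P(x) * log2(P(x)).
    # Zero-probability outcomes contribute nothing (p * log2 p -> 0 as p -> 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Example: a fair coin carries exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```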

2
Q

Entropy of a uniform distribution over N outcomes

A

log2(N)
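
A quick numeric check, assuming N = 8 (an arbitrary choice for illustration):

```python
import math

N = 8
uniform = [1 / N] * N
# Entropy of the uniform distribution, computed straight from the definition.
h = -sum(p * math.log2(p) for p in uniform)
print(h, math.log2(N))  # 3.0 3.0 -- equal, as the card states
```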

3
Q

What has more entropy: a non-uniform distribution over N outcomes or a uniform distribution over N outcomes?

A

The uniform distribution: any non-uniform distribution over N outcomes has lower entropy than the corresponding uniform distribution, whose entropy, log2(N), is the maximum.
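
A short comparison illustrating the claim, with arbitrary example probabilities:

```python
import math

def entropy(probs):
    # Shannon entropy in bits; zero-probability terms are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 4
print(entropy([0.25] * N))            # uniform: 2.0 bits, the maximum log2(4)
print(entropy([0.7, 0.1, 0.1, 0.1]))  # non-uniform: ~1.36 bits, strictly lower
```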

4
Q

What is cross-entropy and what does it measure?

A

H(W) = -(1/n) log2(P_M(W))
It measures how well a model M predicts the data W (a sample of n items).
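
A toy sketch of the computation. The unigram model and the test sequence here are made up for illustration; under the (assumed) independence of words, log2(P_M(W)) is the sum of per-word log-probabilities:

```python
import math

model = {"the": 0.4, "cat": 0.2, "sat": 0.2, "mat": 0.2}  # hypothetical unigram model M
W = ["the", "cat", "sat"]                                  # hypothetical test data
n = len(W)

# Cross-entropy in bits per word: -(1/n) * log2(P_M(W)).
cross_entropy = -sum(math.log2(model[w]) for w in W) / n
print(cross_entropy)  # ~1.99 bits/word; lower means M predicts W better
```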

5
Q

What is perplexity and what does it measure?

A

Perplexity = 2^{cross-entropy}. It measures the same thing as cross-entropy: how well a probability distribution or probability model predicts a sample.

Better models q of the unknown distribution p will tend to assign higher probabilities q(x_i) to the test events. Thus, they have lower perplexity: they are less surprised by the test sample.
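
A tiny sketch of the relationship, using an assumed uniform model over a 1000-word vocabulary (all values here are illustrative):

```python
import math

vocab_size = 1000    # hypothetical vocabulary
q = 1 / vocab_size   # uniform model assigns every word the same probability
n = 50               # hypothetical test sample of 50 words

# Cross-entropy in bits per word, then perplexity = 2 ** cross-entropy.
cross_entropy = -sum(math.log2(q) for _ in range(n)) / n  # log2(1000) ~= 9.97
perplexity = 2 ** cross_entropy
print(perplexity)  # ~1000.0: as "surprised" as picking among 1000 equally likely words
```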
