Week 7 - Information theory Flashcards

1
Q

Entropy

A

H(X) = -Σ_x P(x) log P(x)  (equivalently, E[-log P(X)])
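
As a quick illustration (not part of the original card), a minimal numpy sketch that evaluates this sum for a made-up distribution, in bits:

```python
import numpy as np

# Hypothetical distribution over four outcomes (numbers made up for illustration)
p = np.array([0.5, 0.25, 0.125, 0.125])

# H(X) = -sum_x P(x) * log2 P(x); the base-2 log gives the result in bits
entropy = -np.sum(p * np.log2(p))
print(entropy)  # 1.75 bits
```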

2
Q

Odds

A

P(X) / (1 - P(X))
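
For example, if P(X) = 0.75 the odds are 0.75 / 0.25 = 3, i.e. 3-to-1 in favour of X.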

3
Q

Logits

A

log(odds) = log(P(X) / (1 - P(X)))
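
Continuing the example above, P(X) = 0.75 gives odds of 3, so the logit is log(3) ≈ 1.10 (natural log); P(X) = 0.5 gives a logit of 0.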

4
Q

P(B | A)

A

P(A ∩ B) / P(A)
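
For example, if P(A) = 0.5 and P(A ∩ B) = 0.2, then P(B | A) = 0.2 / 0.5 = 0.4.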

5
Q

Joint Entropy

A

H(X, Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i, y_j)
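
A minimal sketch (made-up joint table, not from the cards) that evaluates the double sum directly with numpy:

```python
import numpy as np

# Hypothetical joint distribution P(X, Y): rows index x_i, columns index y_j
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])

# H(X, Y) = -sum_i sum_j P(x_i, y_j) * log2 P(x_i, y_j)
joint_entropy = -np.sum(p_xy * np.log2(p_xy))
print(joint_entropy)  # 2.0 bits for this uniform 2x2 table
```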

6
Q

Conditional Entropy

A

H(X|Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i | y_j)
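
The same idea for the conditional case, using a made-up joint table and the marginal of Y (illustration only):

```python
import numpy as np

# Hypothetical joint distribution P(X, Y): rows index x_i, columns index y_j
p_xy = np.array([[0.3, 0.2],
                 [0.1, 0.4]])

p_y = p_xy.sum(axis=0)       # marginal P(y_j), column sums
p_x_given_y = p_xy / p_y     # P(x_i | y_j): each column divided by its marginal

# H(X | Y) = -sum_i sum_j P(x_i, y_j) * log2 P(x_i | y_j)
cond_entropy = -np.sum(p_xy * np.log2(p_x_given_y))
print(cond_entropy)  # ≈ 0.88 bits for this table
```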

7
Q

H(X,Y) - H(X)

A

H(Y|X)
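
A small numeric check of this chain rule on a made-up joint table (illustration only):

```python
import numpy as np

# Hypothetical joint distribution P(X, Y): rows index x_i, columns index y_j
p_xy = np.array([[0.3, 0.2],
                 [0.1, 0.4]])
p_x = p_xy.sum(axis=1)                                       # marginal P(x_i)

h_xy = -np.sum(p_xy * np.log2(p_xy))                         # H(X, Y)
h_x = -np.sum(p_x * np.log2(p_x))                            # H(X)
h_y_given_x = -np.sum(p_xy * np.log2(p_xy / p_x[:, None]))   # H(Y | X)

print(np.isclose(h_xy - h_x, h_y_given_x))  # True
```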

8
Q

Gini Index

A

I_G(p) = 1 - Σ_{i=1}^{J} p_i^2, where p_i is the fraction of items labeled with class i in the dataset and J is the number of classes
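
A short numpy sketch on a hypothetical 3-class node (class fractions made up for illustration):

```python
import numpy as np

# Class fractions p_i at a node; they must sum to 1
p = np.array([0.7, 0.2, 0.1])

gini = 1.0 - np.sum(p ** 2)
print(gini)  # 1 - (0.49 + 0.04 + 0.01) = 0.46
```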

9
Q

IG(Y,X)

A

H(Y) - H(Y|X)
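
A sketch of the information gain computed from a made-up joint table of label Y and feature X (all numbers hypothetical):

```python
import numpy as np

# Hypothetical joint distribution P(Y, X): rows index label y, columns index feature x
p_yx = np.array([[0.3, 0.2],
                 [0.1, 0.4]])
p_y = p_yx.sum(axis=1)   # marginal P(y)
p_x = p_yx.sum(axis=0)   # marginal P(x)

h_y = -np.sum(p_y * np.log2(p_y))                   # H(Y)
h_y_given_x = -np.sum(p_yx * np.log2(p_yx / p_x))   # H(Y | X)

print(h_y - h_y_given_x)  # IG(Y, X) ≈ 0.12 bits
```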

10
Q

I(X;Y)

A

H(X) - H(X|Y)
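
Equivalently, I(X;Y) = H(Y) - H(Y|X) = H(X) + H(Y) - H(X,Y), so mutual information is symmetric in X and Y and coincides with the information gain IG(Y,X) from the previous card.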

11
Q

Mutual Information

A

Measures how much knowing one variable reduces the uncertainty about the other
