Week 7 - Information theory Flashcards
1
Q
Entropy
A
H(X) = -Σ_x P(x) log P(x) = -E[log P(X)]
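A minimal numeric sketch of the entropy formula above (not from the cards; the distributions are made-up examples, and log base 2 is assumed so the result is in bits):

```python
import math

def entropy(p):
    """Shannon entropy H(X) = -sum_x P(x) * log2 P(x), in bits."""
    return -sum(px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin
print(entropy([0.9, 0.1]))   # ~0.469 bits for a biased coin
```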
2
Q
Odds
A
P(X) / (1 - P(X))
3
Q
Logits
A
Log(Odds)
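A small sketch tying the last two cards together (the probability p = 0.8 is a hypothetical value chosen only for illustration):

```python
import math

def odds(p):
    """Odds = P(X) / (1 - P(X))."""
    return p / (1 - p)

def logit(p):
    """Logit = log(odds)."""
    return math.log(odds(p))

p = 0.8            # hypothetical probability
print(odds(p))     # 4.0
print(logit(p))    # ~1.386
```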
4
Q
P(B | A)
A
P(A ∩ B) / P(A)
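A worked check of the definition, with made-up counts used only for illustration:

```python
# Hypothetical counts: out of 100 trials, A occurred 40 times,
# and A and B occurred together 10 times.
p_A = 40 / 100
p_A_and_B = 10 / 100

p_B_given_A = p_A_and_B / p_A   # P(B | A) = P(A ∩ B) / P(A)
print(p_B_given_A)              # 0.25
```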
5
Q
Joint Entropy
A
H(X,Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i, y_j)
6
Q
Conditional Entropy
A
H(X|Y) = -Σ_i Σ_j P(x_i, y_j) log P(x_i | y_j)
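A sketch computing joint and conditional entropy from a joint probability table (the table is a hypothetical example, not from the cards); it also checks the chain-rule identity on the next card, H(X,Y) - H(X) = H(Y|X):

```python
import math
from collections import defaultdict

# Hypothetical joint distribution P(x, y) over two binary variables.
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal P(x) = sum_y P(x, y)
Px = defaultdict(float)
for (x, _), p in P.items():
    Px[x] += p

# H(X) = -sum_x P(x) log2 P(x)
H_X = -sum(p * math.log2(p) for p in Px.values() if p > 0)

# H(X,Y) = -sum_{x,y} P(x,y) log2 P(x,y)
H_XY = -sum(p * math.log2(p) for p in P.values() if p > 0)

# H(Y|X) = -sum_{x,y} P(x,y) log2 P(y|x), with P(y|x) = P(x,y) / P(x)
H_Y_given_X = -sum(p * math.log2(p / Px[x]) for (x, _), p in P.items() if p > 0)

print(H_XY)               # ~1.846 bits, joint entropy H(X,Y)
print(H_XY - H_X)         # ~0.846 bits, equals H(Y|X) (chain rule, next card)
print(H_Y_given_X)        # ~0.846 bits, computed directly
```

Swapping the roles of x and y in the last sum gives H(X|Y) as written on this card.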
7
Q
H(X,Y) - H(X)
A
H(Y|X)
8
Q
Gini Index
A
I_G(p) = 1 - Σ(i=1 to J) p_i^2, where p_i is the fraction of items labeled with class i in the dataset and J is the number of classes
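A minimal sketch of the Gini index formula, using made-up class fractions for illustration:

```python
def gini(fractions):
    """Gini index I_G(p) = 1 - sum_i p_i^2 over class fractions p_i."""
    return 1 - sum(p ** 2 for p in fractions)

print(gini([1.0, 0.0]))        # 0.0  (pure node)
print(gini([0.5, 0.5]))        # 0.5  (maximally mixed, 2 classes)
print(gini([0.7, 0.2, 0.1]))   # 0.46
```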
9
Q
IG(Y,X)
A
H(Y) - H(Y|X)
10
Q
I(X;Y)
A
H(X) - H(X|Y)
11
Q
Mutual Information
A
Measures how much knowing one variable reduces the uncertainty about the other
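A sketch (reusing the same hypothetical joint table as above) computing I(X;Y) = H(X) - H(X|Y); by symmetry the result also equals H(Y) - H(Y|X), the information-gain form on card 9:

```python
import math
from collections import defaultdict

# Hypothetical joint distribution P(x, y).
P = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginals P(x) and P(y)
Px, Py = defaultdict(float), defaultdict(float)
for (x, y), p in P.items():
    Px[x] += p
    Py[y] += p

def H(dist):
    """Shannon entropy of a marginal distribution, in bits."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# H(X|Y) = -sum_{x,y} P(x,y) log2 P(x|y), with P(x|y) = P(x,y) / P(y)
H_X_given_Y = -sum(p * math.log2(p / Py[y]) for (x, y), p in P.items() if p > 0)

I_XY = H(Px) - H_X_given_Y   # mutual information I(X;Y)
print(I_XY)                  # ~0.125 bits: knowing Y removes this much uncertainty about X
```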