4. Gaussian models Flashcards

1
Q

<b>INTRODUCTION</b>
<b>Basics</b>
- MVN def
- Mahalanobis distance in the MVN
- eigendecomposition of covariance matrix
- How eigenvectors, eigenvalues, and mu affect the contours of equal probability density of a Gaussian
<b>MLE for an MVN</b>

A

p. 99
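
A minimal NumPy sketch of this card's topics (the data and parameter values are made up for illustration): the MVN log-density written via the Mahalanobis distance, the MLE for mu and Sigma, and the eigendecomposition of Sigma that shapes the elliptical contours.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_true = np.array([1.0, -2.0])
Sigma_true = np.array([[2.0, 0.8],
                       [0.8, 1.0]])
X = rng.multivariate_normal(mu_true, Sigma_true, size=500)  # N x D data

# MLE: sample mean and sample covariance (divide by N)
mu_hat = X.mean(axis=0)
Sigma_hat = (X - mu_hat).T @ (X - mu_hat) / len(X)

# Mahalanobis distance of a test point under the fitted MVN
x = np.array([0.0, 0.0])
delta = x - mu_hat
maha2 = delta @ np.linalg.solve(Sigma_hat, delta)  # squared distance

# MVN log-density expressed in terms of the Mahalanobis distance
D = len(mu_hat)
log_pdf = -0.5 * (D * np.log(2 * np.pi)
                  + np.linalg.slogdet(Sigma_hat)[1]
                  + maha2)

# Eigenvectors of Sigma give the axes of the elliptical contours,
# sqrt(eigenvalues) their relative lengths; mu shifts the centre.
evals, evecs = np.linalg.eigh(Sigma_hat)
print(mu_hat, evals, evecs, maha2, log_pdf)
```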

2
Q

<b>GAUSSIAN DISCRIMINANT ANALYSIS</b>
- class-conditional density in GDA
- when is GDA equivalent to naive Bayes?
- why can GDA be thought of as a nearest centroids classifier?
- the formula to classify a new test vector in GDA assuming a uniform prior.
<b>Quadratic discriminant analysis (QDA)</b>
- posterior over class labels in QDA
<b>Linear discriminant analysis (LDA)</b>
- LDA as a special case of QDA
- LDA and softmax function
- Softmax and Boltzmann distribution
<b>Two-class LDA</b>
<b>MLE for discriminant analysis</b>
<b>Strategies for preventing overfitting</b>
<b>Regularized LDA</b>
<b>Diagonal LDA</b>
<b>Nearest shrunken centroids classifier</b>

A

p. 103
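
A hedged NumPy sketch of GDA classification (the function names fit_gda and predict are my own, not from the book): fit one Gaussian per class by MLE, with either a pooled covariance (LDA) or per-class covariances (QDA), then classify under a uniform prior, which reduces to picking the nearest centroid in Mahalanobis distance; the class posterior is a softmax over the log-density scores.

```python
import numpy as np

def fit_gda(X, y, shared_cov=False):
    classes = np.unique(y)
    mus = np.array([X[y == c].mean(axis=0) for c in classes])
    if shared_cov:  # LDA: pool one covariance across all classes
        Sigma = sum((X[y == c] - mus[i]).T @ (X[y == c] - mus[i])
                    for i, c in enumerate(classes)) / len(X)
        Sigmas = np.array([Sigma] * len(classes))
    else:           # QDA: one covariance per class
        Sigmas = np.array([np.cov(X[y == c].T, bias=True) for c in classes])
    return classes, mus, Sigmas

def predict(x, classes, mus, Sigmas):
    # Uniform prior: the class posteriors are proportional to the
    # class-conditional densities, i.e. (negative) Mahalanobis distances
    # to the class centroids, plus a log-determinant term.
    scores = []
    for mu, Sigma in zip(mus, Sigmas):
        d = x - mu
        maha2 = d @ np.linalg.solve(Sigma, d)
        scores.append(-0.5 * (np.linalg.slogdet(Sigma)[1] + maha2))
    scores = np.array(scores)
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()                      # softmax over classes
    return classes[int(np.argmax(scores))], probs

# Toy usage on two well-separated classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 1.0, (50, 2)),
               rng.normal([3, 3], 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(predict(np.array([2.5, 2.5]), *fit_gda(X, y, shared_cov=True)))
```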

3
Q

<b>INFERENCE IN JOINTLY GAUSSIAN DISTRIBUTIONS</b>

A

p. 112
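
A minimal sketch (the block partitioning and the numbers are illustrative, not from the card) of marginalizing and conditioning a jointly Gaussian vector using the standard block formulas.

```python
import numpy as np

mu = np.array([0.0, 1.0, 2.0])
Sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 2.0, 0.3],
                  [0.2, 0.3, 1.5]])
idx1, idx2 = [0], [1, 2]          # block 1 = x1 (queried), block 2 = x2 (observed)
x2_obs = np.array([1.5, 1.0])

mu1, mu2 = mu[idx1], mu[idx2]
S11 = Sigma[np.ix_(idx1, idx1)]
S12 = Sigma[np.ix_(idx1, idx2)]
S22 = Sigma[np.ix_(idx2, idx2)]

# Marginal of x1 is just the corresponding block: N(mu1, S11).
# Conditional: mu_{1|2} = mu1 + S12 S22^{-1} (x2 - mu2),
#              S_{1|2}  = S11 - S12 S22^{-1} S21.
K = S12 @ np.linalg.inv(S22)
mu_cond = mu1 + K @ (x2_obs - mu2)
Sigma_cond = S11 - K @ S12.T
print(mu_cond, Sigma_cond)
```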

4
Q

<b>LINEAR GAUSSIAN SYSTEMS</b>

A

p. 121
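
A hedged sketch of Bayes' rule for a linear Gaussian system, with made-up prior and noise parameters: prior p(x) = N(mu0, Sigma0) and likelihood p(y | x) = N(Ax + b, Sigma_y) give a Gaussian posterior p(x | y) whose moments are computed in closed form below.

```python
import numpy as np

mu0 = np.array([0.0, 0.0])
Sigma0 = np.eye(2)
A = np.array([[1.0, 0.0],
              [1.0, 1.0]])
b = np.zeros(2)
Sigma_y = 0.1 * np.eye(2)
y = np.array([1.0, 2.0])

Sigma0_inv = np.linalg.inv(Sigma0)
Sy_inv = np.linalg.inv(Sigma_y)

# Posterior precision is the sum of prior precision and data precision;
# the posterior mean blends the prior mean with the observation.
Sigma_post = np.linalg.inv(Sigma0_inv + A.T @ Sy_inv @ A)
mu_post = Sigma_post @ (A.T @ Sy_inv @ (y - b) + Sigma0_inv @ mu0)
print(mu_post, Sigma_post)
```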

5
Q

<b>THE WISHART DISTRIBUTION</b>

A

p. 128
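
A minimal sketch (parameters chosen arbitrarily) relating the Wishart distribution to scatter matrices of Gaussian samples: if x_i ~ N(0, S) for i = 1..nu, the scatter matrix sum_i x_i x_i^T is Wishart(S, nu) with mean nu * S, checked here by Monte Carlo against scipy.stats.wishart samples.

```python
import numpy as np
from scipy.stats import wishart

rng = np.random.default_rng(0)
S = np.array([[1.0, 0.3],
              [0.3, 2.0]])
nu = 5  # degrees of freedom (must exceed dimension - 1)

def scatter():
    # Scatter matrix of nu i.i.d. zero-mean Gaussian vectors
    X = rng.multivariate_normal(np.zeros(2), S, size=nu)
    return X.T @ X

mc_mean = np.mean([scatter() for _ in range(5000)], axis=0)
print(mc_mean)                                         # approximately nu * S

samples = wishart.rvs(df=nu, scale=S, size=5000, random_state=0)
print(samples.mean(axis=0))                            # also approximately nu * S
```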

6
Q

<b>INFERRING THE PARAMETERS OF AN MVN</b>

A

p. 129
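
A hedged sketch (the priors are my own choice) of conjugate posterior inference for the mean of an MVN with known covariance: a Gaussian prior on mu gives a Gaussian posterior with the moments below. The case where Sigma is also unknown uses the normal-inverse-Wishart prior, which this sketch does not cover.

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.4],
                  [0.4, 1.0]])        # known covariance
mu_true = np.array([2.0, -1.0])
X = rng.multivariate_normal(mu_true, Sigma, size=50)
N, xbar = len(X), X.mean(axis=0)

m0 = np.zeros(2)                      # prior mean
V0 = 10.0 * np.eye(2)                 # broad prior covariance

Sigma_inv = np.linalg.inv(Sigma)
VN = np.linalg.inv(np.linalg.inv(V0) + N * Sigma_inv)       # posterior covariance
mN = VN @ (N * Sigma_inv @ xbar + np.linalg.inv(V0) @ m0)   # posterior mean
print(mN, VN)   # mN shrinks xbar toward m0; VN shrinks as N grows
```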
