4. Gaussian models: Flashcards
<b>INTRODUCTION</b>
<b>Basics</b>
- definition of the MVN (multivariate normal)
- Mahalanobis distance in the MVN
- eigendecomposition of the covariance matrix
- how the eigenvectors, the eigenvalues, and the mean μ affect the contours of equal probability density of a Gaussian (key formulas summarized after this list)
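A compact summary of the Basics items above, for the backs of these cards (standard definitions; $D$ is the dimension, $\mu$ the mean, $\Sigma$ the covariance matrix):
$$
\mathcal{N}(x \mid \mu, \Sigma) = \frac{1}{(2\pi)^{D/2} |\Sigma|^{1/2}} \exp\!\left[ -\tfrac{1}{2} (x-\mu)^T \Sigma^{-1} (x-\mu) \right]
$$
The quadratic form $\Delta^2 = (x-\mu)^T \Sigma^{-1} (x-\mu)$ is the squared Mahalanobis distance between $x$ and $\mu$. Writing the eigendecomposition $\Sigma = U \Lambda U^T = \sum_{i=1}^{D} \lambda_i u_i u_i^T$ gives $\Sigma^{-1} = \sum_{i=1}^{D} \tfrac{1}{\lambda_i} u_i u_i^T$, so the contours of equal density are ellipsoids centered at $\mu$, with axes along the eigenvectors $u_i$ and axis lengths proportional to $\sqrt{\lambda_i}$.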
<b>MLE for an MVN</b>
p. 99
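The results to put on this card (standard MLEs for $N$ i.i.d. samples $x_1, \dots, x_N$):
$$
\hat{\mu} = \frac{1}{N} \sum_{i=1}^{N} x_i, \qquad \hat{\Sigma} = \frac{1}{N} \sum_{i=1}^{N} (x_i - \hat{\mu})(x_i - \hat{\mu})^T,
$$
i.e. the empirical mean and the (uncorrected) empirical covariance.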
<b>GAUSSIAN DISCRIMINANT ANALYSIS</b>
- class-conditional density in GDA
- when is GDA equivalent to naive Bayes?
- why can GDA be thought of as a nearest centroids classifier?
- the formula for classifying a new test vector in GDA, assuming a uniform prior (see the sketch after this list)
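A sketch of the answers to the GDA items above (notation as in the MVN summary; $\pi_c$ is the class prior):
- class-conditional density: $p(x \mid y=c, \theta) = \mathcal{N}(x \mid \mu_c, \Sigma_c)$
- GDA reduces to (Gaussian) naive Bayes when each $\Sigma_c$ is diagonal, since the features are then conditionally independent given the class
- with a uniform prior, the decision rule is
$$
\hat{y}(x) = \operatorname*{argmax}_c \log p(x \mid y=c, \theta) = \operatorname*{argmin}_c \left[ (x-\mu_c)^T \Sigma_c^{-1} (x-\mu_c) + \log|\Sigma_c| \right],
$$
so classification measures the Mahalanobis distance from $x$ to each class center $\mu_c$; with a shared spherical covariance $\Sigma_c = \sigma^2 I$ this is literally a nearest (Euclidean) centroids classifier.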
<b>Quadratic discriminant analysis (QDA)</b>
- posterior over class labels in QDA
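Written out (one Gaussian per class, class prior $\pi_c$), Bayes' rule gives
$$
p(y=c \mid x, \theta) = \frac{\pi_c \, \mathcal{N}(x \mid \mu_c, \Sigma_c)}{\sum_{c'} \pi_{c'} \, \mathcal{N}(x \mid \mu_{c'}, \Sigma_{c'})}.
$$
Expanding the Gaussians makes the discriminant a quadratic function of $x$, hence the name.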
<b>Linear discriminant analysis (LDA)</b>
- LDA as a special case of QDA
- LDA and softmax function
- Softmax and the Boltzmann distribution (see the formulas after this list)
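A sketch tying the three LDA items above together (tied covariance $\Sigma_c = \Sigma$ for all classes): with a shared covariance the quadratic term $x^T \Sigma^{-1} x$ cancels between numerator and denominator of the QDA posterior, leaving a softmax of linear functions of $x$:
$$
p(y=c \mid x, \theta) = \frac{e^{\beta_c^T x + \gamma_c}}{\sum_{c'} e^{\beta_{c'}^T x + \gamma_{c'}}} = \mathcal{S}(\eta)_c, \qquad \beta_c = \Sigma^{-1} \mu_c, \quad \gamma_c = -\tfrac{1}{2} \mu_c^T \Sigma^{-1} \mu_c + \log \pi_c,
$$
where $\eta_c = \beta_c^T x + \gamma_c$. Dividing by a temperature, $\mathcal{S}(\eta / T)$, gives the Boltzmann distribution over states with energies $-\eta_c$; as $T \to 0$ the softmax approaches a hard argmax, which is why it is called a "soft" max.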
<b>Two-class LDA</b>
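For the two-class case (labels 0 and 1, shared $\Sigma$) the softmax collapses to a sigmoid:
$$
p(y=1 \mid x, \theta) = \operatorname{sigm}\!\big( (\beta_1 - \beta_0)^T x + (\gamma_1 - \gamma_0) \big), \qquad \operatorname{sigm}(a) = \frac{1}{1 + e^{-a}},
$$
so the decision boundary is the hyperplane $w^T x + b = 0$ with $w = \beta_1 - \beta_0 = \Sigma^{-1}(\mu_1 - \mu_0)$.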
<b>MLE for discriminant analysis</b>
<b>Strategies for preventing overfitting</b>
<b>Regularized LDA</b>
<b>Diagonal LDA</b>
<b>Nearest shrunken centroids classifier</b>
p. 103
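A brief summary of the estimators these headings refer to (one common presentation; $\lambda \in [0,1]$ is a shrinkage weight, $N_c$ the number of samples in class $c$):
- MLE for discriminant analysis: $\hat{\pi}_c = N_c / N$, $\hat{\mu}_c = \frac{1}{N_c} \sum_{i: y_i=c} x_i$, $\hat{\Sigma}_c = \frac{1}{N_c} \sum_{i: y_i=c} (x_i - \hat{\mu}_c)(x_i - \hat{\mu}_c)^T$
- regularized LDA: tie the covariances across classes and shrink toward the diagonal, $\hat{\Sigma}(\lambda) = \lambda\, \mathrm{diag}(\hat{\Sigma}_{mle}) + (1-\lambda)\, \hat{\Sigma}_{mle}$
- diagonal LDA: the $\lambda = 1$ case, i.e. a shared diagonal covariance, giving the discriminant $\delta_c(x) = -\sum_j \frac{(x_j - \hat{\mu}_{cj})^2}{2 \hat{\sigma}_j^2} + \log \hat{\pi}_c$
- nearest shrunken centroids: diagonal LDA with the class-specific means additionally shrunk toward the shared overall mean by soft thresholding, which zeroes out uninformative features and so performs feature selection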
<b>INFERENCE IN JOINTLY GAUSSIAN DISTRIBUTIONS</b>
p. 112
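The key results of this section (partition $x = (x_1, x_2)$ with the corresponding blocks of $\mu$ and $\Sigma$): the marginal is $p(x_1) = \mathcal{N}(x_1 \mid \mu_1, \Sigma_{11})$, and the conditional is
$$
p(x_1 \mid x_2) = \mathcal{N}(x_1 \mid \mu_{1|2}, \Sigma_{1|2}), \qquad \mu_{1|2} = \mu_1 + \Sigma_{12} \Sigma_{22}^{-1} (x_2 - \mu_2), \quad \Sigma_{1|2} = \Sigma_{11} - \Sigma_{12} \Sigma_{22}^{-1} \Sigma_{21}.
$$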
<b>LINEAR GAUSSIAN SYSTEMS</b>
p. 121
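The Bayes-rule result worth putting on a card: for a linear Gaussian system with prior $p(x) = \mathcal{N}(x \mid \mu_x, \Sigma_x)$ and likelihood $p(y \mid x) = \mathcal{N}(y \mid A x + b, \Sigma_y)$, the posterior and the evidence are
$$
p(x \mid y) = \mathcal{N}(x \mid \mu_{x|y}, \Sigma_{x|y}), \qquad \Sigma_{x|y}^{-1} = \Sigma_x^{-1} + A^T \Sigma_y^{-1} A, \quad \mu_{x|y} = \Sigma_{x|y} \left[ A^T \Sigma_y^{-1} (y - b) + \Sigma_x^{-1} \mu_x \right],
$$
$$
p(y) = \mathcal{N}(y \mid A \mu_x + b, \Sigma_y + A \Sigma_x A^T).
$$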
<b>THE WISHART DISTRIBUTION</b>
p. 128
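The density, in one common parametrization (conventions for the scale matrix $S$ vary between texts):
$$
\mathrm{Wi}(\Lambda \mid S, \nu) = \frac{1}{Z} |\Lambda|^{(\nu - D - 1)/2} \exp\!\left( -\tfrac{1}{2} \operatorname{tr}(\Lambda S^{-1}) \right), \qquad \mathbb{E}[\Lambda] = \nu S.
$$
It generalizes the gamma distribution to positive definite matrices and serves as the conjugate prior for the precision matrix of an MVN.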
<b>INFERRING THE PARAMETERS OF AN MVN</b>
p. 129