Lecture 13: NMF Flashcards
PCA
Components are orthogonal
Sparse PCA: components are orthogonal & sparse
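A minimal sketch of the contrast, assuming scikit-learn and toy random data (component counts and alpha are illustrative choices):

```python
import numpy as np
from sklearn.decomposition import PCA, SparsePCA

X = np.random.RandomState(0).rand(100, 10)   # toy data, assumed for illustration

pca = PCA(n_components=3).fit(X)
spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)

# PCA components are orthogonal and dense;
# Sparse PCA components contain many exact zeros
print(pca.components_.round(2))
print(spca.components_.round(2))
```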
NMF
Latent representation and latent features are non-negative
Advantages of NMF
Easier to interpret
No cancellation, unlike PCA
No sign ambiguity
Can learn an overcomplete representation
Acts as a soft clustering
Downsides of NMF
Only applies to non-negative data
Interpretability is hit or miss
Non-convex optimization
Slow on large datasets
Components are not orthogonal
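A minimal NMF sketch with scikit-learn; the data, number of components, and init choice are assumptions for illustration:

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.RandomState(0)
X = rng.rand(100, 20)            # NMF requires non-negative input

nmf = NMF(n_components=5, init="nndsvd", max_iter=500, random_state=0)
W = nmf.fit_transform(X)         # latent representation (non-negative)
H = nmf.components_              # latent features / parts (non-negative)

# Each sample is a non-negative combination of the parts in H,
# so rows of W can be read as soft cluster memberships
print(W.shape, H.shape, (W >= 0).all(), (H >= 0).all())
```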
Manifold learning
Allows more complex (non-linear) transformations
Better for visualization -> usually down to 2 components or fewer
Gives a new representation of the training data, not of test data
Good for EDA
t-SNE
Starts from a random 2D embedding
Points that are close stay close
Points that are far apart stay far apart
Emphasis on preserving the points that are close by
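A minimal t-SNE sketch on the digits dataset (dataset and perplexity are illustrative assumptions); note that TSNE has only fit_transform, which matches "training data only, no transform for test data":

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

X, y = load_digits(return_X_y=True)

tsne = TSNE(n_components=2, perplexity=30, init="random", random_state=0)
X_2d = tsne.fit_transform(X)     # 2D embedding for visualization / EDA
print(X_2d.shape)                # (1797, 2)
```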
Outlier detection: parametric
Elliptic Envelope: Gaussian model; fits a robust covariance matrix and mean
Only works if the Gaussian assumption is reasonable
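A minimal Elliptic Envelope sketch; the toy Gaussian data, injected outliers, and contamination value are assumptions:

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

rng = np.random.RandomState(0)
X = rng.randn(200, 2)                      # roughly Gaussian inliers (assumed)
X = np.vstack([X, [[6, 6], [-7, 5]]])      # two obvious outliers

ee = EllipticEnvelope(contamination=0.01, random_state=0).fit(X)
labels = ee.predict(X)                     # +1 = inlier, -1 = outlier
print(np.where(labels == -1)[0])           # indices flagged as outliers
```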
Kernel density
Non-parametric model
Need to adjust the kernel bandwidth
Not good in high dimensions
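A minimal kernel density sketch; the bandwidth and the 2% low-density cutoff are illustrative assumptions:

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.RandomState(0)
X = np.concatenate([rng.randn(200, 1), rng.randn(5, 1) + 8])  # bulk + a few far points

# bandwidth is the key knob: too small -> spiky, too large -> oversmoothed
kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(X)
scores = kde.score_samples(X)              # log-density per point
threshold = np.quantile(scores, 0.02)      # flag the lowest-density 2% (assumed cutoff)
print(np.where(scores < threshold)[0])
```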
One-class SVM
Uses a Gaussian (RBF) kernel to cover the data
Selects support vectors
Need to select gamma
Specify the expected outlier ratio
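A minimal one-class SVM sketch; the gamma and nu values and the toy data are assumptions (nu acts as the expected outlier fraction):

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(200, 2), [[5, 5], [6, -6]]])   # inliers plus two outliers

# gamma controls the width of the Gaussian (RBF) kernel;
# nu is an upper bound on the fraction of points treated as outliers
ocsvm = OneClassSVM(kernel="rbf", gamma=0.1, nu=0.05).fit(X)
print(ocsvm.support_vectors_.shape)        # the selected support vectors
print(np.where(ocsvm.predict(X) == -1)[0]) # points flagged as outliers
```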
Isolation forest (isolation trees)
Outliers are easier to isolate from the rest of the data
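A minimal isolation forest sketch; the toy data and forest size are assumptions:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(0)
X = np.vstack([rng.randn(300, 2), [[8, 8], [-9, 7]]])   # inliers plus two outliers

iso = IsolationForest(n_estimators=100, contamination="auto", random_state=0).fit(X)
# Outliers take fewer random splits to isolate, so they get lower scores
print(iso.decision_function(X)[:5])        # higher = more normal
print(np.where(iso.predict(X) == -1)[0])   # points flagged as outliers
```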