Armin Flashcards
Bias-variance trade-off
The trade-off between the bias and the variance of an estimator. Adding bias (for example through regularization) can decrease the variance, which can lower the total prediction error. The difficulty is finding the sweet spot between underfitting (high bias) and overfitting (high variance); cross-validation is a standard way to tune this trade-off.
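A minimal numpy sketch of this trade-off (illustrative, not from the card): repeatedly refit a low-degree and a high-degree polynomial to noisy draws from sin(2πx) and compare the squared bias and the variance of their predictions at one test point.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    return np.sin(2 * np.pi * x)

# Repeatedly draw training sets and fit a degree-1 (high bias) and a
# degree-9 (high variance) polynomial; record predictions at x = 0.3.
x_test = 0.3
preds = {1: [], 9: []}
for _ in range(200):
    x = rng.uniform(0, 1, 20)
    y = true_f(x) + rng.normal(0, 0.3, 20)
    for d in preds:
        coeffs = np.polyfit(x, y, d)
        preds[d].append(np.polyval(coeffs, x_test))

for d, p in preds.items():
    p = np.array(p)
    bias2 = (p.mean() - true_f(x_test)) ** 2
    var = p.var()
    print(f"degree {d}: bias^2 = {bias2:.4f}, variance = {var:.4f}")
```

The degree-1 fit shows a large squared bias and small variance; the degree-9 fit shows the reverse, which is exactly the trade-off the card describes.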
Consistency (or lack thereof) in shrinkage methods
When the number of parameters exceeds the number of observations, ordinary least squares breaks down because X^T X is singular. Shrinkage methods such as ridge regression or the lasso address this by adding a penalty term, which makes the problem well posed at the cost of some bias. For the resulting estimator to be consistent, the penalty must shrink at a suitable rate as the sample size grows; otherwise the bias does not vanish in the limit.
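A small numpy illustration (my own example, not from the card) of why the penalty term helps: with p > n the Gram matrix X^T X is singular, but adding a ridge penalty λI makes it invertible, so the (biased) ridge estimate is always well defined.

```python
import numpy as np

rng = np.random.default_rng(1)

# p > n: more parameters than observations, so X^T X cannot have full rank
# and ordinary least squares has no unique solution.
n, p = 10, 50
X = rng.normal(size=(n, p))
gram = X.T @ X
print(np.linalg.matrix_rank(gram))  # 10, far below p = 50

# Adding the ridge penalty lam * I makes the matrix invertible.
lam = 0.1
beta_ridge = np.linalg.solve(gram + lam * np.eye(p), X.T @ rng.normal(size=n))
print(beta_ridge.shape)  # (50,)
```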
Support consistency of lasso
Support consistency (model-selection consistency) means the set of coefficients the lasso estimates as nonzero equals the true support with probability tending to one. This holds under conditions on the design matrix, notably the irrepresentable condition, together with a penalty λ_n that decays at a suitable rate as n grows.
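As a sketch of support recovery in a friendly setting (independent Gaussian design, strong signals; my own toy example, not a proof of the general result), a basic lasso fit by cyclic coordinate descent with soft-thresholding picks out the truly nonzero coefficients.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Cyclic coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1.
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]  # partial residual excluding j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(2)
n, p = 200, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]          # true support is {0, 1, 2}
y = X @ beta_true + rng.normal(0, 0.5, n)

b = lasso_cd(X, y, lam=0.1)
print(np.nonzero(np.abs(b) > 1e-6)[0])    # estimated support
```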
Linear classifiers and perceptron
Classifiers that separate the classes with a hyperplane (a straight line in two dimensions). The perceptron is a classic algorithm for learning such a hyperplane: it cycles through the training points and updates the weight vector whenever a point is misclassified, and it converges in finitely many updates if the data are linearly separable.
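The perceptron update rule can be sketched in a few lines of numpy (toy data of my own, with a margin left around the separating line so convergence is quick):

```python
import numpy as np

def perceptron(X, y, epochs=1000):
    # Classic perceptron: on each mistake, nudge the weights toward
    # (or away from) the misclassified point. Labels y must be +/-1.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:
                w += yi * xi
                b += yi
                errors += 1
        if errors == 0:  # a full pass without mistakes: converged
            break
    return w, b

# Linearly separable toy data: label by the sign of x0 + x1 - 1,
# dropping points too close to the boundary to guarantee a margin.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 2, size=(200, 2))
X = X[np.abs(X.sum(axis=1) - 1) > 0.2]
y = np.where(X.sum(axis=1) - 1 > 0, 1, -1)

w, b = perceptron(X, y)
print(np.mean(np.sign(X @ w + b) == y))  # 1.0 on separable data
```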
Generalization error
A measure of how accurately an algorithm predicts outcomes on previously unseen data; formally, the expected loss on new samples drawn from the same distribution as the training data.
Rademacher complexity
The empirical Rademacher complexity of a function class F on a sample x_1, …, x_n is E_σ[sup_{f∈F} (1/n) Σ_i σ_i f(x_i)], where the σ_i are independent uniform ±1 signs. It measures how well the class can correlate with random noise: richer classes have higher Rademacher complexity, and it bounds the gap between empirical and population risk.
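A Monte Carlo estimate of this quantity for two finite classes (my own toy example: a handful of threshold classifiers versus many random ±1 labelings) shows the richer class correlating better with random signs:

```python
import numpy as np

rng = np.random.default_rng(4)

def empirical_rademacher(preds, n_draws=2000):
    # preds: (num_functions, n) matrix of f(x_i) values for each f.
    # Monte Carlo estimate of E_sigma[ sup_f (1/n) sum_i sigma_i f(x_i) ].
    num_f, n = preds.shape
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)
        total += np.max(preds @ sigma) / n
    return total / n_draws

n = 50
x = np.sort(rng.uniform(0, 1, n))

# Small class: 5 threshold classifiers sign(x - t).
thresholds = np.linspace(0.1, 0.9, 5)
preds_small = np.where(x[None, :] > thresholds[:, None], 1.0, -1.0)

# Rich class: 500 arbitrary +/-1 labelings of the same points.
preds_rich = rng.choice([-1.0, 1.0], size=(500, n))

r_small = empirical_rademacher(preds_small)
r_rich = empirical_rademacher(preds_rich)
print(f"small class: {r_small:.3f}, rich class: {r_rich:.3f}")
```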
Population risk
The expected loss of a predictor under the true data-generating distribution. Computing the population risk therefore requires knowing that distribution.
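A worked toy example of my own where the distribution is known, so the population risk can be computed exactly and checked by simulation: X ~ N(0, 1), true label Y = sign(X), and classifier h(x) = sign(x − 0.5), which errs exactly when 0 < X < 0.5.

```python
import math
import numpy as np

# Standard normal CDF via the error function.
def phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Exact population risk: P(0 < X < 0.5) = Phi(0.5) - Phi(0).
risk_exact = phi(0.5) - phi(0.0)

# Monte Carlo check against the analytic value.
rng = np.random.default_rng(5)
x = rng.normal(size=200_000)
risk_mc = np.mean(np.sign(x) != np.sign(x - 0.5))

print(round(risk_exact, 4), round(risk_mc, 4))
```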
Empirical Risk
The average loss of a predictor on the observed sample, used as a proxy for the population risk when the underlying distribution is unknown. Under 0-1 loss it is the fraction of wrongly classified observations:
Risk = misclassifications / observations
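A tiny numpy illustration of the formula (made-up labels):

```python
import numpy as np

# Empirical risk under 0-1 loss: fraction of misclassified observations.
y_true = np.array([1, 1, -1, -1, 1, -1, 1, -1])
y_pred = np.array([1, -1, -1, -1, 1, 1, 1, -1])

empirical_risk = np.mean(y_pred != y_true)
print(empirical_risk)  # 2 misclassifications / 8 observations = 0.25
```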