Armin Flashcards

1
Q

Bias-variance trade-off

A

The trade-off between bias and variance. In many scenarios, adding bias to a model can decrease its variance, which can result in more accurate predictions. Finding the sweet spot between the two is hard, but cross-validation can help locate it.
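A minimal sketch of the idea (my own illustration, not from the card): a ridge penalty adds bias by shrinking coefficients toward zero, and across repeated samples the biased estimates vary less than ordinary least squares.

```python
import numpy as np

# Illustration: adding bias via a ridge penalty reduces the variance
# of the coefficient estimates across repeated samples.
rng = np.random.default_rng(0)
n, p, lam = 30, 5, 10.0
beta_true = np.ones(p)

ols_estimates, ridge_estimates = [], []
for _ in range(200):
    X = rng.normal(size=(n, p))
    y = X @ beta_true + rng.normal(scale=2.0, size=n)
    # OLS: solve (X'X) beta = X'y  -- unbiased, higher variance
    ols_estimates.append(np.linalg.solve(X.T @ X, X.T @ y))
    # Ridge: solve (X'X + lam*I) beta = X'y  -- biased toward zero, lower variance
    ridge_estimates.append(np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y))

var_ols = np.var(ols_estimates, axis=0).mean()
var_ridge = np.var(ridge_estimates, axis=0).mean()
print(var_ridge < var_ols)   # ridge estimates vary less sample to sample
```

In practice the penalty strength `lam` is the knob that trades bias for variance, and it is exactly what cross-validation is used to tune.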

2
Q

Consistency (or lack thereof) in shrinkage methods

A

If the number of parameters is higher than the number of observations, shrinkage methods such as ridge regression or the lasso can be used. By adding a penalty term, shrinkage methods avoid a potentially singular matrix.
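A quick numeric sketch of the singularity point (my own illustration): with p > n the Gram matrix X'X has rank at most n, so ordinary least squares has no unique solution, while the ridge-penalized system is always solvable.

```python
import numpy as np

# With more parameters than observations, X'X is singular;
# adding the ridge penalty lam*I makes the system invertible.
rng = np.random.default_rng(1)
n, p, lam = 10, 20, 0.5            # p > n
X = rng.normal(size=(n, p))
y = rng.normal(size=n)

gram = X.T @ X                     # p x p, but rank at most n < p
print(np.linalg.matrix_rank(gram)) # 10 -> singular; solve(gram, ...) would fail

beta = np.linalg.solve(gram + lam * np.eye(p), X.T @ y)   # always solvable
print(beta.shape)                  # (20,)
```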

(Note: does this actually answer the question?)

3
Q

Support consistency of lasso

A

The lasso is support consistent if, as the sample size grows, the set of nonzero estimated coefficients matches the true support with probability tending to one. This holds only under conditions on the design matrix (such as the irrepresentable condition) together with a suitable choice of the penalty; otherwise the lasso may include irrelevant variables.
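As a sketch (my own illustration; the coordinate-descent solver below is a standard textbook lasso algorithm, not from the card), a lasso fit on a well-conditioned design with a sparse true coefficient vector can recover the true support exactly:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]           # true support is {0, 1, 2}
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_sweeps):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]          # partial residual
            z = X[:, j] @ r / n
            # soft-thresholding update sets small coordinates exactly to zero
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / (X[:, j] @ X[:, j] / n)
    return beta

beta_hat = lasso_cd(X, y, lam=0.3)
support = set(np.flatnonzero(np.abs(beta_hat) > 1e-8))
print(support == {0, 1, 2})        # support recovered in this easy setting
```

With a correlated design that violates the irrepresentable condition, the same procedure can pick up spurious variables no matter how large n gets.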

4
Q

Linear classifiers and perceptron

A

Classifiers that separate the classes with a straight line (or, in higher dimensions, a hyperplane). The perceptron is a classic algorithm that learns such a linear decision boundary by updating its weights on misclassified points.
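A minimal perceptron sketch (my own example, with toy data I made up): repeatedly update the weights on misclassified points until the hyperplane w·x + b = 0 separates the classes.

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """X: (n, d) features; y: labels in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified -> move boundary
                w += yi * xi
                b += yi
                errors += 1
        if errors == 0:                  # converged: every point correct
            break
    return w, b

# Linearly separable toy data: class given by the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.5, -1.0], [-2.0, 0.5], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
print(np.all(np.sign(X @ w + b) == y))   # True: the hyperplane separates the classes
```

The convergence guarantee only holds when the data really are linearly separable; otherwise the loop runs until `epochs` is exhausted.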

5
Q

Generalization error

A

A measure of how accurately an algorithm predicts outcomes on previously unseen data.
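A small sketch (my own illustration): an overfit model can have a tiny error on the training data while its error on fresh, unseen data is much larger; that gap is what generalization error captures.

```python
import numpy as np

# Fit a high-degree polynomial to few noisy points, then evaluate
# on a fresh sample from the same distribution.
rng = np.random.default_rng(5)

def sample(n):
    x = rng.uniform(-1, 1, n)
    return x, np.sin(3 * x) + rng.normal(scale=0.1, size=n)

x_train, y_train = sample(15)
x_test, y_test = sample(1000)

coeffs = np.polyfit(x_train, y_train, deg=9)   # flexible enough to chase noise
train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
print(test_err > train_err)   # unseen-data error exceeds training error
```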

6
Q

Rademacher complexity

A

A measure of the richness of a class of functions: the expected maximal correlation that functions in the class can achieve with random ±1 signs on the sample. A class with higher Rademacher complexity can fit pure noise better, which gives looser generalization bounds.
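A Monte Carlo sketch of the definition (my own illustration, using a tiny made-up function class): draw random sign vectors σ, take the best correlation any function in the class achieves with them, and average.

```python
import numpy as np

# Empirical Rademacher complexity of a finite class F on a sample:
#   R_hat(F) = E_sigma[ max_{f in F} (1/n) sum_i sigma_i * f(x_i) ]
rng = np.random.default_rng(2)
n = 50
x = rng.uniform(-1, 1, size=n)

# A tiny function class: three fixed predictors evaluated on the sample.
F = np.stack([np.sign(x), np.sign(-x), np.ones(n)])   # shape (3, n)

def empirical_rademacher(F, n_draws=2000, rng=rng):
    vals = []
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=F.shape[1])  # random +/-1 signs
        vals.append(np.max(F @ sigma) / F.shape[1])       # best fit to the noise
    return float(np.mean(vals))

r = empirical_rademacher(F)
print(0.0 < r < 1.0)   # small for this tiny class; richer classes score higher
```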

7
Q

Population risk

A

The expected loss under the true (known) data distribution. To calculate the population risk we therefore need to know the distribution.
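A worked sketch under an assumed setup of my own: X ~ N(0, 1), the true label is sign(X), and a classifier predicts sign(X − 0.5). It errs exactly when 0 < X < 0.5, so the population risk can be computed from the known distribution, and a Monte Carlo estimate from samples approaches the same value.

```python
import math
import numpy as np

def phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Exact population risk: P(0 < X < 0.5) under X ~ N(0, 1)
population_risk = phi(0.5) - phi(0.0)   # ~0.1915

# Sample-based estimate of the same quantity
rng = np.random.default_rng(3)
X = rng.normal(size=100_000)
estimate = np.mean(np.sign(X) != np.sign(X - 0.5))
print(round(population_risk, 3))
```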

8
Q

Empirical Risk

A

The fraction of wrongly classified observations in a sample, used when we don't know the underlying distribution.

Risk = misclassifications / observations
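A one-liner sketch of the formula (my own toy labels): under 0–1 loss the empirical risk is just the misclassification fraction.

```python
import numpy as np

# Empirical risk under 0-1 loss: misclassifications / observations
y_true = np.array([1, -1, 1, 1, -1, 1, -1, -1])
y_pred = np.array([1,  1, 1, -1, -1, 1, -1,  1])

empirical_risk = np.mean(y_true != y_pred)
print(empirical_risk)   # 0.375 (3 of 8 wrong)
```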
