Chapter 6 - Linear model selection and regularization Flashcards

1
Q

The lasso, relative to least squares, is…?

A
  • The lasso will reduce the number of variables (some coefficients are set exactly to zero).
  • The model is less flexible.
  • The model has higher bias and lower variance.
  • Prediction accuracy can improve when the decrease in variance outweighs the increase in bias.
  • Model flexibility decreases as λ increases: the estimated parameters shrink toward zero and some become exactly zero, which decreases the variance of the predictions and increases the bias (see the sketch after this list).
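
As a quick, hedged illustration (not part of the original card), the sketch below fits scikit-learn's Lasso on simulated data; its alpha parameter plays the role of λ, and the data and grid of values are assumptions chosen purely for illustration.

```python
# Minimal sketch: as the lasso penalty grows, more coefficients become exactly zero.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
# Only the first three predictors truly matter in this simulated example.
y = 3 * X[:, 0] + 2 * X[:, 1] + 1 * X[:, 2] + rng.normal(scale=0.5, size=100)

for alpha in [0.01, 0.1, 0.5, 1.0]:
    coef = Lasso(alpha=alpha).fit(X, y).coef_
    n_zero = int(np.sum(coef == 0))
    print(f"alpha={alpha:>4}: {n_zero} of 10 coefficients are exactly zero")
```
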
2
Q

Ridge regression, relative to least squares, is…?

A
  • Ridge regression will not reduce the number of variables (no coefficient is set exactly to zero).
  • The model is less flexible.
  • The model has higher bias and lower variance.
  • Prediction accuracy can improve when the decrease in variance outweighs the increase in bias.
  • Model flexibility decreases as λ increases: the estimated parameters shrink toward zero, but all variables keep non-zero coefficients, which decreases the variance of the predictions and increases the bias (see the sketch after this list).
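
A matching hedged sketch (again simulated data; scikit-learn's alpha plays the role of λ): the ridge coefficients shrink toward zero as the penalty grows, but none of them is set exactly to zero.

```python
# Minimal sketch: ridge shrinks coefficients but keeps them all non-zero.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(scale=0.5, size=100)

for alpha in [0.1, 10, 1000]:
    coef = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:>6}: max |coef| = {np.abs(coef).max():.3f}, "
          f"exact zeros = {int(np.sum(coef == 0))}")
```
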
3
Q

Non-linear methods, relative to least squares, are…?

A
  • Non-linear methods are in general more flexible.
  • Prediction accuracy can improve when the decrease in bias outweighs the increase in variance.
  • The variance is increased.
  • The bias is decreased (see the sketch after this list).
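
A rough sketch of this trade-off, under the assumption that k-nearest-neighbours regression (k = 3) stands in for "a more flexible non-linear method" and that the data are simulated from a non-linear truth:

```python
# Minimal sketch: a flexible non-linear method fits the training data better
# (lower bias) but its test error can suffer from the extra variance.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=200)   # truly non-linear signal
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, model in [("linear", LinearRegression()),
                    ("knn k=3", KNeighborsRegressor(n_neighbors=3))]:
    model.fit(X_tr, y_tr)
    print(f"{name:>8}: train MSE = {mean_squared_error(y_tr, model.predict(X_tr)):.3f}, "
          f"test MSE = {mean_squared_error(y_te, model.predict(X_te)):.3f}")
```
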
4
Q

What happens to the training RSS in Ridge Regression when lambda increases from 0?

A

Steadily increase. As λ increases, more emphasis is placed on the second term, the penalty, which constrains the model coefficients and results in a less flexible model. A less flexible model has a higher training RSS, since it fits the training data less accurately. (The criterion being referred to is written out below.)
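For reference, the "second term, the penalty" is the penalty part of the standard ridge criterion:

```latex
\hat{\beta}^{R}_{\lambda}
  = \arg\min_{\beta}\;
    \underbrace{\sum_{i=1}^{n}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p}\beta_j x_{ij}\Bigr)^{2}}_{\text{RSS}}
  \;+\;
    \underbrace{\lambda \sum_{j=1}^{p}\beta_j^{2}}_{\text{penalty}}
```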

5
Q

What happens to the test RSS in Ridge Regression when lambda increases from 0?

A

Decrease initially, and then eventually start increasing, tracing a U shape. As λ increases, more emphasis is placed on the second term, the penalty, which constrains the model coefficients and results in a less flexible model. Initially the test RSS will likely decrease, since the variance is decreasing; but once λ is large enough, the increase in bias dominates and the test RSS rises again. (A small simulation of this is sketched below.)
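A minimal simulation sketch (data, seed, and alpha grid are all illustrative assumptions; scikit-learn's alpha plays the role of λ). The exact numbers depend on the random draw, but the qualitative pattern should match the card: training RSS rises steadily while test RSS first falls, then rises.

```python
# Minimal sketch: training RSS vs. test RSS as the ridge penalty grows.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
X = rng.normal(size=(80, 30))
beta = rng.normal(size=30)
y = X @ beta + rng.normal(scale=3.0, size=80)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for alpha in [0.01, 1, 10, 100, 1000, 10000]:
    model = Ridge(alpha=alpha).fit(X_tr, y_tr)
    rss_tr = np.sum((y_tr - model.predict(X_tr)) ** 2)
    rss_te = np.sum((y_te - model.predict(X_te)) ** 2)
    print(f"alpha={alpha:>7}: train RSS = {rss_tr:9.1f}, test RSS = {rss_te:9.1f}")
```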

6
Q

What happens to the model flexibility in Ridge Regression when lambda increases from 0?

A

Model flexibility steadily decreases, since the coefficient estimates are shrunk toward zero. (One common way to quantify this is shown below.)
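One standard way to quantify this loss of flexibility (a textbook result, not stated on the card) is the effective degrees of freedom of ridge regression, where the d_j are the singular values of the centered predictor matrix X:

```latex
\mathrm{df}(\lambda) = \sum_{j=1}^{p} \frac{d_j^{2}}{d_j^{2} + \lambda},
\qquad
\mathrm{df}(0) = p,
\qquad
\lim_{\lambda \to \infty} \mathrm{df}(\lambda) = 0
```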

7
Q

What happens to the variance in Ridge Regression when lambda increases from 0?

A

Steadily decrease. Variance decreases as model flexibility decreases, because a flexible model depends heavily on the particular training data it sees. As λ approaches infinity, all the penalized coefficients are driven to zero and the model predicts an essentially constant value, so the variance of the predictions reaches its minimum. (A small simulation of this is sketched below.)
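A hedged simulation sketch (all settings are illustrative assumptions): refit ridge on many fresh training sets and watch the variance of the prediction at one fixed test point fall as the penalty grows.

```python
# Minimal sketch: the variance of a ridge prediction at a fixed test point,
# estimated over many independent training sets, shrinks as alpha (lambda) grows.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)
p = 20
beta = rng.normal(size=p)
x0 = rng.normal(size=(1, p))          # one fixed test point

for alpha in [0.01, 1, 100, 10000]:
    preds = []
    for _ in range(200):              # 200 independent training draws
        X = rng.normal(size=(50, p))
        y = X @ beta + rng.normal(scale=2.0, size=50)
        preds.append(Ridge(alpha=alpha).fit(X, y).predict(x0)[0])
    print(f"alpha={alpha:>7}: prediction variance = {np.var(preds):.3f}")
```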

8
Q

What happens to the bias in Ridge Regression when lambda increases from 0?

A

Steadily increase. Bias increases as model flexibility decreases, because the model fits the training data less accurately. As λ approaches infinity, the model predicts an essentially constant value, where the bias is at its maximum. (The limiting case is written out below.)
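Written out, the limiting case described here (assuming the usual convention that the intercept is left unpenalized and the predictors are centered):

```latex
\lim_{\lambda \to \infty} \hat{\beta}^{R}_{j,\lambda} = 0
\;\; (j = 1, \dots, p)
\quad\Longrightarrow\quad
\hat{f}(x) \to \hat{\beta}_0 = \bar{y}
```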

9
Q

What happens to the irreducible error in Ridge Regression when lambda increases from 0?

A

Remain constant. The irreducible error will not change when we change the model.
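For context, the standard bias-variance decomposition of the expected test error at a point x₀ makes this explicit: the irreducible error Var(ε) enters as a separate term that no choice of λ can touch.

```latex
E\bigl[(y_0 - \hat{f}(x_0))^{2}\bigr]
  = \mathrm{Var}\bigl(\hat{f}(x_0)\bigr)
  + \bigl[\mathrm{Bias}\bigl(\hat{f}(x_0)\bigr)\bigr]^{2}
  + \mathrm{Var}(\varepsilon)
```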
