Chapter 6 - Linear model selection and regularization Flashcards
The lasso, relative to least squares, is…?
- The lasso reduces the number of variables, since some coefficients are set exactly to zero.
- The model is less flexible.
- The model has higher bias and lower variance.
- Prediction accuracy can improve when the decrease in variance outweighs the increase in bias.
- Model flexibility decreases as λ increases: the estimated coefficients shrink toward zero and some become exactly zero, which decreases the variance of the predictions and increases the bias (see the sketch after this list).
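A minimal sketch of this behaviour, assuming scikit-learn and NumPy and purely synthetic data: as the penalty `alpha` (playing the role of the book's λ) grows, more lasso coefficients are set exactly to zero.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only the first three predictors matter; the rest are pure noise.
y = X @ np.array([3.0, -2.0, 1.5, 0, 0, 0, 0, 0, 0, 0]) + rng.normal(size=200)

for alpha in [0.01, 0.1, 1.0, 5.0]:
    coefs = Lasso(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:>5}: non-zero coefficients = {np.sum(coefs != 0)}")
```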
Ridge regression, relative to least squares, is…?
- Ridge regression does not reduce the number of variables; all predictors stay in the model with non-zero coefficients.
- The model is less flexible.
- The model has higher bias and lower variance.
- Prediction accuracy can improve when the decrease in variance outweighs the increase in bias.
- Model flexibility decreases as λ increases: the estimated coefficients shrink toward zero, but all variables keep non-zero coefficients, which decreases the variance of the predictions and increases the bias (see the sketch after this list).
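A minimal sketch, assuming scikit-learn and NumPy and the same kind of synthetic data as above: ridge shrinks the coefficients toward zero as `alpha` (λ) grows, but none of them are set exactly to zero.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = X @ np.array([3.0, -2.0, 1.5, 0, 0, 0, 0, 0, 0, 0]) + rng.normal(size=200)

for alpha in [0.01, 1.0, 100.0, 10000.0]:
    coefs = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"alpha={alpha:>8}: max |coef| = {np.abs(coefs).max():.4f}, "
          f"non-zero coefficients = {np.sum(coefs != 0)}")
```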
Non-linear methods, relative to least squares, are…?
- Non-linear methods are in general more flexible.
- Prediction accuracy can improve when the decrease in bias outweighs the increase in variance.
- The variance is increased.
- The bias is decreased (see the sketch after this list).
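A minimal sketch, assuming scikit-learn and NumPy and a hypothetical sinusoidal data set: a degree-5 polynomial fit is more flexible than plain least squares, so it tracks the non-linear signal more closely on the training data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=(150, 1))
y = np.sin(x).ravel() + rng.normal(scale=0.3, size=150)  # non-linear truth

linear = LinearRegression().fit(x, y)
poly = make_pipeline(PolynomialFeatures(degree=5), LinearRegression()).fit(x, y)

# The more flexible polynomial model fits the training data more closely.
print("training MSE, linear     :", mean_squared_error(y, linear.predict(x)))
print("training MSE, polynomial :", mean_squared_error(y, poly.predict(x)))
```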
What happens to the training RSS in Ridge Regression when lambda increases from 0?
Steadily increase. As λ increases, more emphasis is placed on the second term, the penalty, which shrinks the model coefficients and results in a less flexible model. A less flexible model has a higher training RSS since the training data are not fit as accurately.
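A minimal sketch, assuming scikit-learn and NumPy and synthetic data: the training RSS of a ridge fit grows as the penalty `alpha` (λ) grows.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, -1.0, 0.5, 0.0, 1.5]) + rng.normal(size=100)

for alpha in [0.01, 1, 10, 100, 1000]:
    pred = Ridge(alpha=alpha).fit(X, y).predict(X)  # predictions on the training set
    print(f"alpha={alpha:>7}: training RSS = {np.sum((y - pred) ** 2):.1f}")
```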
What happens to the test RSS in Ridge Regression when lambda increases from 0?
Decrease initially, then eventually start increasing, tracing a U shape. As λ increases, more emphasis is placed on the second term, the penalty, which shrinks the model coefficients and results in a less flexible model. Initially, the test RSS will likely decrease since the variance is decreasing, but once λ is large enough, the increase in bias will make the test RSS increase again.
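A minimal sketch, assuming scikit-learn and NumPy and a deliberately overfitting-prone synthetic setup (few observations, many predictors): the test RSS typically dips for moderate `alpha` (λ) and rises again once the penalty is large; the exact shape depends on the simulated data.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n, p = 60, 30
X = rng.normal(size=(n, p))
y = X @ rng.normal(size=p) + rng.normal(scale=3.0, size=n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for alpha in [0.01, 0.1, 1, 10, 100, 1000]:
    pred = Ridge(alpha=alpha).fit(X_tr, y_tr).predict(X_te)
    print(f"alpha={alpha:>7}: test RSS = {np.sum((y_te - pred) ** 2):.1f}")
```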
What happens to the model flexibility in Ridge Regression when lambda increases from 0?
Steadily decrease. Model flexibility decreases since the coefficients are shrunk toward zero as λ grows.
What happens to the variance in Ridge Regression when lambda increases from 0?
Steadily decrease. The variance decreases as the model becomes less flexible, because the fit depends less heavily on the particular training data. As λ approaches infinity, the coefficients are driven to zero and the model predicts an essentially constant value, so the variance of the predictions is at its minimum.
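A minimal simulation sketch, assuming scikit-learn and NumPy: refit ridge on many fresh training sets drawn from the same process and measure the variance of the prediction at one fixed test point; the variance shrinks as `alpha` (λ) grows.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
beta = np.array([2.0, -1.0, 0.5])      # true coefficients
x_test = np.array([[1.0, 1.0, 1.0]])   # fixed test point

for alpha in [0.1, 1, 10, 100]:
    preds = []
    for _ in range(300):  # 300 independent training sets
        X = rng.normal(size=(40, 3))
        y = X @ beta + rng.normal(size=40)
        preds.append(Ridge(alpha=alpha).fit(X, y).predict(x_test)[0])
    print(f"alpha={alpha:>5}: variance of prediction = {np.var(preds):.4f}")
```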
What happens to the bias in Ridge Regression when lambda increases from 0?
Steadily increase. The bias increases as the model becomes less flexible and fits the underlying relationship less accurately. As λ approaches infinity, the model predicts an essentially constant value and the bias is at its maximum.
What happens to the irreducible error in Ridge Regression when lambda increases from 0?
Remain constant. The irreducible error is a property of the data itself (the noise), so it does not change when we change the model or λ.