MODULE 2 S2.2 Flashcards
Linear Regression, Ridge and Lasso
An additional constraint that each feature should have as little effect on the outcome as possible (which translates to having a small slope), while still predicting well.
Regularization
The ‘slope’ parameter is also called _______ or coefficients.
weight
Regression where the coefficients (w) are chosen not only so that they can predict well on the training data, but also to fit an additional constraint.
Ridge Regression
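A minimal sketch of that "additional constraint": ridge regression penalizes the squared size of the weights, and with centered data (no intercept) it even has a closed-form solution. This is an illustration in plain numpy, not scikit-learn's implementation.

```python
import numpy as np

# Ridge minimizes ||y - Xw||^2 + alpha * ||w||^2.
# With no intercept, the closed-form solution is:
#   w = (X^T X + alpha * I)^(-1) X^T y
def ridge_fit(X, y, alpha):
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=50)

w_low = ridge_fit(X, y, alpha=0.01)    # almost ordinary least squares
w_high = ridge_fit(X, y, alpha=100.0)  # heavily regularized

# Higher alpha shrinks the coefficients toward zero:
print(np.linalg.norm(w_high) < np.linalg.norm(w_low))  # True
```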
T/F Linear Regression is also known as Ordinal Least Squares.
FALSE (Ordinary)
It is the mean of the squared differences between the predictions and the true values.
Mean Squared Error
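A quick sketch to pin down the definition: MSE averages the squared differences (the plain sum of squares is the related residual sum of squares, RSS). The `mse` helper below is illustrative, not a library function.

```python
# Mean Squared Error: average of squared (true - predicted) differences.
def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Only the last prediction is off, by 2, so MSE = (0 + 0 + 4) / 3.
print(mse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]))  # ≈ 1.3333
```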
It is a linear model for classification problems.
Logistic Regression
Logistic Regression happens by fitting a logistic function, also known as the _______________
sigmoid function
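A minimal sketch of the sigmoid: it squashes any real-valued linear score into the interval (0, 1), which is how logistic regression turns a linear function into a probability.

```python
import math

# Logistic (sigmoid) function: maps any real z to a value in (0, 1).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

print(sigmoid(0))          # 0.5 (the decision boundary)
print(sigmoid(10) > 0.99)  # True: large scores saturate toward 1
```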
An alternative to Ridge for regularizing linear regression.
Lasso
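A sketch of why Lasso differs from Ridge: its L1 penalty can set coefficients exactly to zero (feature selection), whereas Ridge only shrinks them. In the special case of an orthonormal design matrix, the Lasso solution is soft-thresholding of the OLS coefficients; the helper below illustrates that special case only.

```python
import numpy as np

# Soft-thresholding: shrink each coefficient by alpha, zeroing the small ones.
def soft_threshold(w, alpha):
    return np.sign(w) * np.maximum(np.abs(w) - alpha, 0.0)

w_ols = np.array([2.0, -0.3, 0.05])
w_lasso = soft_threshold(w_ols, alpha=0.5)
print(w_lasso)  # the two small coefficients become exactly 0
```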
It is also referred to as w, weights, coefficients.
Slope
T/F In Linear Regression, the final output of the model is a numeric value (numeric predictions).
TRUE
The algorithm used for solving regression problems.
Linear Regression
It makes a prediction using a linear function of the input features.
Linear Model
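A one-line sketch of what "a linear function of the input features" means: the prediction is a weighted sum of the features plus an intercept, ŷ = w·x + b. The `predict` helper is illustrative, not a library API.

```python
# Linear model prediction: weighted sum of features plus intercept (bias).
def predict(x, w, b):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

# 0.5 * 1.0 + (-1.0) * 2.0 + 3.0 = 1.5
print(predict([1.0, 2.0], w=[0.5, -1.0], b=3.0))  # 1.5
```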
T/F In ridge regression, a higher alpha means a more restricted model, so we expect the entries of coef_ to have smaller magnitude for a high value of alpha than for a low value of alpha.
TRUE
T/F When comparing training set and test set scores, we find that we predict very accurately on the training set, but the R2 on the test set is much worse. This is a sign of underfitting.
FALSE (accurate on the training set but much worse on the test set is a sign of overfitting, not underfitting)
T/F Ridge regression is a linear regression model that controls complexity to avoid overfitting.
TRUE