Lecture 7 - Linear Regression Flashcards
What is linear regression?
Linear regression is a straightforward approach for predicting a quantitative response Y on the basis of a single predictor variable X.
What is B0 in Y = B0 + B1X?
B0 is the intercept (also called the bias term): the predicted value of Y when X = 0.
What are B0 and B1 referred to as?
They are called the coefficients or parameters of the model.
What can we do once we have these values?
With them we can predict y for new, unseen values of x.
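The prediction step above can be sketched in a few lines. The coefficient values here (b0 = 2.0, b1 = 0.5) are hypothetical, chosen only to illustrate the formula Y = B0 + B1*X.

```python
b0 = 2.0  # intercept (hypothetical value)
b1 = 0.5  # slope (hypothetical value)

def predict(x):
    # Apply the linear model: y = b0 + b1 * x
    return b0 + b1 * x

print(predict(10))  # 2.0 + 0.5 * 10 = 7.0
```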
How does linear regression work?
We want to obtain values of the coefficients so that the linear model “fits the data well”. This is the line that follows the shape of the training data.
What two-step process do we use in linear regression?
1. Define "closeness" (how well a candidate line fits the training data).
2. Define a search procedure that finds the best-fitting line.
How do we define “close”?
By using the least squares criterion: we choose the coefficients that make the sum of squared residuals as small as possible.
What is the residual (in linear regression)?
It is the difference between the observed value and what the current model predicts: e_i = y_i - yhat_i.
What is the residual sum of squares (RSS)?
It is the sum of the squared residuals: RSS = e_1^2 + e_2^2 + ... + e_n^2.
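The residuals and RSS can be computed directly from their definitions. The data points and coefficient values below are hypothetical, used only to make the arithmetic concrete.

```python
xs = [1.0, 2.0, 3.0]   # hypothetical inputs
ys = [2.0, 2.5, 4.0]   # hypothetical observed responses
b0, b1 = 1.0, 1.0      # hypothetical coefficients

# Residual for each point: observed minus predicted.
residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

# RSS: sum of the squared residuals.
rss = sum(e * e for e in residuals)
print(rss)  # predictions are 2, 3, 4 -> residuals 0, -0.5, 0 -> RSS = 0.25
```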
What is the closest match?
It is the line whose coefficients give the minimum value of RSS.
What are we trying to find with the RSS?
Its lowest possible value over all choices of B0 and B1.
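For simple linear regression, the RSS-minimizing coefficients have a closed-form least-squares solution: b1 = sum((x - xbar)(y - ybar)) / sum((x - xbar)^2) and b0 = ybar - b1 * xbar. A minimal sketch with hypothetical, exactly linear data (y = 1 + 2x):

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # y = 1 + 2x, hypothetical data

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# Slope: covariance-like numerator over variance-like denominator.
b1 = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
      / sum((x - xbar) ** 2 for x in xs))
# Intercept: forces the line through the point (xbar, ybar).
b0 = ybar - b1 * xbar

print(b0, b1)  # recovers b0 = 1.0, b1 = 2.0 for this data
```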
What role does the RSS play in optimization?
It serves as the "cost" (loss) function that we minimize.
How do we find the gradient?
By calculating the partial derivatives of the cost (RSS) with respect to each coefficient.
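Differentiating RSS with respect to each coefficient gives dRSS/db0 = -2 * sum(e_i) and dRSS/db1 = -2 * sum(e_i * x_i). A minimal sketch (the function name and data are illustrative, not from the lecture):

```python
def gradients(b0, b1, xs, ys):
    # Residuals under the current coefficients.
    residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    # Partial derivatives of RSS with respect to b0 and b1.
    d_b0 = -2 * sum(residuals)
    d_b1 = -2 * sum(e * x for e, x in zip(residuals, xs))
    return d_b0, d_b1

# Hypothetical check: with b0 = b1 = 0, residuals equal the observed ys.
print(gradients(0.0, 0.0, [1.0, 2.0], [2.0, 4.0]))  # (-12.0, -20.0)
```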
Is the cost function minimized by gradient descent convex or concave?
It is convex, so gradient descent can find its global minimum. (The resulting line may still fit the data imperfectly, but it is the best the model can do.)
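Putting the pieces together, a gradient descent loop repeatedly steps each coefficient opposite its partial derivative. This is a sketch under assumed settings: the data, learning rate, and iteration count are all hypothetical.

```python
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 5.0, 7.0, 9.0]  # hypothetical data generated by y = 1 + 2x

b0, b1 = 0.0, 0.0  # start from arbitrary coefficients
lr = 0.01          # hypothetical learning rate

for _ in range(5000):
    residuals = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    # Step opposite the RSS gradient (dRSS/db0 = -2*sum(e), dRSS/db1 = -2*sum(e*x)).
    b0 += lr * 2 * sum(residuals)
    b1 += lr * 2 * sum(e * x for e, x in zip(residuals, xs))

print(round(b0, 3), round(b1, 3))  # approaches b0 ≈ 1, b1 ≈ 2
```

Because the RSS surface is convex, this loop converges to the same minimum the closed-form least-squares solution would give, provided the learning rate is small enough.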