Linear Regression and Gradient Descent Flashcards

1
Q

What is a linear regression model in machine learning?

A

A model that attempts to fit a line that best describes a set of data.

It is based on supervised learning.

You could say:
Linear regression models describe the relationship between one or more predictor variables and one outcome variable.

e.g. the variable work experience (x) can predict the variable salary (y).

2
Q

What are linear regression models used for?

A

They are mostly used for finding relationships between variables and for forecasting.

e.g. a linear regression model can predict the value of a dependent variable, salary (y), from a given independent variable, experience (x).

3
Q

What are the two methods we learned for optimizing linear regression?

A

1: Minimizing a least-squares cost, typically the Root-Mean-Square Error (RMSE).
2: The Normal Equation (a closed-form solution).

4
Q

How does least-squares fitting (minimizing RMSE) work?

A

It finds the best-fitting line that describes a data set.

It does this by calculating the sum of the squared vertical distances (residuals) between the data points and a candidate line, then repeatedly adjusts the line to minimize that sum.
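
As a rough sketch (the experience/salary numbers and the candidate lines below are invented for illustration), the error of a candidate line y = a*x + b could be scored like this:

```python
import numpy as np

# Toy data: years of experience (x) vs. salary in thousands (y).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([35.0, 42.0, 50.0, 55.0, 63.0])

def rmse(a, b):
    """Root-mean-square error of the candidate line y = a*x + b."""
    residuals = y - (a * x + b)      # vertical distances to the line
    return np.sqrt(np.mean(residuals ** 2))

# The better-fitting line has the lower RMSE.
print(rmse(7.0, 28.0))
print(rmse(2.0, 50.0))
```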

5
Q

How does the normal equation work?

A

It computes the parameters of the best-fitting line directly, in closed form, rather than iteratively: theta = (X^T X)^(-1) X^T y, where X is the matrix of inputs (plus a column of ones for the intercept) and y is the vector of targets.
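
A minimal NumPy sketch, reusing the same kind of invented toy data as above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([35.0, 42.0, 50.0, 55.0, 63.0])

# Design matrix with a column of ones so the intercept is learned too.
X = np.column_stack([np.ones_like(x), x])

# Normal equation: theta = (X^T X)^(-1) X^T y.
# solve() is preferred over explicitly inverting X^T X for stability.
theta = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = theta
print(intercept, slope)
```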

6
Q

What are the advantages and disadvantages of the normal equation?

A

Advantage: The result of the normal equation is exact; no iterative approximation is needed.

Disadvantage: The normal equation is computationally expensive. Solving (X^T X) costs roughly O(n^3) in the number of features, so it does not scale well to large data sets.

7
Q

Is linear regression supervised or unsupervised learning?

A

We train the model with examples of the variable we are trying to predict, so it is supervised learning.
For example, we have both salary (y) and work experience (x), and we train our model on these pairs in order to predict future salaries from x.

For it to be unsupervised learning, we would only have the input, work experience (x), with no salary labels, and the algorithm would have to find structure in the data on its own.

8
Q

To find the best-fitting line, what parameters can we change?

What do we use to find the best-fitting line?

A

Change the intercept (b) and the slope (a) of the line y = ax + b.

We use a cost function to calculate the error of each candidate line.

9
Q

What is the cost function for linear regression?

A

A common cost function for linear regression is the Mean Squared Error (MSE), or its square root, the Root-Mean-Square Error (RMSE).

The model parameters are updated with gradient descent to minimize this cost.
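
A minimal gradient-descent sketch on the MSE cost (the toy data, learning rate, and iteration count are all invented for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([35.0, 42.0, 50.0, 55.0, 63.0])

a, b = 0.0, 0.0         # start from an arbitrary line
learning_rate = 0.02    # hand-picked for this toy data

for _ in range(5000):
    error = (a * x + b) - y            # signed residuals
    grad_a = 2 * np.mean(error * x)    # dMSE/da
    grad_b = 2 * np.mean(error)        # dMSE/db
    a -= learning_rate * grad_a        # step against the gradient
    b -= learning_rate * grad_b

print(a, b)  # should be close to the normal-equation solution
```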

10
Q

What is the cost function describing?

A

The cost function describes the error between predicted values and actual (expected) values.

Essentially, it sums up the squared errors of the observations: MSE = (1/n) * sum_i (y_i - y_hat_i)^2.

11
Q

What do gradient descent and backpropagation do?

A

Backpropagation computes the gradient of the cost function with respect to the weights; gradient descent then uses those gradients to update the weights. Gradient descent is one of several optimization methods that can use the gradients backpropagation produces.

12
Q

Why do we square the errors in RMSE?

A

Because residuals of points on one side of the fitting line are negative: summing raw residuals lets positive and negative errors cancel out, making the fit seem better than it is.

In short: squaring prevents negative values from being added to the sum of errors (and also penalizes large errors more heavily).
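
A tiny numeric illustration (the residual values are invented): raw residuals can cancel, while squared residuals cannot:

```python
import numpy as np

# Residuals of a deliberately bad line: two points above it, two below.
residuals = np.array([4.0, -4.0, 3.0, -3.0])

print(residuals.sum())         # 0.0  -> the signed errors cancel out
print((residuals ** 2).sum())  # 50.0 -> squaring exposes the real error
```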

13
Q

What is the elbow method?

A

The elbow method is used (e.g. in k-means clustering) to find the number of clusters beyond which adding one more cluster no longer reduces the error by much. The "elbow" in the plot of error versus number of clusters marks this point of diminishing returns.
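
A minimal sketch using scikit-learn, assuming it is installed (the blob data is invented so the elbow should land near k = 3):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy data: three well-separated blobs of 50 points each.
points = np.vstack([rng.normal(loc=center, scale=0.5, size=(50, 2))
                    for center in ([0, 0], [5, 5], [0, 5])])

# Total within-cluster squared error (inertia) for increasing k.
for k in range(1, 7):
    model = KMeans(n_clusters=k, n_init=10, random_state=0).fit(points)
    print(k, round(model.inertia_, 1))
# Plotting k against inertia, the bend ("elbow") marks diminishing returns.
```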
