Linear Regression Flashcards

1
Q

What is Linear Regression?

A

A supervised learning algorithm used for regression tasks

2
Q

How does Linear Regression differ from KNN?

A

Unlike KNN, it does not need to store the training data points, so the trained model is much smaller

3
Q

What is the model equation for Linear Regression?

A

ŷ = θ1 x1 + θ2 x2 + … + θd xd + θ0
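
A minimal NumPy sketch of this prediction; the function and variable names are illustrative, not part of the card:

```python
import numpy as np

def predict(theta, theta_0, X):
    """Return ŷ = θ1*x1 + ... + θd*xd + θ0 for each row of X."""
    # X has shape (n, d), theta has shape (d,), theta_0 is a scalar.
    return X @ theta + theta_0

# Tiny example with d = 2 features and n = 3 points.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
theta = np.array([0.5, -1.0])
theta_0 = 2.0
print(predict(theta, theta_0, X))  # [ 0.5 -0.5 -1.5]
```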

4
Q

What does θ0 represent in the Linear Regression model?

A

The intercept of the line; it serves as the offset (bias term)

5
Q

What is the goal of Linear Regression?

A

To find a line that fits the data as well as possible

6
Q

What happens to the line in Linear Regression when there are multiple features?

A

With 2 features it becomes a plane; with more than 2 features, it becomes a hyperplane

7
Q

What does the term ε represent in the Linear Regression equation?

A

Measurement error or some other random noise
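
For reference, the full equation including the noise term, in the same notation as the model equation above:

y = θ1 x1 + θ2 x2 + … + θd xd + θ0 + ε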

8
Q

Fill in the blank: The loss function is also known as the _______.

A

[objective function or cost function]

9
Q

What does the loss function return?

A

A numerical value representing how well the model fits the dataset

10
Q

What is the formula for Linear Regression Loss Function?

A

ℓ(θ, X, y) = 1/(2n) · Σᵢ (ŷᵢ − yᵢ)²
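
A minimal NumPy sketch of this loss, assuming predictions ŷ = Xθ + θ0 as in the model equation (the name linreg_loss is illustrative):

```python
import numpy as np

def linreg_loss(theta, theta_0, X, y):
    """Squared-error loss: 1/(2n) * sum((ŷ - y)^2)."""
    n = len(y)
    y_hat = X @ theta + theta_0            # model predictions ŷ
    return np.sum((y_hat - y) ** 2) / (2 * n)
```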

11
Q

What does a lower value of the loss function indicate?

A

That the model fits the dataset better

12
Q

What is the shape of the Loss Landscape in Linear Regression?

A

Expected to look like a bowl

13
Q

What is the purpose of finding the value of θ in Linear Regression?

A

To minimize the loss function

14
Q

True or False: Gradient Descent tries different parameters until the lowest point is reached.

A

True

15
Q

What is the learning rate in the context of Gradient Descent?

A

It determines how large the update will be
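
A rough sketch of gradient descent for the loss above, with the learning rate alpha scaling each update; the names and the fixed iteration count are illustrative:

```python
import numpy as np

def gradient_descent(X, y, alpha=0.1, n_iters=1000):
    """Minimize the 1/(2n) squared-error loss by repeated parameter updates."""
    n, d = X.shape
    theta, theta_0 = np.zeros(d), 0.0
    for _ in range(n_iters):
        error = X @ theta + theta_0 - y      # (ŷ - y), shape (n,)
        grad_theta = X.T @ error / n         # gradient w.r.t. θ
        grad_theta_0 = error.mean()          # gradient w.r.t. θ0
        theta -= alpha * grad_theta          # step size controlled by alpha
        theta_0 -= alpha * grad_theta_0
    return theta, theta_0
```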

16
Q

What happens if the learning rate α is too small?

A

Convergence may take too long

17
Q

What are the three types of Gradient Descent?

A
  • Gradient Descent (batch)
  • Stochastic Gradient Descent
  • Mini-Batch Gradient Descent
(They differ in how many examples are used for each update; see the sketch below.)
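
A rough sketch of one training epoch under the squared-error loss above; setting batch_size to the full dataset size gives (batch) gradient descent, batch_size=1 gives stochastic gradient descent, and values in between give mini-batch gradient descent (all names are illustrative):

```python
import numpy as np

def run_epoch(theta, theta_0, X, y, alpha=0.1, batch_size=32):
    """One pass over the data, updating on batch_size examples at a time."""
    n = len(y)
    order = np.random.permutation(n)              # shuffle examples each epoch
    for start in range(0, n, batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        error = Xb @ theta + theta_0 - yb         # (ŷ - y) on this batch
        theta = theta - alpha * Xb.T @ error / len(idx)
        theta_0 = theta_0 - alpha * error.mean()
    return theta, theta_0
```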
18
Q

How is the feature standardized in Linear Regression?

A

Subtract the mean of the feature from each value, then divide by the standard deviation of that feature
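
A minimal sketch of this standardization, applied column-wise to a feature matrix:

```python
import numpy as np

def standardize(X):
    """Subtract each feature's mean, then divide by that feature's standard deviation."""
    mean = X.mean(axis=0)    # per-feature mean
    std = X.std(axis=0)      # per-feature standard deviation
    return (X - mean) / std
```

In practice the mean and standard deviation are usually computed on the training data and reused to standardize the test data.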

19
Q

What is Mean Squared Error (MSE)?

A

MSE = 1/n · Σᵢ (ŷᵢ − yᵢ)²
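
The same formula in NumPy:

```python
import numpy as np

def mse(y_hat, y):
    """Mean squared error: 1/n * sum((ŷ - y)^2)."""
    return np.mean((y_hat - y) ** 2)
```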

20
Q

What does the Coefficient of Determination (R²) measure?

A

The proportion of variance in the dependent variable that can be explained by the independent variable(s)
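
A minimal sketch of R² using the standard definition 1 − SS_res / SS_tot (the helper name is illustrative):

```python
import numpy as np

def r_squared(y_hat, y):
    """Proportion of variance in y explained by the predictions ŷ."""
    ss_res = np.sum((y - y_hat) ** 2)        # unexplained (residual) variation
    ss_tot = np.sum((y - y.mean()) ** 2)     # total variation in y
    return 1.0 - ss_res / ss_tot
```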

21
Q

List one advantage of Linear Regression.

A
  • Relatively fast to train
  • Relatively fast to test
22
Q

List one disadvantage of Linear Regression.

A
  • Can only be used for regression tasks
  • Features may sometimes need to be standardized
  • Features may sometimes need to be transformed