Linear Regression Flashcards
What is Linear Regression?
A supervised learning algorithm used for regression tasks
How does Linear Regression differ from KNN?
Unlike KNN, it does not need to store the training data points at prediction time; it keeps only the learned parameters θ, so the model is much smaller
What is the model equation for Linear Regression?
ŷ = θ1 x1 + θ2 x2 + … + θd xd + θ0
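As a minimal sketch of how this prediction is computed, assuming NumPy arrays named theta (the d weights), a scalar theta0 (the intercept), and x (one data point); the names are illustrative:

```python
import numpy as np

def predict(theta, theta0, x):
    """Compute the Linear Regression prediction ŷ = θ1·x1 + ... + θd·xd + θ0."""
    return np.dot(theta, x) + theta0

# Example with 2 features
theta = np.array([0.5, -1.2])     # θ1, θ2
theta0 = 3.0                      # intercept θ0
x = np.array([2.0, 1.0])          # one data point
print(predict(theta, theta0, x))  # 0.5*2 + (-1.2)*1 + 3.0 = 2.8
```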
What does θ0 represent in the Linear Regression model?
The intercept (bias) term; it offsets the line so it does not have to pass through the origin
What is the goal of Linear Regression?
To find parameters θ that produce a line (or hyperplane) fitting the data as closely as possible, i.e., that minimize the loss
What happens to the line in Linear Regression when there are multiple features?
With 2 features it becomes a plane; with more than 2 features, a hyperplane
What does the term ε represent in the Linear Regression equation?
Measurement error or other random noise, i.e., the part of y that the linear model cannot explain
Fill in the blank: The loss function is also known as the _______.
[objective function or cost function]
What does the loss function return?
A numerical value representing how well the model fits the dataset
What is the formula for Linear Regression Loss Function?
𝑙(θ, 𝑋, 𝑦) = (1/2n) Σᵢ (ŷᵢ − 𝑦ᵢ)², where the sum runs over all n training examples
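A minimal NumPy sketch of this loss, assuming X is an n×d array of features and y a length-n vector of targets (names are illustrative):

```python
import numpy as np

def loss(theta, theta0, X, y):
    """1/(2n) times the sum of squared errors over the n training examples."""
    y_hat = X @ theta + theta0              # predictions ŷ for every row of X
    return np.sum((y_hat - y) ** 2) / (2 * len(y))
```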
What does a lower value of the loss function indicate?
That the model fits the dataset better
What is the shape of the Loss Landscape in Linear Regression?
A convex, bowl-shaped surface, since the squared-error loss is quadratic in θ
What is the purpose of finding the value of θ in Linear Regression?
To minimize the loss function
True or False: Gradient Descent tries different parameters until the lowest point is reached.
True; it iteratively updates the parameters in the direction of the negative gradient until it reaches (or gets close to) the minimum of the loss
What is the learning rate in the context of Gradient Descent?
It determines how large the update will be
What happens if the learning rate α is too small?
Convergence may take too long
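As a worked form of the update these cards describe, each parameter moves a step, scaled by the learning rate α, against the gradient of the loss (a standard statement of the rule, written here in generic notation):

```latex
% Gradient-descent update: every parameter theta_j moves opposite its
% partial derivative of the loss, scaled by the learning rate alpha.
\theta_j \leftarrow \theta_j - \alpha \,
  \frac{\partial\, \ell(\theta, X, y)}{\partial \theta_j},
\qquad j = 0, 1, \dots, d
```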
What are the three types of Gradient Descent?
- (Batch) Gradient Descent
- Stochastic Gradient Descent
- Mini-Batch Gradient Descent
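A minimal sketch of the first variant, batch gradient descent, applied to the loss defined above; the function name, learning rate, and iteration count are illustrative choices, not a fixed recipe:

```python
import numpy as np

def batch_gradient_descent(X, y, alpha=0.1, n_iters=1000):
    """Fit Linear Regression by full-batch gradient descent on the 1/(2n) squared-error loss."""
    n, d = X.shape
    theta = np.zeros(d)
    theta0 = 0.0
    for _ in range(n_iters):
        error = X @ theta + theta0 - y       # (ŷ − y) for every example
        theta -= alpha * (X.T @ error) / n   # gradient of the loss w.r.t. θ
        theta0 -= alpha * error.mean()       # gradient of the loss w.r.t. θ0
    return theta, theta0
```

Stochastic and mini-batch gradient descent apply the same update but estimate the gradient from a single example or a small random batch instead of the full dataset.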
How is the feature standardized in Linear Regression?
Subtract the feature's mean from each value, then divide by that feature's standard deviation
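A quick sketch of standardizing every column of a feature matrix X (NumPy, illustrative names):

```python
import numpy as np

def standardize(X):
    """Scale each feature (column) to zero mean and unit standard deviation."""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    return (X - mean) / std
```

In practice the mean and standard deviation are computed on the training set and reused to transform the test set.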
What is Mean Squared Error (MSE)?
(1/n) Σᵢ (ŷᵢ − 𝑦ᵢ)², the average squared error over the n examples
What does the Coefficient of Determination (R²) measure?
The proportion of variance in the dependent variable that can be explained by the independent variable(s)
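A small sketch computing the two metrics above (MSE and R²) from predictions, assuming y_true and y_pred are NumPy arrays (illustrative names):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error: average of the squared residuals."""
    return np.mean((y_pred - y_true) ** 2)

def r_squared(y_true, y_pred):
    """R²: 1 minus the ratio of residual variance to the total variance of y."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot
```

An R² of 1 means the predictions explain all of the variance in y, while 0 means the model does no better than always predicting the mean.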
List one advantage of Linear Regression.
- Relatively fast to train
- Relatively fast to test
List one disadvantage of Linear Regression.
- Can only be used for regression tasks
- Features may sometimes need to be standardized
- Features may sometimes need to be transformed