L3 Flashcards
Optimization for Linear Regression
Calculate the parameters:
1. Analytical method: no iterative training; the closed-form solution (normal equations) is obtained directly.
2. Gradient descent: iteratively steps toward the optimal solution.
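A minimal sketch contrasting the two approaches on toy data (the data and learning rate are illustrative assumptions, not from the source):

```python
import numpy as np

# Toy data: y = 1 + 2x exactly, with a bias column prepended to x.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# 1. Analytical method: solve the normal equations X^T X w = X^T y directly.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# 2. Gradient descent: repeatedly step against the MSE gradient.
w_gd = np.zeros(2)
lr = 0.05  # illustrative step size
for _ in range(5000):
    grad = (2 / len(y)) * X.T @ (X @ w_gd - y)
    w_gd -= lr * grad

print(w_closed)  # [1. 2.]
print(w_gd)      # converges to the same solution
```

Both routes reach the same parameters; the analytical route does it in one linear solve, gradient descent by iteration.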
Simple linear regression
Summarizes and studies the relationship between two variables: one predictor and one response.
Mean Square Error (MSE)
The average of the squared differences between the predicted and actual values.
RMSE
The square root of the Mean Squared Error.
Expresses the average deviation at the same scale as the original data.
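Both error measures can be sketched in a few lines of plain Python (the sample values are made up for illustration):

```python
import math

def mse(actual, predicted):
    """Mean of the squared differences between actual and predicted values."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Square root of MSE -- same units as the original data."""
    return math.sqrt(mse(actual, predicted))

actual = [3.0, 5.0, 7.0]
predicted = [2.0, 5.0, 9.0]
print(mse(actual, predicted))   # (1 + 0 + 4) / 3 ≈ 1.667
print(rmse(actual, predicted))  # ≈ 1.291
```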
Variance
The expectation of the squared deviation of a random variable from its population mean or sample mean.
*One variable
Covariance
When two random variables X and Y are not independent, it is frequently of interest to assess how strongly they are related to one another.
Covariance Relationship
If positive, x and y values tend to rise together.
If small in magnitude, x and y are weakly related; if large, strongly related (note that covariance is scale-dependent, so "small" and "large" depend on the units).
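Variance and covariance can be computed directly from their definitions; this sketch uses population (divide-by-n) versions and made-up data:

```python
def variance(xs):
    """Average squared deviation from the mean (population version)."""
    mean = sum(xs) / len(xs)
    return sum((x - mean) ** 2 for x in xs) / len(xs)

def covariance(xs, ys):
    """Average product of paired deviations from the two means."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]   # ys rises with xs, so covariance is positive
print(variance(xs))         # 1.25
print(covariance(xs, ys))   # 2.5
```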
Simple Linear Regression Function
Slope: b1 = Cov(x, y) / Var(x); intercept: b0 = mean(y) − b1 · mean(x).
Polynomial regression
A special case of linear regression, where the relationship between variables is modeled as a polynomial.
Key Steps for Polynomial Regression
Transform the input data into polynomial features.
Formulate the polynomial model.
Apply the least squares method to find the coefficients.
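The three steps above can be sketched with numpy (noise-free data chosen so the true coefficients are recovered exactly):

```python
import numpy as np

# Data generated from y = 1 + 2x + 3x^2.
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = 1 + 2 * x + 3 * x**2

# Step 1: transform the input into polynomial features [1, x, x^2].
X = np.vander(x, N=3, increasing=True)

# Steps 2-3: formulate the model as linear in those features and
# solve for the coefficients with least squares.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coeffs)  # [1. 2. 3.]
```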
Least Squares is an ____.
Orthogonal Vector Projection.
In linear regression, the residuals should be
orthogonal to the fitted values (the column space of the design matrix).
This is because least squares orthogonally projects the observed y onto the column space of X, so the residual vector is perpendicular to that subspace in n-dimensional space, not to the best-fit line in the 2D scatter plot.
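The orthogonality property can be checked numerically (the data here is illustrative):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 1.9, 3.1, 3.8, 5.3])
X = np.column_stack([np.ones_like(x), x])  # design matrix: bias + x

# Least-squares fit = orthogonal projection of y onto the column space of X.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
residuals = y - X @ w

# The residual vector is orthogonal to every column of X
# (and therefore to the fitted values X @ w).
print(X.T @ residuals)  # ≈ [0. 0.]
```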
Least Square Time complexity
O(n^3) to invert the n×n matrix XᵀX in the normal equations (plus O(mn^2) to form it), where m is the number of samples and n the number of features.
Least Square Space Complexity
O(n^2 + mn): storing the m×n design matrix X and the n×n matrix XᵀX.