L3 Flashcards

1
Q

Optimization for Linear Regression

A

Two ways to calculate the parameters:
1. Analytical method: no iterative training process; the solution is obtained directly (e.g., from the normal equations).
2. Gradient descent: iteratively updates the parameters to approach the optimal solution.
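A minimal sketch of both routes in NumPy, assuming a design matrix X with a bias column and a target vector y (the data, variable names, learning rate, and iteration count are illustrative, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.c_[np.ones(100), rng.normal(size=(100, 1))]      # bias column + one feature
y = X @ np.array([2.0, 3.0]) + rng.normal(scale=0.1, size=100)

# 1. Analytical method: solve the normal equations directly, no training loop.
w_closed = np.linalg.solve(X.T @ X, X.T @ y)

# 2. Gradient descent: repeatedly step against the gradient of the MSE.
w_gd = np.zeros(2)
learning_rate = 0.1
for _ in range(1000):
    grad = (2 / len(y)) * X.T @ (X @ w_gd - y)           # gradient of the MSE loss
    w_gd -= learning_rate * grad

print(w_closed, w_gd)   # both estimates should land near [2, 3]
```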

2
Q

Simple linear regression

A

Summarizes and studies the relationship between two variables: a single input (explanatory) variable and a single output (response) variable.

3
Q

Mean Square Error (MSE)

A

The average of the squared differences between the predicted and actual values.
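As a formula, with y_i the actual values, ŷ_i the predicted values, and n the number of samples:

```latex
\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2
```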

4
Q

RMSE (Root Mean Square Error)

A

The square root of the Mean Squared Error (MSE).

It expresses the average deviation on the same scale as the original data.
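Continuing the notation from the MSE card:

```latex
\mathrm{RMSE} = \sqrt{\mathrm{MSE}} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( y_i - \hat{y}_i \right)^2}
```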

5
Q

Variance

A

The expectation of the squared deviation of a random variable from its population mean or sample mean.

*One variable
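For a sample x_1, ..., x_n with mean x̄, one common (population-style) form is shown below; the unbiased sample version divides by n - 1 instead of n:

```latex
\mathrm{Var}(x) = \frac{1}{n} \sum_{i=1}^{n} \left( x_i - \bar{x} \right)^2
```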

6
Q

Covariance

A

When two random variables X and Y are not independent, it is frequently of interest to assess how strongly they are related to one another.
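One standard sample form of the covariance (the unbiased version divides by n - 1 instead of n):

```latex
\mathrm{Cov}(x, y) = \frac{1}{n} \sum_{i=1}^{n} \left( x_i - \bar{x} \right)\left( y_i - \bar{y} \right)
```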

7
Q

Covariance Relationship

A

If the covariance is positive, x and y tend to rise together; if negative, one tends to fall as the other rises.
If its magnitude is small, x and y are weakly related.
If its magnitude is large, x and y are strongly related.

8
Q

Simple Linear Regression Function

A

Slope = Covariance(x, y) / Variance(x)
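Spelled out for the fitted line ŷ = β_0 + β_1 x, with the intercept following from the sample means:

```latex
\beta_1 = \frac{\mathrm{Cov}(x, y)}{\mathrm{Var}(x)}
        = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2},
\qquad
\beta_0 = \bar{y} - \beta_1 \bar{x}
```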

9
Q

Polynomial regression

A

A special case of linear regression, where the relationship between variables is modeled as a polynomial.
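For a single input x and degree k, the model is still linear in the coefficients β_j:

```latex
\hat{y} = \beta_0 + \beta_1 x + \beta_2 x^2 + \dots + \beta_k x^k
```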

10
Q

Key Steps for Polynomial Regression

A

Transform the input data into polynomial features.
Formulate the polynomial model.
Apply the least squares method to find the coefficients.
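A minimal NumPy sketch of those three steps, assuming a single input variable and an illustrative degree of 3 (the data and names are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=50)
y = 1.0 - 2.0 * x + 0.5 * x**3 + rng.normal(scale=0.1, size=50)

degree = 3

# Step 1: transform the input into polynomial features [1, x, x^2, x^3].
X_poly = np.vander(x, N=degree + 1, increasing=True)

# Step 2: the polynomial model is y ≈ X_poly @ coeffs (linear in the coefficients).
# Step 3: apply least squares to find the coefficients.
coeffs, *_ = np.linalg.lstsq(X_poly, y, rcond=None)

print(coeffs)   # should land near [1, -2, 0, 0.5]
```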

11
Q

Least Squares is an ____.

A

Orthogonal Vector Projection.
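In matrix form, the fitted values are the orthogonal projection of the observation vector y onto the column space of the design matrix X:

```latex
\hat{y} = X \hat{\beta} = X \left( X^{\top} X \right)^{-1} X^{\top} y
```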

12
Q

In linear regression, the residuals should be

A

orthogonal (perpendicular) to the column space of the predictors, i.e., to the vector of fitted values.

This is because least squares orthogonally projects the vector of observations onto the space spanned by the predictors; the residual vector is the component left over, so it is perpendicular to that space.
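Equivalently, the normal equations say the residual vector is orthogonal to every column of the design matrix X:

```latex
X^{\top} \left( y - X \hat{\beta} \right) = 0
```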

13
Q

Least Squares Time Complexity

A

O(n^3)

The cost is dominated by inverting the matrix X^T X in the normal equations, which is cubic in its dimension. With m features and n samples, the full cost works out to O(m^2 n + m^3).

14
Q

Least Squares Space Complexity

A

O(m^2 + mn), where m is the number of features and n is the number of samples: O(mn) to store the design matrix X and O(m^2) to store X^T X.
