Linear Models Flashcards

1
Q

How does Ridge Regression work?

A

Ridge regression (aka regularized least squares) adds a regularization term to the least-squares objective that penalizes large coefficients and thus overly complex models.
The strength of the penalty is set by the regularization parameter lambda, which can be chosen with cross-validation.
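
A minimal sketch of this in scikit-learn, which calls the regularization parameter alpha rather than lambda (the data below is synthetic and the alpha grid is arbitrary):

import numpy as np
from sklearn.linear_model import RidgeCV

# Synthetic data: 100 samples, 10 features, linear target plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
w_true = rng.normal(size=10)
y = X @ w_true + 0.1 * rng.normal(size=100)

# RidgeCV picks the regularization strength (alpha = lambda) by cross-validation.
model = RidgeCV(alphas=[0.01, 0.1, 1.0, 10.0], cv=5).fit(X, y)
print("chosen lambda:", model.alpha_)
print("coefficients:", model.coef_)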

2
Q

How complicated are the functions that ridge regression can learn?

A

Only linear functions of the input features: the prediction is a weighted sum of the feature values plus an intercept. More complex relationships can be captured only by first engineering non-linear features (e.g. polynomial terms) and fitting the ridge model on those.
3
Q

What does the objective function of ridge regression measure?

A

(The objective function is the quantity the model tries to optimize.) The ridge objective measures two things: how well the model fits the training data (the sum of squared errors) and how large the coefficients are (the squared norm of the weight vector, weighted by lambda). Ridge is a shrinkage method: the penalty shrinks the parameters toward zero, but unlike lasso it does not set them exactly to zero.
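
Written out, using the notation of the linear-model card (a standard formulation, not quoted from the course):

J(w, b) = \sum_{i=1}^{n} \big( y_i - (w^\top x_i + b) \big)^2 + \lambda \sum_{j=1}^{d} w_j^2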

4
Q

How do you control model complexity in Ridge Regression?

A

With the hyperparameter lambda: a larger lambda penalizes large coefficients more strongly and gives a simpler, more heavily shrunk model, while lambda = 0 recovers ordinary least squares.
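
A small sketch of this effect with scikit-learn's Ridge (synthetic data; scikit-learn names the parameter alpha):

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X @ np.array([3.0, -2.0, 0.5, 0.0, 1.0]) + 0.1 * rng.normal(size=50)

# Larger lambda (alpha) shrinks the coefficient vector toward zero.
for alpha in [0.1, 1.0, 100.0]:
    coef = Ridge(alpha=alpha).fit(X, y).coef_
    print(f"lambda={alpha:>6}: ||w|| = {np.linalg.norm(coef):.3f}")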

5
Q

How does a linear model work, and what does it mean if a coefficient in the learned w-vector is large, small, or zero?

A

Each coefficient multiplies its feature value and the results are summed, together with the intercept, to give the prediction. A zero coefficient means the feature has no influence on the prediction; a positive coefficient means increasing the feature increases the prediction, a negative one decreases it. The magnitude indicates how strong the effect is, but it depends on the feature's scale, so coefficients are not correlations and are only directly comparable when the features are standardized.

6
Q

What kind of algorithm do you need for training ridge regression?

A

No sophisticated optimizer is needed: the ridge objective has a closed-form solution, w = (XᵀX + λI)⁻¹Xᵀy, so training amounts to solving a linear system. For very large problems, iterative solvers such as gradient descent or conjugate gradient can be used instead.
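
A minimal NumPy sketch of that closed-form solution (synthetic data; the intercept is handled by centering, which is one common convention):

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = X @ np.array([2.0, -1.0, 0.0, 0.5, 3.0]) + 0.1 * rng.normal(size=100)

lam = 1.0

# Center X and y so the intercept can be recovered separately.
X_mean, y_mean = X.mean(axis=0), y.mean()
Xc, yc = X - X_mean, y - y_mean

# Closed-form ridge solution: w = (X^T X + lambda I)^(-1) X^T y
w = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ yc)
b = y_mean - X_mean @ w
print("w:", w, "b:", b)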
7
Q

Name other linear regression/classification methods.

A

Lasso regression (good for feature selection)
Elastic net
Support vector regression

8
Q

What’s a linear model?

A

The linear model, a simple yet popular choice:
f(x) = w1·x1 + … + wd·xd + b
▶ x1, …, xd: feature values
▶ w1, …, wd: model coefficients
▶ b ∈ R: intercept term
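
A tiny illustration of that formula in NumPy (the numbers are made up for the example):

import numpy as np

w = np.array([2.0, -0.5, 0.0])   # model coefficients w1..wd
b = 1.0                          # intercept term
x = np.array([3.0, 4.0, 100.0])  # feature values x1..xd

# f(x) = w1*x1 + ... + wd*xd + b
print(w @ x + b)  # 2*3 - 0.5*4 + 0*100 + 1 = 5.0 (the zero coefficient ignores x3)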

9
Q

How do you find the regression line in the single-feature case?

A

You find the line (or, more generally, the function) whose parameters minimize the sum of squared errors in the y direction, i.e. the mean squared error of the predictions.
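
For a single feature this can be written out explicitly (a standard textbook formulation, not quoted from the course):

\min_{w, b} \sum_{i=1}^{n} \big( y_i - (w x_i + b) \big)^2, \qquad w = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}, \qquad b = \bar{y} - w \bar{x}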

10
Q

What is the downside of the classical least squares method (Gaussian) and what is the solution?

A

Ordinary least squares works fine with few features, but it is prone to overfitting in high dimensions, and the squared loss makes it sensitive to outliers. Solution: add regularization, i.e. ridge regression.
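
A rough sketch of the overfitting point, comparing plain least squares with ridge when there are more features than training samples (synthetic data, arbitrary settings):

import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split

# 100 features but only ~40 training samples: least squares can fit the noise exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 100))
w_true = np.zeros(100)
w_true[:5] = [3.0, -2.0, 1.5, 0.5, -1.0]   # only a few features actually matter
y = X @ w_true + rng.normal(scale=0.5, size=80)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

for name, model in [("least squares", LinearRegression()), ("ridge", Ridge(alpha=10.0))]:
    model.fit(X_tr, y_tr)
    print(f"{name}: train R^2 = {model.score(X_tr, y_tr):.2f}, "
          f"test R^2 = {model.score(X_te, y_te):.2f}")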
