Data Analysis week 7 Flashcards

1
Q

How is the residual e_i defined

A

The difference between the data point y_i and the value predicted by the model, so y_i = alpha * x_i + beta + e_i, i.e. e_i = y_i - (alpha * x_i + beta).
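
A minimal sketch in Python of computing the residuals, assuming made-up example data x, y and already-fitted values for alpha and beta:

    # Hypothetical data and fitted line y_hat = alpha * x + beta.
    x = [1.0, 2.0, 3.0, 4.0]
    y = [2.1, 3.9, 6.2, 8.1]
    alpha, beta = 2.0, 0.0

    # One residual per data point: e_i = y_i - (alpha * x_i + beta).
    residuals = [y_i - (alpha * x_i + beta) for x_i, y_i in zip(x, y)]
    print(residuals)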

2
Q

What is the residual sum of squares

A

The residual sum of squares (RSS) is e_1^2 + e_2^2 + … + e_n^2. It measures how badly the model explains the data and serves as the loss function of least-squares regression.
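
A minimal sketch of computing the RSS from a list of made-up residuals:

    residuals = [0.1, -0.1, 0.2, 0.1]        # hypothetical e_1, ..., e_n
    rss = sum(e ** 2 for e in residuals)     # RSS = e_1^2 + e_2^2 + ... + e_n^2
    print(rss)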

3
Q

What is the relation between alpha, beta, and the RSS

A

Computing alpha and beta from the covariance (alpha = cov(x, y) / var(x), beta = mean(y) - alpha * mean(x)) gives the regression model with the smallest possible RSS.
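
A minimal sketch of these covariance-based estimates (the data are made-up example values):

    # alpha = cov(x, y) / var(x), beta = mean(y) - alpha * mean(x)
    x = [1.0, 2.0, 3.0, 4.0]
    y = [2.1, 3.9, 6.2, 8.1]

    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    cov_xy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y)) / n
    var_x = sum((xi - mean_x) ** 2 for xi in x) / n

    alpha = cov_xy / var_x            # slope
    beta = mean_y - alpha * mean_x    # intercept
    print(alpha, beta)                # this pair minimizes the RSS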

4
Q

What is a loss function

A

A function that measures how badly a model explains the data; fitting the model means choosing the parameters that minimize this loss.

5
Q

What is another example of a loss function

A

The mean absolute error: the average of the absolute residuals, (|e_1| + |e_2| + … + |e_n|) / n.
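
A minimal sketch of the mean absolute error as a loss function (made-up example values):

    def mean_absolute_error(y, y_pred):
        # MAE = (|e_1| + |e_2| + ... + |e_n|) / n
        return sum(abs(yi - pi) for yi, pi in zip(y, y_pred)) / len(y)

    print(mean_absolute_error([2.1, 3.9, 6.2], [2.0, 4.0, 6.0]))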

6
Q

What is so special about RSS as a loss function

A

The values of alpha and beta that minimize the RSS can be derived mathematically in closed form, whereas for other loss functions you need a computer to find the minimum numerically. In addition, RSS can be motivated by viewing linear regression as a statistical model.
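
A sketch contrasting the two approaches on made-up data: the RSS minimizer comes from the closed-form covariance formulas, while the mean-absolute-error minimizer is found numerically here with a crude grid search (standing in for "you need a computer"):

    x = [1.0, 2.0, 3.0, 4.0]
    y = [2.1, 3.9, 6.2, 8.1]
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n

    # Closed form for RSS: alpha = cov(x, y) / var(x), beta = mean(y) - alpha * mean(x).
    alpha_rss = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
                 / sum((xi - mean_x) ** 2 for xi in x))
    beta_rss = mean_y - alpha_rss * mean_x

    # No closed form used for MAE: search a grid of candidate (alpha, beta) pairs.
    def mae(a, b):
        return sum(abs(yi - (a * xi + b)) for xi, yi in zip(x, y)) / n

    candidates = [(a / 100, b / 100) for a in range(150, 251) for b in range(-100, 101)]
    alpha_mae, beta_mae = min(candidates, key=lambda ab: mae(*ab))

    print("RSS fit:", alpha_rss, beta_rss)
    print("MAE fit:", alpha_mae, beta_mae)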

7
Q

What other role can the normal distribution play

A

The normal distribution can be used as an approximation of the distribution of the residuals: the residuals are modelled as draws from a normal distribution centred at zero.
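
A small sketch of this idea with made-up residuals: fit a normal distribution to the residuals by estimating its mean and standard deviation, then use its density:

    import math

    residuals = [0.1, -0.2, 0.3, -0.1, 0.05, -0.15]   # hypothetical residuals

    n = len(residuals)
    mu = sum(residuals) / n                            # close to 0 for a least-squares fit
    sigma = math.sqrt(sum((e - mu) ** 2 for e in residuals) / n)

    def normal_pdf(e):
        # Density of the fitted normal distribution at residual value e.
        return math.exp(-((e - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

    print(mu, sigma, normal_pdf(0.0))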

8
Q

What do likelihood functions tell you

A

Likelihood functions tell you how well your model explains the observed data: the likelihood is high when the observed data are probable under the model.
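
A sketch of evaluating a (log-)likelihood for linear regression on made-up data, assuming normally distributed residuals as in the previous card; a model close to the data gets a higher value than a poor one:

    import math

    x = [1.0, 2.0, 3.0, 4.0]
    y = [2.1, 3.9, 6.2, 8.1]

    def log_likelihood(alpha, beta, sigma):
        # Log-likelihood under y_i = alpha * x_i + beta + e_i,
        # with e_i assumed normal with mean 0 and standard deviation sigma.
        ll = 0.0
        for xi, yi in zip(x, y):
            e = yi - (alpha * xi + beta)
            ll += -0.5 * math.log(2 * math.pi * sigma ** 2) - e ** 2 / (2 * sigma ** 2)
        return ll

    print(log_likelihood(2.0, 0.0, 0.2))   # model close to the data: high log-likelihood
    print(log_likelihood(0.0, 5.0, 0.2))   # poor model: much lower log-likelihood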
