Lecture 40 - Multiple Linear Regression 2 Flashcards

1
Q

What is this lecture largely about?

A

Assessing whether our multiple linear regression model is any good

2
Q

What does adding in variables do?

A

Adding in variables changes the parameter estimates, and therefore changes the model
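A minimal R sketch of this, assuming a hypothetical data frame dat with response y and predictors x1 and x2:

fit1 <- lm(y ~ x1, data = dat)        # single-predictor model
fit2 <- lm(y ~ x1 + x2, data = dat)   # add a second predictor
coef(fit1)                            # estimate for x1 on its own
coef(fit2)                            # the x1 estimate generally changes once x2 is in the model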

3
Q

Do the hypothesis test on slide 766…

A

Answers on slide

4
Q

Read the R output on slide 767 and determine which variables in the multiple linear regression are valuable and which should be removed

A

Answers in slides/notes
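As a general sketch of reading such output (a hypothetical model and data, not the slide's), the Pr(>|t|) column of summary() gives each coefficient's p-value:

fit <- lm(y ~ x1 + x2 + x3, data = dat)   # hypothetical model
summary(fit)   # predictors with large Pr(>|t|) (e.g. above 0.05) are candidates for removal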

5
Q

Which t-distribution do the coefficient hypothesis tests in multiple linear regression follow?

A

A t-distribution with n - k - 1 degrees of freedom, where n is the number of observations and k is the number of predictor variables
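For instance, with a hypothetical n = 30 observations and k = 3 predictors the degrees of freedom are 30 - 3 - 1 = 26, and a coefficient's two-sided p-value can be computed from its t-statistic:

t_stat  <- estimate / std_error   # hypothetical values read from the R output
p_value <- 2 * pt(abs(t_stat), df = 30 - 3 - 1, lower.tail = FALSE)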

6
Q

Is it fine to leave in variables that aren’t significant to the model?

A

No, they can be detrimental as they create extra noise

7
Q

When you remove a variable, what do you have to make sure you do?

A

You need to refit the model to find the optimal parameter estimates
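A sketch of this in R, assuming a hypothetical model with predictors x1, x2, x3 where x3 was judged not significant:

fit  <- lm(y ~ x1 + x2 + x3, data = dat)
fit2 <- update(fit, . ~ . - x3)   # drop x3 and refit; the remaining estimates are recomputed
summary(fit2)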

8
Q

Read the output on slide 776 and calculate a 95% confidence interval…

A

Answers on slide
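In general terms (hypothetical names, not the slide's numbers), the interval is estimate ± t(0.975, n-k-1) × SE, which R can also give directly:

confint(fit, level = 0.95)                                    # hypothetical fitted model `fit`
estimate + c(-1, 1) * qt(0.975, df = n - k - 1) * std_error   # the same interval by hand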

9
Q

What assumptions does multiple linear regression need to follow?

A

Linearity: There is a straight-line relationship between µY and xj when all other predictor variables are held constant.

Independence: The responses Y1, Y2, ..., Yn are statistically independent.

Normality: The error terms e1, e2, ..., en come from a normal distribution.

Equal variance: The error terms all have the same variance, σ² ('homoscedastic').

Plotting residuals against fitted values is useful for checking these assumptions.
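A sketch of that check, assuming a hypothetical fitted model fit:

plot(fitted(fit), resid(fit),
     xlab = "Fitted values", ylab = "Residuals")   # want random scatter about zero
abline(h = 0, lty = 2)
# a fan shape suggests unequal variance; curvature suggests non-linearity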
