Topic 7: Multiple Linear Regression Flashcards

1
Q

What extra assumptions are introduced in multiple variable regression?

A
  • No exact collinearity between X variables
  • No specification bias
2
Q

What does an estimate in multiple linear regression mean?

A

The expected change in Y for a one-unit change in Xi, holding all other regressors constant (a partial effect)
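
As a sketch of this "holding the others constant" idea (pure Python, made-up data built so that y = 1 + 2*x1 - 0.5*x2 exactly), OLS via the normal equations recovers each partial slope:

```python
# Recovering partial effects by OLS on a tiny synthetic data set.
# The coefficient on x1 is the change in y per unit x1 with x2 held fixed.

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system.
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

x1 = [0, 1, 2, 3, 4, 5]
x2 = [1, 0, 2, 1, 3, 2]
y = [1 + 2 * a - 0.5 * b for a, b in zip(x1, x2)]  # exact linear data

X = [[1.0, a, b] for a, b in zip(x1, x2)]  # design matrix with intercept
XtX = [[sum(X[i][r] * X[i][c] for i in range(len(X))) for c in range(3)] for r in range(3)]
Xty = [sum(X[i][r] * y[i] for i in range(len(X))) for r in range(3)]
beta = solve3(XtX, Xty)
print([round(v, 6) for v in beta])  # -> [1.0, 2.0, -0.5]
```

Because the data contain no noise, the fitted coefficients match the true partial effects exactly (up to float roundoff).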

3
Q

How does the correlation between regressors affect the error of the estimates?

A

The greater the correlation between regressors, the larger the standard errors (variances) of the coefficient estimates
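
A quick sketch of why: with two regressors whose sample correlation is r, the variance of each slope estimate is inflated by the factor 1/(1 - r^2), the variance inflation factor, which blows up as |r| approaches 1:

```python
# Variance inflation factor for two regressors with correlation r,
# relative to the uncorrelated (r = 0) case.

def vif(r):
    return 1.0 / (1.0 - r ** 2)

for r in (0.0, 0.5, 0.9, 0.99):
    print(f"r = {r:4.2f}  inflation = {vif(r):8.2f}")
```

At r = 0.99 the slope variance is roughly 50 times what it would be with uncorrelated regressors, which is why near-collinear estimates are so imprecise.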

4
Q

Why do we use Adjusted R^2?

A

Because ordinary R^2 never decreases when regressors are added, even junk ones. Adjusted R^2 penalises the model for the number of regressors
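
A minimal sketch of the junk-regressor effect, using the standard formula adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k), with n observations and k estimated parameters including the intercept (the R^2 values below are made up for illustration):

```python
# Adjusted R^2 penalises R^2 for the number of estimated parameters.

def adjusted_r2(r2, n, k):
    # n = sample size, k = parameters estimated (including intercept).
    return 1.0 - (1.0 - r2) * (n - 1) / (n - k)

base = adjusted_r2(0.800, 30, 3)   # two real regressors + intercept
junk = adjusted_r2(0.802, 30, 4)   # one junk regressor nudges R^2 up
print(base, junk)                  # adjusted R^2 falls despite higher R^2
```

Here the junk regressor raises R^2 from 0.800 to 0.802, but the adjusted value drops, signalling that the extra variable is not worth its degree of freedom.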

5
Q

When can we compare R^2 values?

A
  • When the sample size is the same
  • When the dependent variable is the same
6
Q

Give the formula for adjusted R^2

A

Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k), where n is the sample size and k is the number of estimated parameters (including the intercept)
7
Q

Does R^2 have any intrinsic properties that might favour its use over other calculations?

A

No; its use is largely conventional, and nothing intrinsically favours it over other goodness-of-fit measures

8
Q

What is the Gross/Simple correlation coefficient?

A

Written r12, where subscript 1 denotes Y and subscripts i > 1 denote the Xi. It is the plain pairwise correlation between two variables, with nothing held constant
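
A minimal pure-Python sketch of this pairwise (Pearson) correlation, with made-up data:

```python
import math

# Simple (gross) correlation between two series, e.g. r12 between Y and X2,
# with no other variables controlled for.

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(corr([1, 2, 3, 4], [2, 4, 6, 8]))   # perfectly linear -> 1.0
```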

9
Q

What is the partial correlation coefficient?

A

The correlation between two variables after removing the influence of other variables. Written r12.34, where the effects of variables 3 and 4 have been eliminated
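
A sketch using the standard first-order formula r12.3 = (r12 - r13*r23) / sqrt((1 - r13^2)(1 - r23^2)), which removes the influence of variable 3; the input correlations below are made up:

```python
import math

# First-order partial correlation from the three pairwise correlations.

def partial_corr(r12, r13, r23):
    return (r12 - r13 * r23) / math.sqrt((1 - r13 ** 2) * (1 - r23 ** 2))

# If all of r12 is transmitted through variable 3 (r12 = r13 * r23),
# the partial correlation vanishes.
print(partial_corr(0.35, 0.7, 0.5))   # -> 0.0
```

This makes the card's point concrete: two variables can look correlated purely because both track a third variable.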
