Multicollinearity Flashcards

1
Q

High levels of multicollinearity make each of the following
unreliable except:
a. Standard Errors
b. T-statistics
c. Point Estimate
d. P-values

A

point estimate

2
Q

Multicollinearity occurs when:
a. An independent variable in a multiple regression model can
be linearly predicted by the dependent variable.
b. An independent variable in a multiple regression model can be
linearly predicted by another independent variable.
c. An independent variable in a multiple regression cannot be
linearly predicted by another dependent variable.
d. An independent variable in a multiple regression model
cannot be linearly predicted by another independent variable.

A

b

3
Q

Multicollinearity is suspected when ________.
a. there is a low R² coupled with significant explanatory variables
b. there is a high R² coupled with significant explanatory variables
c. there is a low R² coupled with insignificant explanatory variables
d. there is a high R² coupled with insignificant explanatory variables

A

d

4
Q

When confronted with multicollinearity, a good remedy is to
________ if we can justify its redundancy.
a. add one more collinear variable
b. drop one of the collinear variables
c. remove both the collinear variables
d. add as many collinear variables as possible

A

b

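The drop-one remedy above can be sketched in a few lines of Python, assuming we already have pairwise correlations in hand (the variable names and the 0.9 cutoff are illustrative, not from the cards):

```python
# Pairwise correlations among candidate predictors (illustrative numbers).
corr = {
    ("income", "wealth"): 0.95,
    ("income", "age"): 0.30,
    ("wealth", "age"): 0.25,
}

CUTOFF = 0.9  # rule-of-thumb threshold; an assumption, not a universal rule

# Flag collinear pairs, then drop ONE member of each pair (not both),
# keeping whichever variable the theory says is less redundant.
collinear_pairs = [pair for pair, r in corr.items() if abs(r) > CUTOFF]
print(collinear_pairs)  # [('income', 'wealth')]
```

Here only one of `income` and `wealth` would be dropped, and only if its redundancy can be justified on substantive grounds.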
5
Q

The term multicollinearity refers to the condition when the variance
of the error term, given X1, X2, …, Xn, is the same for all
observations.
a. True
b. False

A

b

6
Q

An assumption of the classical regression model is that there is
no multicollinearity.
a. True
b. False

A

b

7
Q

You calculate the VIF for two variables, where R²x1x2 = 0.81.
Do we have multicollinearity?
a. Yes because the VIF > 5
b. Yes, because the VIF > 1
c. No, because the VIF < 5
d. No, because the VIF < 1

A

a

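The answer follows from the two-variable formula VIF = 1 / (1 − R²). A minimal Python sketch (the function name is my own):

```python
def vif(r_squared: float) -> float:
    """Two-variable Variance Inflation Factor: VIF = 1 / (1 - R^2),
    where R^2 comes from regressing one predictor on the other."""
    return 1.0 / (1.0 - r_squared)

print(round(vif(0.81), 2))  # 5.26 -> above the common cutoff of 5
print(round(vif(0.45), 2))  # 1.82 -> little inflation
```

With R² = 0.81 the VIF is about 5.26, which exceeds the rule-of-thumb cutoff of 5 used in this deck.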
8
Q

Two variables are perfectly collinear when:
A. There is an exact linear relationship between them
B. There is no correlation between them
C. The correlation between them is above 0.5
D. The correlation between them is below -0.5

A

a

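"Exact linear relationship" can be checked directly: a Pearson correlation of ±1 means one variable is a perfect linear function of the other. A self-contained sketch (the data values are made up):

```python
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

x1 = [1.0, 2.0, 3.0, 4.0, 5.0]
x2 = [3.0 * x + 2.0 for x in x1]  # exact linear relationship: perfectly collinear
print(pearson_r(x1, x2))          # ≈ 1.0, so the VIF would blow up toward infinity
```

Note that a correlation of 0.5 (option C) indicates only partial, not perfect, collinearity.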
9
Q

High degrees of correlation among the X’s can lead to:
A. Attenuating (closer to zero) the variance
B. Decreasing the amount of collinearity
C. Inflating the Variance
D. Does not affect the regression model

A

c

10
Q

Calculate the Variance Inflation Factor (VIF) for two variables
where R²x1x2 = 0.45.
A. 0.45
B. 0.22
C. 1.81
D. 0.69

A

c

11
Q

Given that the VIF is 1.18, are the two variables likely multicollinear?
A. Yes, because the VIF is less than 5.0
B. Yes, because the VIF is greater than 1
C. No, because the VIF is less than 5.0
D. No, because the VIF is greater than 1.0

A

c

12
Q

Given a VIF of 30, we might worry that
a. High multicollinearity decreases the standard errors
b. High multicollinearity makes us more likely to reject the null
c. Low multicollinearity makes us generate insignificant t-values
d. High multicollinearity makes us generate insignificant t-values

A

d

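To see why a VIF of 30 is alarming: the standard error of that coefficient is inflated by √VIF relative to the no-collinearity case (a standard result), so its t-statistic shrinks by the same factor. The arithmetic:

```python
import math

vif = 30.0
r_squared = 1.0 - 1.0 / vif    # R^2 of this predictor on the other X's
se_inflation = math.sqrt(vif)  # standard error grows by sqrt(VIF)

print(round(r_squared, 3))     # 0.967 -> the X's are highly intercorrelated
print(round(se_inflation, 2))  # 5.48 -> t-statistics roughly 5.5x smaller
```

With t-values deflated this much, coefficients that matter can easily look insignificant, which is exactly option d.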