Multicollinearity Flashcards
High levels of multicollinearity make each of the following
unreliable except:
a. Standard Errors
b. T-statistics
c. Point Estimate
d. P-values
c
Multicollinearity occurs when:
a. An independent variable in a multiple regression model can
be linearly predicted by the dependent variable.
b. An independent variable in a multiple regression model can be
linearly predicted by another independent variable.
c. An independent variable in a multiple regression cannot be
linearly predicted by another dependent variable.
d. An independent variable in a multiple regression model
cannot be linearly predicted by another independent variable.
b
Multicollinearity is suspected when ________.
a. there is a low R² coupled with significant explanatory variables
b. there is a high R² coupled with significant explanatory variables
c. there is a low R² coupled with insignificant explanatory variables
d. there is a high R² coupled with insignificant explanatory variables
d
When confronted with multicollinearity, a good remedy is to
________ if we can justify its redundancy.
a. add one more collinear variable
b. drop one of the collinear variables
c. remove both the collinear variables
d. add as many collinear variables as possible
b
The term multicollinearity refers to the condition when the
variance of the error term, given X1, X2, …, Xn, is the same for
all observations.
a. True
b. False
b
An assumption of the classical regression model is that there is
no multicollinearity.
a. True
b. False
b
You calculate the VIF for two variables, where the R² between X1 and X2 is 0.81.
Do we have multicollinearity?
a. Yes because the VIF > 5
b. Yes, because the VIF > 1
c. No, because the VIF < 5
d. No, because the VIF < 1
a
Two variables are perfectly collinear when:
A. There is an exact linear relationship between them
B. There is no correlation between them
C. The correlation between them is above 0.5
D. The correlation between them is below -0.5
a
High degrees of correlation between the X's can lead to:
A. Attenuating (closer to zero) the variance
B. Decreasing the amount of collinearity
C. Inflating the Variance
D. Does not affect the regression model
c
Calculate the Variance Inflation Factor (VIF) for two variables
where the R² between X1 and X2 is 0.45.
A. 0.45
B. 0.22
C. 1.81
D. 0.69
c
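The two VIF cards above both use the same formula, VIF = 1 / (1 − R²). A minimal Python sketch checking the flashcard numbers (the function name `vif` is just for illustration, not part of the original cards):

```python
def vif(r_squared):
    """Variance Inflation Factor: VIF = 1 / (1 - R^2)."""
    return 1.0 / (1.0 - r_squared)

# R² = 0.81 → VIF ≈ 5.26, above the common rule-of-thumb threshold of 5
print(round(vif(0.81), 2))  # → 5.26
# R² = 0.45 → VIF ≈ 1.82, close to 1, so little cause for concern
print(round(vif(0.45), 2))  # → 1.82
```

Note that 1 / 0.55 ≈ 1.82, so the flashcard's option c (1.81) truncates rather than rounds.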
Given a VIF of 1.18, are the variables likely multicollinear?
A. Yes, because the VIF is less than 5.0
B. Yes, because the VIF is greater than 1
C. No, because the VIF is less than 5.0
D. No, because the VIF is greater than 1.0
c
Given a VIF of 30, we might worry that
a. High multicollinearity decreases the standard errors
b. High multicollinearity makes us more likely to reject the null
c. Low multicollinearity makes us generate insignificant t-values
d. High multicollinearity makes us generate insignificant t-values
d
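In practice, the R² in the VIF formula comes from regressing one explanatory variable on the other(s). A hedged sketch with synthetic data (the variable names, seed, and the 0.9 coefficient are made-up assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: x2 is constructed to be strongly collinear with x1.
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + 0.2 * rng.normal(size=200)

# Regress x1 on x2 (with an intercept) via least squares to get R².
A = np.column_stack([np.ones_like(x2), x2])
coef, *_ = np.linalg.lstsq(A, x1, rcond=None)
resid = x1 - A @ coef
r2 = 1.0 - resid.var() / x1.var()

# VIF = 1 / (1 - R²); a large value signals inflated coefficient variances.
vif = 1.0 / (1.0 - r2)
print(f"R2 = {r2:.3f}, VIF = {vif:.2f}")
```

With collinearity this strong, the VIF comes out well above the rule-of-thumb cutoff of 5, which is exactly the "insignificant t-values" worry in the last card.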