Week 3A Flashcards

1
Q

Define when omitted variable bias exists in an OLS estimate

A

When both:

  1. Corr(X1, X2) ≠ 0
    (X2 is related to X1)

and

  2. β2 ≠ 0
    (X2 has an effect on Y)
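A quick NumPy simulation (my own hypothetical example, not from the card) showing both conditions at work: when X2 is correlated with X1 and β2 ≠ 0, omitting X2 biases the X1 estimate by β2 · δ1, where δ1 is the slope from regressing X2 on X1.

```python
# Hypothetical example: true model Y = 1 + 2*X1 + 3*X2 + u,
# with X2 = 0.5*X1 + noise (so delta1 ~= 0.5).
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)   # condition 1: X2 related to X1
u = rng.normal(size=n)
y = 1 + 2 * x1 + 3 * x2 + u          # condition 2: beta2 = 3 != 0

# Long regression (X2 included): the X1 slope is unbiased, ~= 2
X_long = np.column_stack([np.ones(n), x1, x2])
b_long = np.linalg.lstsq(X_long, y, rcond=None)[0]

# Short regression (X2 omitted): the X1 slope absorbs beta2 * delta1,
# landing near 2 + 3 * 0.5 = 3.5
X_short = np.column_stack([np.ones(n), x1])
b_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print(b_long[1], b_short[1])
```

Dropping either condition (set the 0.5 to 0, or set β2 = 0) makes the short-regression slope unbiased again.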
2
Q

What is omitted variable bias?

A

Violation of Assumption 4 (Zero Conditional Mean, ZCM)

The regression was specified too simply, so a relevant variable was captured in the error term.

3
Q

What are the five Gauss-Markov assumptions for MLR?

A
  1. Linear in parameters
  2. Random Sampling
  3. No perfect collinearity
  4. Zero Conditional Mean
  5. Homoscedasticity
4
Q

What does perfect collinearity mean?

A

There is an exact linear relationship among the independent variables
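A small NumPy check (my own example, not from the card) of what "exact linear relationship" means: when one column is an exact combination of the others, the design matrix loses rank and OLS has no unique solution.

```python
# Hypothetical example: x3 = 2*x1 + x2 exactly, so the columns of X
# are linearly dependent and X is rank-deficient.
import numpy as np

rng = np.random.default_rng(4)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2 * x1 + x2                     # exact linear relationship
X = np.column_stack([np.ones(n), x1, x2, x3])

print(np.linalg.matrix_rank(X))      # 3, not 4
```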

5
Q

What is an endogenous variable?

A

A regressor that is correlated with something captured in the error term (e.g. an omitted variable).
The biased regressor is endogenous.
Violates ZCM

6
Q

What is an exogenous variable?

A

A regressor that is not correlated with anything in the error term, so its estimate is unbiased.

Does not violate ZCM

7
Q

Explain multicollinearity

A

When two or more independent variables are highly correlated, making it difficult to isolate the individual effect of each variable on the dependent variable.

Increases Var(β^j)
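A sketch (my own hypothetical example) of that variance inflation: holding everything else fixed, making two regressors more correlated raises the variance of the slope estimate by roughly the factor 1 / (1 − R^2j).

```python
# Hypothetical example: compare Var(b1) (with sigma^2 = 1) when x2 is
# uncorrelated with x1 versus highly correlated with it.
import numpy as np

rng = np.random.default_rng(2)
n = 500

def slope_variance(rho):
    # Var(b_hat) = sigma^2 * (X'X)^{-1}; entry [1, 1] is Var(b1).
    x1 = rng.normal(size=n)
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.normal(size=n)
    X = np.column_stack([np.ones(n), x1, x2])
    return np.linalg.inv(X.T @ X)[1, 1]

low = slope_variance(0.0)    # nearly uncorrelated regressors
high = slope_variance(0.95)  # highly correlated regressors

print(high > low)            # True: multicollinearity inflates Var(b1)
```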

8
Q

Explain Homoscedasticity

A

When the variance of the residuals is constant across all levels of the independent variables

9
Q

What is the Var(β^j) formula and how do its components interact?

A

Var(β^j) = σ^2 / [ SSTj (1 − R^2j) ]

σ^2 (error variance):
This is the variance of the error term for the entire model; it reflects how much the actual outcomes vary from the model's predictions.
Higher σ^2 increases the variance of β^j, making the estimate less precise.
Ideally, we want this to be low.

SSTj (total variation in predictor Xj):
Measures how much Xj varies in the data.
Low variation in Xj makes it harder to estimate its effect, increasing the variance of β^j.
We want this to be high.

R^2j (from regressing Xj on the other predictors):
Tells us how much of Xj can be explained by the other independent variables.
We want this to be low.
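A numeric check (my own construction, with an assumed error variance) that the matrix form σ^2 (X'X)^{-1} and the component form σ^2 / [ SSTj (1 − R^2j) ] agree for a slope coefficient:

```python
# Hypothetical example with an assumed error variance sigma^2 = 4.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)       # correlated predictors
X = np.column_stack([np.ones(n), x1, x2])

sigma2 = 4.0
# Matrix form: Var(b_hat) = sigma^2 * (X'X)^{-1}
var_b1_matrix = (sigma2 * np.linalg.inv(X.T @ X))[1, 1]

# Component form for x1: SST_1, and R^2_1 from regressing x1 on the rest
sst1 = np.sum((x1 - x1.mean()) ** 2)
Z = np.column_stack([np.ones(n), x2])    # the other regressors
fitted = Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
r2_1 = 1 - np.sum((x1 - fitted) ** 2) / sst1
var_b1_components = sigma2 / (sst1 * (1 - r2_1))

print(np.isclose(var_b1_matrix, var_b1_components))  # True
```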
10
Q

Explain the Frisch-Waugh theorem

A

States that in an MLR, if you regress the regressor of interest Xj on the other regressors and take the residuals, then regress Y on those residuals, the resulting slope equals β^j from the original MLR. The residuals are the part of Xj not explained by the other regressors, so β^j represents how Xj uniquely explains Y.

This is called partialling out
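The partialling-out steps can be checked numerically (my own small example): residualize Xj on the other regressors, regress Y on those residuals, and compare with the full-MLR coefficient.

```python
# Hypothetical example: b1 from the full MLR equals the slope of y
# on the residuals of x1 ~ (1, x2).
import numpy as np

rng = np.random.default_rng(3)
n = 300
x1 = rng.normal(size=n)
x2 = 0.4 * x1 + rng.normal(size=n)
y = 1 + 2 * x1 - x2 + rng.normal(size=n)

# Full MLR: y on (1, x1, x2)
X = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X, y, rcond=None)[0]

# Partial out: residuals of x1 after the other regressors
Z = np.column_stack([np.ones(n), x2])
r1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]

# Slope of y on the residuals (no intercept: r1 has mean 0 by construction)
b1_fwl = (r1 @ y) / (r1 @ r1)

print(np.isclose(b_full[1], b1_fwl))  # True
```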
