PP2: OLS Flashcards

1
Q

List the 6 assumptions of OLS

A

1) The population model is linear in parameters
2) The n observations are randomly drawn from the population
3) There is variation in each explanatory variable and no perfect collinearity
4) The explanatory variables are uncorrelated with the error term
5) The error u has the same variance regardless of the explanatory variables (aka homoskedasticity)
6) The error term is independent of the regressors and normally distributed with mean zero and variance σ²

2
Q

What happens when you have omitted variable bias?

A

Part of the omitted variable's effect is absorbed into the error term and loaded onto the included variables correlated with it, causing their coefficients to be over- or under-estimated

3
Q

What determines the sign of omitted variable bias (over- vs. under-estimate)?

A

sign(bias) = sign(β₂) × sign(corr(x₁, x₂))

where β₂ is the true coefficient on the omitted variable x₂. A positive product means β̂₁ is overestimated; a negative product means it is underestimated.
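A short simulation (hypothetical data, numpy only) illustrates the sign rule: here β₂ = 3 > 0 and corr(x₁, x₂) > 0, so omitting x₂ biases the slope on x₁ upward from the true value of 2.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# True model: y = 1 + 2*x1 + 3*x2 + u, with x2 positively correlated with x1
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
y = 1 + 2 * x1 + 3 * x2 + rng.normal(size=n)

# Short regression of y on x1 alone (x2 omitted): slope = cov(x1, y) / var(x1)
b1_short = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

# sign(beta2) * sign(corr(x1, x2)) > 0, so b1_short lands well above the true 2
print(b1_short)
```

With this setup the short-regression slope converges to 2 + 3·0.5 = 3.5: the upward bias is β₂ times the slope from regressing x₂ on x₁.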

4
Q

T/F: Including irrelevant explanatory variables can increase precision.

A

False: it may increase multicollinearity, which increases the variance of the OLS estimators

5
Q

State the Gauss-Markov theorem

A

When MLR.1–MLR.5 are satisfied, OLS is the Best Linear Unbiased Estimator (BLUE)

i.e., it has the smallest variance among all linear unbiased estimators

6
Q

What is the formula for t-scores in hypothesis testing?

A

t = (β̂ − a) / SE(β̂)

β̂ = estimated coefficient
a = hypothesized value, normally 0
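A one-line helper (hypothetical numbers, plain Python) shows the computation:

```python
# t-statistic for H0: beta = a, per the card's formula t = (beta_hat - a) / SE(beta_hat)
def t_stat(b_hat: float, se: float, a: float = 0.0) -> float:
    return (b_hat - a) / se

# Hypothetical example: estimate 0.5 with SE 0.2, testing the default a = 0
print(t_stat(0.5, 0.2))  # 2.5
```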

7
Q

T/F: The larger the error variance, the larger the variance of B.

A

True: Var(β̂j) = σ² / (SSTj(1 − Rj²)); a larger error variance σ² increases the numerator
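A small sketch (hypothetical numbers) of the variance formula, showing that doubling σ² doubles Var(β̂j) while doubling SSTj halves it:

```python
def ols_coef_variance(sigma2: float, sst_j: float, r2_j: float) -> float:
    """Var(beta_hat_j) = sigma^2 / (SST_j * (1 - R_j^2))."""
    return sigma2 / (sst_j * (1.0 - r2_j))

base = ols_coef_variance(sigma2=1.0, sst_j=50.0, r2_j=0.3)
print(ols_coef_variance(2.0, 50.0, 0.3) / base)   # 2.0: larger error variance -> larger Var
print(ols_coef_variance(1.0, 100.0, 0.3) / base)  # 0.5: larger SST -> smaller Var
```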

8
Q

T/F: The larger the SST, the larger the variance of B.

A

False: Var(β̂j) = σ² / (SSTj(1 − Rj²))

A higher SSTj increases the denominator, so the variance is smaller

9
Q

State the null and alternative hypotheses for one-sided and two-sided tests.

A

H0: β = a
Ha (one-sided): β > a or β < a
Ha (two-sided): β ≠ a

10
Q

Is baths statistically significant at the 5% level?

log(price) = 8.07 + 0.054 rooms + 0.22 baths + 0.298 log(area)

where the SE on baths is 0.038

A

Yes, we can reject the null:

t = 0.22/0.038 = 5.79 > 1.96, the 5% two-sided critical value
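The arithmetic on this card as a quick check against the 5% two-sided critical value:

```python
t = 0.22 / 0.038          # coefficient on baths divided by its SE
print(round(t, 2))        # 5.79
print(abs(t) > 1.96)      # True: reject H0 that the baths coefficient is 0
```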

11
Q

Determine whether three variables are jointly significant

F(3,137)=9.46
Prob > F = 0.0000

A

The p-value is essentially zero, so it is very unlikely that all three coefficients are zero: reject the null; the variables are jointly significant.
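A sketch of where the reported p-value comes from (using scipy's F distribution; the degrees of freedom are taken from the output above):

```python
from scipy.stats import f

# Joint test with 3 restrictions and 137 denominator degrees of freedom
F_stat = 9.46
p_value = f.sf(F_stat, 3, 137)   # survival function: P(F > 9.46)
print(p_value < 0.05)  # True: reject H0 that all three coefficients are zero
```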

12
Q

Determine whether two variables are jointly significant

F(2,137) = 1.17
Prob > F = 0.3143

A

Fail to reject the null: the p-value (0.3143) is well above 0.05, so the variables are not jointly significant
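The same check for this card's output (scipy sketch, degrees of freedom from the F statistic above):

```python
from scipy.stats import f

# Joint test with 2 restrictions and 137 denominator degrees of freedom
p_value = f.sf(1.17, 2, 137)   # about 0.31, matching the reported Prob > F
print(p_value > 0.05)          # True: fail to reject H0
```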

13
Q

What happens when we have heteroskedasticity in our model?

A

1) OLS is no longer BLUE
2) usual formula for variance can’t be used
3) The usual standard errors are wrong
4) Unbiasedness is unaffected

14
Q

Why can’t you use heteroskedasticity-robust standard errors all the time?

A

In small samples the SEs become less accurate, and the corresponding test statistics do not have the correct distributions

15
Q

Given the following hettest output,

chi2(1) = 62.55
Prob > chi2 = 0.000

determine if heteroskedasticity is present.

A

Reject the null: heteroskedasticity is present

H0: the error variance does not depend on x
HA: MLR.5 fails
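Stata's hettest runs a Breusch–Pagan test. A numpy-only sketch of the idea on simulated heteroskedastic data (all numbers hypothetical): regress the squared OLS residuals on the regressors and compare LM = n·R² to the χ²(1) 5% critical value, 3.84.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300
x = rng.uniform(0, 2, size=n)
y = 1 + 0.5 * x + rng.normal(size=n) * x   # Var(u|x) grows with x

# OLS residuals from regressing y on [1, x]
X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta

# Breusch-Pagan: regress resid^2 on X; LM = n * R^2 ~ chi2(1) under H0
u2 = resid ** 2
g = np.linalg.lstsq(X, u2, rcond=None)[0]
fitted = X @ g
r2 = 1 - np.sum((u2 - fitted) ** 2) / np.sum((u2 - u2.mean()) ** 2)
lm = n * r2
print(lm > 3.84)  # True: exceeds the 5% chi2(1) critical value -> reject H0
```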
