Lecture 10 Flashcards

(18 cards)

1
Q

Difference between this week's and last week's regression model

A

We now extend the model yit = β1xit + ai + uit to more than 2 time periods
- ai is the unobserved, individual-specific fixed effect
- uit is the idiosyncratic error term, which varies over time and is assumed to be mean independent of xit

2
Q

How do we first difference now, with T time periods?

A

To remove ai, subtract the (t-1)th equation from the tth one: Δyit = β1Δxit + Δuit
- we now have a new pooled regression with redefined parameters
- pooled OLS on the differenced regression is consistent if E[Δuit | Δxit] = 0, which is implied by the strict exogeneity assumption -> E[uit | xi1, …, xiT] = 0
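
A minimal numpy sketch of this step on simulated data (the variable names and the data-generating process are illustrative assumptions, not from the lecture):

import numpy as np

rng = np.random.default_rng(0)
n_units, n_periods, beta1 = 500, 4, 2.0

a = rng.normal(size=(n_units, 1))                 # unobserved fixed effect a_i
x = a + rng.normal(size=(n_units, n_periods))     # x_it correlated with a_i
u = rng.normal(size=(n_units, n_periods))         # idiosyncratic error u_it
y = beta1 * x + a + u                             # y_it = beta1*x_it + a_i + u_it

# subtracting the (t-1)th observation from the tth removes a_i
dy = np.diff(y, axis=1).ravel()
dx = np.diff(x, axis=1).ravel()

# pooled OLS (no intercept) on the differenced data is consistent for beta1
beta1_fd = (dx @ dy) / (dx @ dx)
print(beta1_fd)   # close to 2.0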

3
Q

The role of autocorrelation
- how does this differ between repeated cross sections and panel data?

A

If the errors in one time period are correlated with the errors in a different time period, they are autocorrelated
- in repeated cross sections, a new sample is drawn independently in each period, so errors from different time periods are uncorrelated
- in panel data, the same individuals are followed over time, so an individual's outcomes in one period are likely to be correlated with their outcomes in another
- in the differenced regression, there is autocorrelation if Cov(Δuis, Δuit) ≠ 0 for s ≠ t

This means the standard errors will be biased

4
Q

When will the first-differenced errors be autocorrelated?

A

First differencing creates overlapping terms in the differenced errors, introducing negative autocorrelation in Δuit

  • with an autoregressive error process, the autocorrelation in Δuit depends on how persistent the errors are (it vanishes if uit follows a random walk)
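
A quick simulation of this point (purely illustrative, not from the lecture): with serially uncorrelated uit, consecutive differenced errors both contain ui,t-1 (with opposite signs), so their correlation is about -0.5.

import numpy as np

rng = np.random.default_rng(1)
n_units, n_periods = 2000, 10

u = rng.normal(size=(n_units, n_periods))   # serially uncorrelated u_it
du = np.diff(u, axis=1)                     # first-differenced errors

# correlation between consecutive differenced errors, pooled across units
r = np.corrcoef(du[:, 1:].ravel(), du[:, :-1].ravel())[0, 1]
print(r)   # roughly -0.5
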
5
Q

But what are the consequences of autocorrelation?

A
  • the variances of the OLS estimators will contain additional terms due to autocorrelation
  • HR (heteroskedasticity-robust) SEs rely on the assumption that all observations in our sample are mutually independent, but autocorrelation violates this
  • therefore, HR SEs become inconsistent in the presence of autocorrelation
6
Q

With panel data, how can we adjust the HR SEs to account for autocorrelation?

A
  • in panel data, each unit (say a city, individual or firm) has T observations across time
  • within a unit, the observations are likely autocorrelated, forming a cluster for that unit
  • clustered SEs account for these intra-cluster dependencies, while assuming independence between clusters

Clustered SEs should be used for any panel data estimator, including DiD with FE
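
A minimal sketch of a cluster-robust SE for a single-regressor pooled regression, clustering by unit (basic sandwich formula with no small-sample correction; the function name and inputs are illustrative assumptions):

import numpy as np

def cluster_se(x, y, unit_id):
    # Pooled OLS slope in y = b*x + e (no intercept) and a cluster-robust SE,
    # clustering the residuals by unit_id (basic sandwich, no small-sample correction).
    b = (x @ y) / (x @ x)
    e = y - b * x
    bread = x @ x
    # sum the score x*e within each cluster, then square and add up
    scores = np.array([np.sum(x[unit_id == g] * e[unit_id == g])
                       for g in np.unique(unit_id)])
    return b, np.sqrt(np.sum(scores ** 2)) / bread

# usage idea: b, se = cluster_se(dx, dy, unit_id), where unit_id labels the unit of each stacked row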

7
Q

Fixed effects / within estimator:
-> yit = β1·xit + ai + uit
-> mean(yi) = β1·mean(xi) + ai + mean(ui)

A

Subtract the second equation from the first:
yit - mean(yi) = β1(xit - mean(xi)) + (uit - mean(ui)), i.e. ÿit = β1ẍit + üit

  • unbiased/consistent if E[üit | ẍit] = 0, which is implied by the strict exogeneity assumption
  • we need variation in xit over time for each individual i; otherwise the deviation from the mean is 0 and β1 cannot be estimated

Requires strict exogeneity
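
A minimal numpy sketch of the within transformation on simulated data (the data-generating process and names are illustrative assumptions):

import numpy as np

rng = np.random.default_rng(2)
n_units, n_periods, beta1 = 500, 5, 2.0

a = rng.normal(size=(n_units, 1))                  # fixed effect a_i
x = a + rng.normal(size=(n_units, n_periods))      # x_it correlated with a_i
y = beta1 * x + a + rng.normal(size=(n_units, n_periods))

# within transformation: subtract each unit's time mean, which wipes out a_i
x_dm = x - x.mean(axis=1, keepdims=True)
y_dm = y - y.mean(axis=1, keepdims=True)

# OLS on the demeaned data = within (fixed effects) estimator
beta1_fe = (x_dm.ravel() @ y_dm.ravel()) / (x_dm.ravel() @ x_dm.ravel())
print(beta1_fe)   # close to 2.0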

8
Q

Within or first-differencing estimator - which to use?

A

If T = 2, the two estimators are identical
- if Cov(uit, uis | Xi, ai) = 0 holds, the within estimator is BLUE for any T ≥ 2, and in large samples the usual t and F statistics can be used

  • if Cov(Δuit, Δuis | Xi, ai) = 0 holds, the first-difference estimator is BLUE for T ≥ 2; again, in large samples the usual t and F statistics can be used

Both target the same problem - OVB due to ai being correlated with xit; in either case we still need to cluster the SEs

9
Q

Scenario 1: Δui2 and Δui3 are uncorrelated

A

The usual SEs can be used

10
Q

Scenario 2: Δui2 and Δui3 are correlated

A

We say they are autocorrelated, and thus must use clustered SEs

11
Q

How do we derive the large-sample distribution of the regression estimator and its standard errors?

A
  • write β1^ as a ratio of sums (…/…)
  • substitute Δyit = β1Δxit + Δuit
  • rewrite it in terms of the averaged quantities v̄i and z̄i
  • apply the CLT to derive the large-sample distribution
  • this gives two different expressions for the variance, depending on whether the differenced errors are autocorrelated or not
12
Q

Random effects model

A

Unlike the fixed effects model, the RE model assumes that ai is uncorrelated with xit across all time periods
- under this assumption, ai is treated as a random variable rather than a fixed parameter
- if this holds, the RE estimator is more efficient than FE because it uses both within-group and between-group variation in xit

13
Q

Within-group variation shows

A

How changes in xit over time affect Yit for the same individual

14
Q

Between-group variation shows

A

How differences in xit between individuals affect the outcome

15
Q

RE: the errors in the regression are autocorrelated because the same ai appears in the composite error vit in every time period for an individual. How do we deal with this?

A
  • the covariance between vit and vis is not 0 for different time periods, implying vit and vis are autocorrelated
  • this is a problem, as OLS assumes the errors are independent across observations
  • if we just run pooled OLS, the coefficients will be unbiased, but the SEs will be wrong, giving bad inference

We need to carry out partial demeaning

16
Q

FE or RE?

A

If Cov(ai, xit) = 0:
- both FE and RE are consistent
- but RE is more efficient
If not:
- FE is still consistent
- RE is biased and inconsistent

So: can we determine whether this covariance is 0 or not? That is what the Hausman test is for.

17
Q

Hausman test

A

To decide which model to use:
H0: Cov(ai, xit) = 0; if true, RE is BLUE
H1: Cov(ai, xit) ≠ 0, so only FE is consistent

t test: t = (β1^RE - β1^FE) / se(β1^RE - β1^FE)
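
A tiny sketch of this t statistic, assuming the two point estimates and the SE of their difference are already available (all numbers below are hypothetical, for illustration only):

def hausman_t(b_re, b_fe, se_diff):
    # t statistic comparing the RE and FE estimates of beta1;
    # se_diff is the standard error of (b_re - b_fe); under H0 it is
    # approximately standard normal in large samples
    return (b_re - b_fe) / se_diff

# purely hypothetical numbers, for illustration only
t = hausman_t(b_re=0.42, b_fe=0.55, se_diff=0.05)
print(t, abs(t) > 1.96)   # |t| > 1.96 -> reject H0 at the 5% level, prefer FE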

18
Q

Partial demeaning

A

yit - θ·mean(yi) and xit - θ·mean(xi)

θ = 1 - σu / (σu² + T·σa²)^0.5

If σa² is really high, then θ tends to 1, thus RE acts like FE
If σa² is really low, then θ tends to 0, thus RE acts like pooled OLS
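
A small numpy sketch of the quasi-demeaning transformation, assuming estimates of σu² and σa² are already available (function and variable names are illustrative assumptions):

import numpy as np

def re_transform(y, x, sigma_u2, sigma_a2):
    # Partial (quasi-)demeaning used by the RE estimator.
    # y, x: (n_units, T) arrays; sigma_u2, sigma_a2: estimated error variances.
    T = y.shape[1]
    theta = 1.0 - np.sqrt(sigma_u2 / (sigma_u2 + T * sigma_a2))
    y_star = y - theta * y.mean(axis=1, keepdims=True)
    x_star = x - theta * x.mean(axis=1, keepdims=True)
    return theta, y_star, x_star

# theta -> 1 when sigma_a2 is large (RE behaves like FE);
# theta -> 0 when sigma_a2 is near 0 (RE behaves like pooled OLS)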