Lecture 5 Flashcards

1
Q

What are the 4 CLRM assumptions?

A
  1. E(e_i | X) = 0: the average error term, conditional on X, is zero.
  2. Var(e_i | X) = σ² (homoskedasticity): the variance is constant, so for different values of X the error e has the same spread. The scatter of the actual points around the regression line does not change with X.
  3. Cov(e_i, e_j | X) = 0 for i ≠ j (no autocorrelation): the regression error (the gap between an actual point and the regression line) for person i tells us nothing about the regression error for person j.
  4. e_i | X ~ N(0, σ²): the regression error follows a normal distribution.
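
A minimal sketch of data satisfying all four assumptions, assuming Python with numpy and statsmodels (the sample size and coefficient values are illustrative, not from the lecture):

```python
# Simulate data under the four CLRM assumptions, then fit OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
# Errors: mean 0, constant variance sigma^2 = 4, mutually independent,
# and normal, so assumptions 1-4 hold at every value of x.
e = rng.normal(0, 2.0, n)
y = 1.0 + 0.5 * x + e          # illustrative true coefficients

fit = sm.OLS(y, sm.add_constant(x)).fit()
print(fit.params)              # estimates should be close to (1.0, 0.5)
```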
2
Q

How do you test whether you can combine two regression’s from different periods together?

A
  1. H0: β0 = γ0, ..., βk = γk; H1: βj ≠ γj for at least one j.
  2. We then run 3 regressions to test H0:
  3. Y_i = α0 + α1 X_i1 + ... + αk X_ik + e_i, for i = 1, ..., n (all of the data pooled; restricted, as we are imposing the null hypothesis that the coefficients are equal across the two periods).
  4. Y_i = β0 + β1 X_i1 + ... + βk X_ik + e_i, for i = 1, ..., n1 (a regression for the first sample; unrestricted, as you are not imposing H0).
  5. Y_i = γ0 + γ1 X_i1 + ... + γk X_ik + e_i, for i = n1 + 1, ..., n (a regression for the second sample; also unrestricted).
  6. Compute RSS_U = RSS2 + RSS3 and RSS_R = RSS1.
  7. Compute the F statistic:

[(RSS_R − RSS_U)/(k+1)] / [RSS_U/(n − 2(k+1))] ~ F(k+1, n − 2(k+1))
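
A sketch of this test, assuming Python with numpy, scipy, and statsmodels; the function name `chow_test` and the break point `n1=60` are illustrative, not from the lecture:

```python
# Chow test: are the coefficients equal across observations [0, n1) and [n1, n)?
import numpy as np
import statsmodels.api as sm
from scipy.stats import f

def chow_test(y, X, n1):
    X = sm.add_constant(X)
    n, k_plus_1 = X.shape                      # k + 1 estimated parameters
    rss_r = sm.OLS(y, X).fit().ssr             # regression 1: pooled (restricted)
    rss_2 = sm.OLS(y[:n1], X[:n1]).fit().ssr   # regression 2: first sample
    rss_3 = sm.OLS(y[n1:], X[n1:]).fit().ssr   # regression 3: second sample
    rss_u = rss_2 + rss_3
    df2 = n - 2 * k_plus_1
    F = ((rss_r - rss_u) / k_plus_1) / (rss_u / df2)
    return F, 1 - f.cdf(F, k_plus_1, df2)      # statistic and p-value

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
y = 1.0 + X @ np.array([0.5, -1.0]) + rng.normal(size=100)
print(chow_test(y, X, n1=60))                  # no break here, so F is small
```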

3
Q

Why does the Chow test F statistic have k+1 numerator degrees of freedom and n − 2(k+1) denominator degrees of freedom?

A

Numerator: the null hypothesis restricts every coefficient (β0, β1, B2, ..., βk), which is k+1 restrictions, so the numerator has k+1 degrees of freedom. Denominator: each unrestricted model again estimates k+1 parameters, but we have 2 regressions, so that goes to 2(k+1) parameters used up, leaving n − 2(k+1) degrees of freedom.
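
A quick check of the two degrees of freedom and the 5% critical value, assuming scipy; n and k here are illustrative:

```python
# Chow test degrees of freedom and the 5% critical value.
from scipy.stats import f

n, k = 100, 3
df1 = k + 1            # one restriction per coefficient, including the intercept
df2 = n - 2 * (k + 1)  # two unrestricted regressions each estimate k+1 parameters
print(df1, df2, f.ppf(0.95, df1, df2))
```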

4
Q

How do you use a dummy variable to deal with a suspected outlier?

A

If we suspect the jth observation is an outlier, we run the regression Y_i = β0 + β1 X_i + γ D_ji + e_i, where D_ji = 1 if i = j and D_ji = 0 if i ≠ j. The dummy picks out observation j alone, so testing H0: γ = 0 is a test of whether observation j is an outlier.
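
A minimal sketch of this regression, assuming numpy and statsmodels; the planted outlier and the index j = 10 are illustrative:

```python
# Fit Y_i = b0 + b1*X_i + gamma*D_ji + e_i for a suspected outlier j.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n, j = 50, 10
x = rng.uniform(0, 5, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)
y[j] += 8.0                        # plant an outlier at observation j

d = np.zeros(n); d[j] = 1.0        # D_ji = 1 only when i = j
X = sm.add_constant(np.column_stack([x, d]))
fit = sm.OLS(y, X).fit()
print(fit.params[2], fit.pvalues[2])   # gamma-hat and p-value for H0: gamma = 0
```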

5
Q

What are the 2 ways of running a predicted values test?

A
  1. Run a regression on all n + n1 observations with a dummy variable for each observation in the second sample, and test the hypothesis H0: γj = 0 for j = n+1, ..., n+n1. Save the RSS_U from this regression. Then compute RSS_R from Y_i = β0 + β1 X_i1 + ... + βk X_ik + e_i for i = 1, ..., n+n1. The F statistic is [(RSS_R − RSS_U)/n1] / [RSS_U/(n − k − 1)].
  2. Run the regressions separately: one regression for the first sample only (unrestricted, giving RSS_U), then one regression for the two samples combined. In the combined regression you are assuming H0 is true, so you are imposing the restriction and this model is restricted (giving RSS_R). Then calculate the same F statistic: [(RSS_R − RSS_U)/n1] / [RSS_U/(n − k − 1)]. Both routes give identical results, as shown in the sketch below and in card 7.
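
A sketch checking that the two routes agree, assuming numpy and statsmodels; the sample sizes and coefficients are illustrative:

```python
# Predictive failure test two ways: dummies vs. first-sample-only regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, n1, k = 80, 10, 2
X = rng.normal(size=(n + n1, k))
y = 1.0 + X @ np.array([0.5, -0.8]) + rng.normal(size=n + n1)
Xc = sm.add_constant(X)

D = np.zeros((n + n1, n1)); D[n:, :] = np.eye(n1)       # one dummy per 2nd-sample obs
rss_u_dummy = sm.OLS(y, np.hstack([Xc, D])).fit().ssr   # route 1: dummy regression
rss_u_split = sm.OLS(y[:n], Xc[:n]).fit().ssr           # route 2: first sample only

rss_r = sm.OLS(y, Xc).fit().ssr                         # combined (restricted)
F = ((rss_r - rss_u_dummy) / n1) / (rss_u_dummy / (n - k - 1))
print(np.isclose(rss_u_dummy, rss_u_split), F)          # True, and the F statistic
```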
6
Q

Why is the DoF N − (k+1)?

A

The sample size is N, and the DoF is N minus the number of parameters you estimate. You estimate β1, ..., βk, so that's k, plus you estimate the intercept β0, so that's the k+1; hence the degrees of freedom are N − (k+1).
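
A one-line verification with statsmodels (N and k are illustrative), showing the reported residual DoF is exactly N − (k+1):

```python
# OLS residual degrees of freedom equal N - (k + 1).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
N, k = 60, 3
X = sm.add_constant(rng.normal(size=(N, k)))      # k regressors plus an intercept
y = rng.normal(size=N)
print(sm.OLS(y, X).fit().df_resid, N - (k + 1))   # both 56
```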

7
Q

How do you show RSS3 = RSS4?

A

Regression 3: Y_i = α0 + α1 X_i1 + ... + αk X_ik + Σj γj D_ji + e_i, estimated on all n + n1 observations.

Each dummy D_ji equals 1 only for its own second-sample observation, so its coefficient γj can always be set to make that observation's residual exactly zero. The second sample therefore contributes nothing to the RSS, and the α's are chosen to minimise the squared residuals over the first sample alone, i.e. α̂0 = b0, ..., α̂k = bk, the estimates from regression 4. Taking the residual:

RSS3 = Σ (Y_i − α̂0 − ... − α̂k X_ik − Σj γ̂j D_ji)² = Σ_{i=1}^{n} (Y_i − b0 − ... − bk X_ik)² + 0 = RSS4

See notes
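
A numerical sketch of exactly this mechanism, assuming numpy and statsmodels (sizes and coefficients illustrative):

```python
# The dummy regression reproduces the first-sample estimates and
# zeroes the second-sample residuals, so RSS3 = RSS4.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n, n1 = 40, 5
x = rng.uniform(0, 3, n + n1)
y = 0.5 + 2.0 * x + rng.normal(0, 1, n + n1)

Xc = sm.add_constant(x)
D = np.zeros((n + n1, n1)); D[n:, :] = np.eye(n1)
fit3 = sm.OLS(y, np.hstack([Xc, D])).fit()        # regression 3 (with dummies)
fit4 = sm.OLS(y[:n], Xc[:n]).fit()                # regression 4 (first sample only)

print(np.allclose(fit3.params[:2], fit4.params))  # alpha-hats equal the b's
print(np.allclose(fit3.resid[n:], 0.0))           # dummies zero those residuals
print(np.isclose(fit3.ssr, fit4.ssr))             # RSS3 = RSS4
```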
