HANDOUT 2 Flashcards

1
Q

Serial correlation =

A

the presence of some form of linear dependence over time in a series Zt.

2
Q

Auto-Correlation function =

A

a pictorial representation of the linear dependence over time: it measures the correlation between Zt and Zt-k for different values of k.

3
Q

Corr(Zt, Zt-k) formula

A

Corr(Zt, Zt-k) = COV(Zt, Zt-k) / sqrt[V(Zt) V(Zt-k)]
V(Zt) = V(Zt-k) = gamma 0 by stationarity.
COV(Zt, Zt-k) = gamma k, since by stationarity it depends only on the distance k apart.
So Corr(Zt, Zt-k) = gamma k / gamma 0 = Pk
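
For intuition, a minimal sketch of how the sample version of this could be computed in Python (numpy is assumed; the function name sample_acf and the 1/T normalisation are illustrative choices, not from the handout):

import numpy as np

def sample_acf(z, max_lag):
    # sample rho_k = gamma_k / gamma_0 for k = 0..max_lag (1/T normalisation assumed)
    z = np.asarray(z, dtype=float) - np.mean(z)
    gamma0 = np.dot(z, z) / len(z)
    gammas = [np.dot(z[k:], z[:len(z) - k]) / len(z) for k in range(max_lag + 1)]
    return np.array(gammas) / gamma0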

4
Q

If p=1 this means

A

A shock today puts the series on a new path; it never returns to the old equilibrium, since the shock is never forgotten.

5
Q

If p=0 this means

A

Shock today, next period no memory of it = immediate adjustment back to equilibrium.

6
Q

The effect of a change in X1t on Yt in a model with no lags…

A

A change in X1t only affects Yt today.

At t+1, the series immediately adjusts back to equilibrium.

7
Q

White noise process formula

A

Zt = εt

8
Q

E(Zt) for white noise

A

E(Zt) = 0

9
Q

V(Zt) for white noise

A

V(Zt) = V(εt) = sigma^2

10
Q

COV(Zt, Zt-k) for white noise

A

= 0, since E(εt εt-k) = 0

11
Q

P1 to Pk for white noise

A

P0 = 1; Pk = 0 for all k ≥ 1

12
Q

ACF for white noise

A

Shock at period 0, immediately return back to equilibrium by period 1.
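
A minimal simulation sketch (numpy and statsmodels assumed available; the seed and sample size are arbitrary) showing that the sample ACF of white noise is roughly zero at every lag beyond 0:

import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)
eps = rng.normal(size=1000)      # white noise: Zt = epsilon_t
print(acf(eps, nlags=5))         # rho_0 = 1, rho_k roughly 0 for k >= 1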

13
Q

What does an ACF show?

A

The proportion of a shock remaining k periods later. The correlation compared to period 0 when the shock hits.

14
Q

AR1 model formula

A

Zt = phi Zt-1 + εt

one lag of dependent variable

15
Q

What condition must we impose on phi for AR1 and why?

A

|phi| < 1

For stationarity: otherwise a shock never dissipates out of the system.

16
Q

E(Zt) for AR1

A

E(Zt) = phi E(Zt-1) + E(εt)
E(Zt) = E(Zt-1) by stationarity
(1 - phi) E(Zt) = 0
Since |phi| < 1, 1 - phi ≠ 0, so E(Zt) = 0

17
Q

V(Zt) for AR1

A

V(Zt) = phi^2 V(Zt-1) + V(εt) + 2 phi COV(Zt-1, εt), and COV(Zt-1, εt) = 0
V(Zt) = V(Zt-1) by stationarity
(1 - phi^2) V(Zt) = sigma^2
V(Zt) = sigma^2 / (1 - phi^2)

18
Q

COV(Zt, Zt-1) for AR1

A
COV(Zt, Zt-1) = E(Zt Zt-1) since zero mean
= E[Zt-1 (phi Zt-1 + εt)]
= phi V(Zt-1)
gamma 1 = phi gamma 0
= phi sigma^2 / (1 - phi^2)
19
Q

Corr(Zt, Zt-1) for AR1

A

Corr(Zt, Zt-1) = gamma 1 / gamma 0 = phi

20
Q

P2 for AR1

A

P2 = phi^2

21
Q

Pk for AR1

A

Pk = phi^k

22
Q

2 shapes for AR1 look like

A

In both cases |phi| < 1.
If phi > 0: smooth decay to 0.
If phi < 0: oscillations, but still ends up at 0.
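
A simulation sketch (numpy/statsmodels assumed; phi = 0.7, the seed, and the sample size are arbitrary illustrative values) comparing the sample ACF of an AR1 with the theoretical Pk = phi^k:

import numpy as np
from statsmodels.tsa.stattools import acf

phi, T = 0.7, 5000
rng = np.random.default_rng(1)
z = np.zeros(T)
for t in range(1, T):
    z[t] = phi * z[t - 1] + rng.normal()   # Zt = phi Zt-1 + epsilon_t
print(acf(z, nlags=4))                     # sample ACF
print(phi ** np.arange(5))                 # theoretical Pk = phi^k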

23
Q

Roots for AR1 using lag operator

A

(1 - phi L) Zt = εt
Solve 1 - phi L = 0
Let V = L^-1
V = phi, so an AR1 has 1 root

24
Q

Why when solving for roots, do we solve for L^-1 and not L?

A

Because if we solved for L, the root would be 1/phi, which is greater than 1 in absolute value under stationarity; working with V = L^-1 gives the more convenient condition that the root must be less than 1 in absolute value.

25
Q

AR2 model formula

A

Zt = phi1 Zt-1 + phi2 Zt-2 + εt

26
Q

Roots for AR2

A

1 - phi1 L - phi2 L^2 = 0
V = L^-1; V^2 = L^-2
V^2 - phi1 V - phi2 = 0
Solve by the quadratic formula: an AR2 has 2 roots.
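
A sketch of the root calculation (numpy assumed; phi1 = 1.2 and phi2 = -0.35 are made-up coefficients): numpy.roots solves V^2 - phi1 V - phi2 = 0, and stationarity needs both |V| < 1.

import numpy as np

phi1, phi2 = 1.2, -0.35                    # illustrative AR2 coefficients
roots = np.roots([1, -phi1, -phi2])        # V^2 - phi1 V - phi2 = 0
print(roots)                               # here 0.7 and 0.5
print(np.all(np.abs(roots) < 1))           # True -> stationary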

27
Q

E(Zt) for AR2

A

E(Zt) = 0

28
Q

V(Zt) for AR2

A

gamma 0 = phi1 gamma 1 + phi2 gamma 2 + sigma^2

29
Q

The 2 Yule-Walker equations for AR2

A
  1. gamma 1 = phi1 gamma 0 + phi2 gamma 1

2. gamma 2 = phi1 gamma 1 + phi2 gamma 0

30
Q

How many possible ACFs for AR2?

A

4

31
Q

P1 = Corr(Zt, Zt-1) for AR2

A

phi1 / (1 - phi2)

32
Q

MA1 model formula

A

Zt = theta εt-1 + εt
A weighted average of the current and previous shock.
1-period memory only.

33
Q

What conditions do we need on theta for MA1? Why?

A

NO conditions - an MA model is stationary by definition.

34
Q

E(Zt) for MA1

A

0

35
Q

V(Zt) for MA1

A

(1 + theta^2) sigma^2

36
Q

COV(Zt, Zt-1) for MA1

A

E[(εt + theta εt-1)(εt-1 + theta εt-2)]
All cross-products are 0; only 1 term in common: theta E(εt-1^2)
gamma 1 = theta sigma^2

37
Q

P1 for MA1

A

theta / (1 + theta^2)

38
Q

P2–>Pk for MA1

A

All 0: 1-period memory only.
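
A simulation sketch (numpy/statsmodels assumed; theta = 0.6, the seed, and the sample size are arbitrary) confirming that the MA1 ACF has P1 = theta / (1 + theta^2) and is roughly zero afterwards:

import numpy as np
from statsmodels.tsa.stattools import acf

theta = 0.6
rng = np.random.default_rng(2)
eps = rng.normal(size=5001)
z = eps[1:] + theta * eps[:-1]            # Zt = epsilon_t + theta epsilon_{t-1}
print(acf(z, nlags=3))                    # lag 1 near theta/(1 + theta^2); lags 2, 3 near 0
print(theta / (1 + theta ** 2))           # about 0.441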

39
Q

An AR1 can be written as…

A

AR1 = MA(infinity)

40
Q

Write AR1 as an MA(infinity)

A
(1 - phi L) Zt = εt
Zt = [1 / (1 - phi L)] εt
Geometric series with a = 1, r = phi L:
1 / (1 - phi L) = 1 + phi L + (phi L)^2 + (phi L)^3 + ...
Multiply by εt:
Zt = εt + phi εt-1 + phi^2 εt-2 + phi^3 εt-3 + ...
= an infinite MA process
41
Q

How do ACF and PACF differ?

A

ACF: always compares Zt-k directly with Zt.
PACF: the additional effect of Zt-j, holding all the lower-order lags constant - the direct effect over and above all other lags.

42
Q

How do we determine Ps for PACF?

A

regress Zt on Zt-1 & plot P11
regress Zt on Zt-1 & Zt-2, plot P22 only
regress Zt on Zt-1, Zt-2, Zt-3, plot P33 only
- These are partial effects

43
Q

What does the PACF for AR look like?

A

For an AR(j): j non-zero terms, then all later partial autocorrelations are zero.

44
Q

Why is determining PACF for AR easy?

A

Because we already have Zt as a function of Zt-1 etc.

45
Q

If we have an AR2, how do we find P11?

A

P11 comes from a regression of Zt on Zt-1 only, but the true model also contains Zt-2.
Standard omitted-relevant-variable formula:
P11 = phi1 + phi2 [COV(Zt-1, Zt-2) / V(Zt-1)]

46
Q

Why is it harder to find PACF for MA?

A

Because we do NOT already have Zt as a function of Zt-1 etc. We first need to get rid of the unobserved ε terms.

47
Q

Use lag operator to find PACF coefficients for MA1

A
Zt = εt + theta εt-1
Zt = (1 + theta L) εt
εt = [1 / (1 + theta L)] Zt - geometric series with a = 1, r = -theta L
εt = Zt - theta Zt-1 + theta^2 Zt-2 - ...
Zt = theta Zt-1 - theta^2 Zt-2 + ... + εt
P11 = theta + bias
P22 = -theta^2 + bias, etc.
48
Q

How are AR1 and MA1 similar in terms of PACF and ACF?

A

ACF MA = PACF AR

ACF AR = PACF MA
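
A sketch (statsmodels assumed; the 0.7 coefficients, seed-free simulation, and sample size are arbitrary) that makes the symmetry visible: the PACF of a simulated AR1 cuts off like an MA1 ACF, while the PACF of a simulated MA1 decays like an AR1 ACF.

from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

ar1 = ArmaProcess(ar=[1, -0.7]).generate_sample(nsample=5000)   # Zt = 0.7 Zt-1 + eps_t
ma1 = ArmaProcess(ma=[1, 0.7]).generate_sample(nsample=5000)    # Zt = eps_t + 0.7 eps_{t-1}
print(acf(ar1, nlags=4), pacf(ar1, nlags=4))   # ACF decays, PACF cuts off after lag 1
print(acf(ma1, nlags=4), pacf(ma1, nlags=4))   # ACF cuts off after lag 1, PACF decays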

49
Q

When we have serial correlation but NO lagged dependent variable, is OLS unbiased?

A

YES - we only need E(εt | X) = 0 for unbiasedness.

50
Q

So what is the problem with OLS when we have serial correlation but NO lagged dependent variable?

A

The variance estimates are wrong,
because the cross-products COV(εt, εs) ≠ 0.
So OLS gives the wrong SEs, hence wrong t-ratios, and all hypothesis testing is wrong.

51
Q

V(b1) formula when we have no lagged dependent variable but serial correlation

A

V(b1) = sigma^2 sum(t=1 to T) Wt^2 + 2 sum(t=1 to T) sum(s=t+1 to T) Wt Ws gamma s-t

52
Q

Solution to issue with serial correlation with NO lagged dependent variable

A

Use Newey-West “heteroskedasticity and autocorrelation consistent” standard errors (HACSE).
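
A minimal sketch of what this looks like with statsmodels (assumed available; the simulated data, coefficients, seed, and maxlags=4 are illustrative): the point estimates are unchanged, only the standard errors differ.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 200
x = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.6 * e[t - 1] + rng.normal()             # AR1 errors, no lagged dependent variable
y = 1.0 + 2.0 * x + e
X = sm.add_constant(x)

ols = sm.OLS(y, X).fit()                                          # unbiased, but SEs are wrong
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})   # Newey-West (HAC) SEs
print(ols.bse)
print(hac.bse)                                                    # use these for hypothesis testing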

53
Q

When we have serial correlation AND lagged dependent variable, is OLS unbiased?

A

NO: COV(Yt-1, εt-1) ≠ 0, and COV(εt, εt-1) ≠ 0 if we have serial correlation, so COV(Yt-1, εt) ≠ 0.

54
Q

E(b1) formula when we have a lagged dependent variable and serial correlation

A

E(b1) = B1 + COV(Yt-1, εt) / Var(Yt-1)

55
Q

SO: when we have a lagged dependent variable AND serial correlation, is OLS any good?

A

NO - it is BIASED & INCONSISTENT

56
Q

2 tests for detecting serial correlation

A
  1. Breusch-Godfrey test

2. Durbin-Watson statistic

57
Q

What does the Breusch-Godfrey test assume?

A

We assume that we know the form of the serial correlation: εt = phi1 εt-1 + phi2 εt-2 etc. + Rt (a well-behaved error term).

58
Q

εt is unobserved so we use…

A

the residuals et

59
Q

Step 1 of the Breusch-Godfrey test

A

Estimate the original equation by OLS and save residuals:

et = yt - (b0 + b1x1t + b2x2t)

60
Q

Why can we use OLS for step 1 of the Breusch-Godfrey test?

A

Because we test under H0, i.e. we assume no serial correlation, so our OLS coefficients are OK.

61
Q

Step 2 of the Breusch-Godfrey test

A

Regress the residuals on lagged residuals:

et = sum(j=1 to p) phi j et-j + Rt

62
Q

What is the DOF problem with the Breusch-Godfrey test?

A

We do NOT have n observations on our residuals, so Stata gets the DOF wrong.
Because we have lags, there are missing values: if we include lags down to et-p, we only have T - p observations.

63
Q

How do we solve the DOF problem for the Breusch-Godfrey test?

A

regress residuals on lagged residuals AND an intercept & original set of explanatory variables.

64
Q

DOF for the Breusch-Godfrey test

A

DOF = (T - p) - (p + number of parameters from the intercept & original explanatory variables)

65
Q

If we add an additional lagged explanatory variable into the model, how does this affect the DOF for the Breusch-Godfrey test?

A

Each additional lag –> DOF falls by 2.
Because we lose an observation + gain an extra restriction from including the explanatory variable in the test regression.

66
Q

2 test statistics for the Breusch-Godfrey test

A
  1. the usual F-test

2. Lagrange multiplier test

67
Q

F-test statistic for the Breusch-Godfrey test

A

F = [(RSSr - RSSu) / p] / [RSSu / ((T - p) - (p + number of parameters from the original model))]

68
Q

H0 for the Breusch-Godfrey test

A

H0: all phi j = 0 - no serial correlation
H1: any phi j ≠ 0 - serial correlation

69
Q

Lagrange multiplier test for the Breusch-Godfrey test

A

LM = (T - p) R^2
T - p = number of observations on the residuals
R^2 = from the auxiliary regression, i.e. the et test regression in step 2
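
A sketch of the test in statsmodels (assumed available; the simulated regression, seed, and nlags=4 are illustrative): acorr_breusch_godfrey returns both the LM statistic and the F version.

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(4)
T = 200
x = rng.normal(size=T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.5 * e[t - 1] + rng.normal()          # serially correlated errors
y = 1.0 + 2.0 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()         # step 1: OLS on the original equation
lm, lm_pval, f, f_pval = acorr_breusch_godfrey(res, nlags=4)   # p = 4 (e.g. quarterly data)
print(lm, lm_pval, f, f_pval)                     # small p-values -> reject H0 of no serial correlation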

70
Q

What value of P should we choose i.e. how many lags?

A

p = 1 for annual data
p = 4 for quarterly data
p = 12 for monthly data (but this should be given in the question)

71
Q

How does Stata do the F-stat for the Breusch-Godfrey test?

A

F = (chi^2 g / g) / (chi^2 k / k)
As k -> infinity, F -> chi^2 g / g
So Stata does F = chi^2 p / p = LM / p

72
Q

Problem with the Breusch-Godfrey test

A

Low powered, since we assume the form of the serial correlation: we often find no serial correlation when there actually might be some.

73
Q

Durbin-Watson statistic

A

DW = sum(et - et-1)^2 / sum(et^2)
DW ≈ 2(1 - phi1),
where phi1 = coefficient on et-1 when regressing et on et-1.
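
A sketch (numpy/statsmodels assumed; the AR1 "residuals" with coefficient 0.5 are simulated purely for illustration) computing DW both from the formula and with statsmodels:

import numpy as np
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(5)
e = np.zeros(200)
for t in range(1, 200):
    e[t] = 0.5 * e[t - 1] + rng.normal()            # residuals with positive serial correlation

dw = np.sum(np.diff(e) ** 2) / np.sum(e ** 2)       # DW = sum(et - et-1)^2 / sum(et^2)
print(dw, durbin_watson(e))                         # both well below 2, as phi1 > 0 implies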

74
Q

H0 for DW and DW value under H0

A

H0: phi1 = 0 –> DW = 2

75
Q

H1 for DW and DW value under H1

A

H1: phi1 ≠ 0
As phi1 -> 1, DW -> 0
As phi1 -> -1, DW -> 4

76
Q

So DW takes values between

A

0 < DW < 4

77
Q

When do we reject H0 for DW?

A
  1. if the test stat is between 0 and dL, the lower CV (+ve serial correlation)
  2. or if the test stat is between 4-dL and 4 (-ve serial correlation)
78
Q

Inconclusive regions for DW

A
  1. test stat between dL and dU

2. test stat between 4-dU and 4-dL

79
Q

2 regions for do not reject for DW

A
  1. test stat between dU and 2

2. test stat between 2 and 4-dU

80
Q

2 problems with DW test

A
  1. low powered

2. including a lagged dependent variable makes DW biased towards 2 i.e. biased towards accepting H0.

81
Q

What test do we do instead of DW if we have a lagged dependent variable?

A

Durbin’s h test
h = phi sqrt[n / (1 - n S^2)]
S^2 = OLS variance estimate for the coefficient on the lagged dependent variable.
phi = 1st-order autocorrelation coefficient.
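
A tiny numeric sketch of the formula (all inputs - phi = 0.3, S^2 = 0.002, n = 100 - are made up for illustration; the formula only works when n S^2 < 1):

import numpy as np

phi, s2, n = 0.3, 0.002, 100                 # illustrative inputs
h = phi * np.sqrt(n / (1 - n * s2))          # Durbin's h; compare with N(0,1) critical values
print(h)                                     # about 3.35 here -> reject H0 of no serial correlation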

82
Q

CVs from what distribution for Durbin's h test?

A

N(0,1) normal distribution

83
Q

If a question says “data on variables is available from…” what does this mean?

A

This period does not account for observations lost to lags, so the period we can actually estimate the model over will be shorter.

84
Q

If a question says “the model is estimated over…” what does this mean?

A

This period already takes the lags into account, so it is shorter than the period over which the data were collected. If we have 3 lags on X1t, data must have been collected for 3 extra periods (e.g. 3 quarters with quarterly data) before the estimation period as well.

85
Q

Apparent serial correlation can be caused by?

A
An omitted relevant variable.
This makes COV(Vt, Vt-1) ≠ 0 for the false model's error term Vt, since Vt contains the omitted variable.