Cointegration and ECM Flashcards
What is a Spurious Regression?
We have two variables, x and y, that at first glance appear related, but are in fact only connected through a third variable, z
What is the consequence of this relationship between x, y and z?
If we regress y on x we find a significant relationship
But when we control for z the partial effect of x on y becomes zero
Why does the partial effect of x on y become zero when z is introduced?
All of the relevant information is contained in z; x appears to affect y only because both are driven by z
Why are regressions using integrated time series likely to be spurious?
They produce significant relationships even when the variables are unrelated
The regression may include a time trend, but removing the trend may not solve the problem, because there are other stochastic factors driving the series that we do not account for
How do we make a regression non-spurious?
Through differencing or ECM
In a random walk process, will xt and yt be independent processes if their error terms are?
(note: xt = xt-1 + εx,t, yt = yt-1 + εy,t, where εx,t and εy,t are independent white-noise processes)
Yes, xt and yt will also be independent of each other
What is the issue with a spurious regression?
Theoretically, x and y shouldn’t be related because our equations show they don’t affect one another.
However, looking at the table below, we can see xt has a significant t-statistic and affects yt in the OLS regression
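The spurious regression problem can be demonstrated by simulation. Below is a minimal pure-Python sketch (not from the source; the sample size and seed are illustrative): regressing one independent random walk on another routinely produces a "significant" conventional t-statistic even though the two series are unrelated by construction.

```python
import random
import math

random.seed(1)
T = 500

# Two INDEPENDENT random walks: x_t = x_{t-1} + e_t, y_t = y_{t-1} + u_t
x, y = [0.0], [0.0]
for _ in range(T - 1):
    x.append(x[-1] + random.gauss(0, 1))
    y.append(y[-1] + random.gauss(0, 1))

# OLS of y on x with an intercept
mx, my = sum(x) / T, sum(y) / T
sxx = sum((xi - mx) ** 2 for xi in x)
beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
alpha = my - beta * mx

# Conventional (but here invalid) t-statistic on beta
resid = [yi - alpha - beta * xi for xi, yi in zip(x, y)]
s2 = sum(e ** 2 for e in resid) / (T - 2)
t_stat = beta / math.sqrt(s2 / sxx)

# Across repeated simulations, |t| > 2 occurs far more often than the
# nominal 5%, which is exactly the spurious regression phenomenon
print(f"beta = {beta:.3f}, t = {t_stat:.2f}")
```

Differencing both series before regressing (as the next card suggests) restores the usual behaviour of the t-statistic.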
Why can differencing two variables to create a non-spurious regression cause an issue?
It limits the scope of the questions we can answer, because differencing removes the information about the long-run (levels) relationship between the variables
Is the relationship between two cointegrated nonstationary variables of the same order (i.e. I(1)) meaningful?
Yes, because they seem to evolve in a similar fashion over time
Why is cointegration between x and y sometimes useful?
Because even though xt and yt are unit root processes, a linear combination of them, zt = yt - βxt, may be stationary
Will xt∼I(1) and yt∼I(1) return to their initial values?
How does a cointegrating factor, β, affect this?
No; both will tend to wander around and show no systematic tendency to return to their initial values
zt = yt - βxt∼I(0) tends to return to its mean with regularity (mean reversion)
xt and yt are linked in such a way that they do not move too far from one another
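This contrast can be seen in a small pure-Python simulation (not from the source; β = 2 and the data-generating process are assumed for illustration): xt and yt wander, while zt = yt - βxt stays near its mean.

```python
import random
import statistics

random.seed(7)
T = 1000
beta = 2.0  # assumed cointegrating factor

# x_t is a pure random walk, hence I(1)
x = [0.0]
for _ in range(T - 1):
    x.append(x[-1] + random.gauss(0, 1))

# y_t = beta * x_t + u_t with iid N(0,1) noise u_t, so y_t is also I(1)
y = [beta * xi + random.gauss(0, 1) for xi in x]

# z_t = y_t - beta * x_t is just the stationary noise u_t: I(0), mean-reverting
z = [yi - beta * xi for xi, yi in zip(x, y)]

print("range of x :", max(x) - min(x))      # wanders far from its start
print("mean of z  :", statistics.mean(z))   # close to 0
print("stdev of z :", statistics.stdev(z))  # close to 1
```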
How do we test for superconsistency?
- Assume β is unknown
- Estimate it by OLS from the static regression of yt on xt
What does superconsistency demonstrate?
The OLS estimator can generate consistent estimates of β
but the standard OLS formula for the variance of β^ will be incorrect, so we should not use it for hypothesis testing
However, u^t = yt - β^xt can be used to conduct tests of whether the equation errors are stationary
What is the null hypothesis of the Engle-Granger test?
H0: no cointegration, i.e. the residuals u^t contain a unit root
How do we perform the EG test?
Note that step 5. is the same test as an ADF test (to see if there is a unit root in u^t; if we cannot reject the unit root, u^t is nonstationary and we conclude there is no cointegration)
What is the consequence of yt∼I(1) and xt∼I(1) having no cointegration?
What is the consequence of yt∼I(1) and xt∼I(1) being cointegrated?
An example of the EG test in practice
Is it possible to perform the Engle-Granger test for more than two variables?
Yes, but the critical values will change
How do we perform the EG-test with multiple variables?
What issues does the EG-test have?
- In some circumstances it can have low power
- meaning it is difficult to reject the null even when it is false (a type II error)
- Under the alternative the residuals are stationary
- meaning ρ1 < 1
This occurs because the alternative (H1: stationarity) can include values of ρ1 very close to the null of ρ1 = 1
What is the purpose of an ECM?
To describe the short-run dynamics in a way that is consistent with the long-run equilibrium relationship
What is the Engle-Granger (1987) representation theorem?
If the series are cointegrated, there is a valid ECM representation
How do we rewrite this ADL as an ECM?
- Subtract yt-1 from both sides to get Δyt on the LHS and γ3 on the RHS
- Add and subtract α1xt-1 from the RHS to get γ1Δxt and γ2xt-1
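Written out, assuming the ADL(1,1) takes the form yt = γ0 + α1xt + α2xt-1 + α3yt-1 + εt (coefficient names inferred from the steps above; the original slide may label them differently), the two steps give:

```latex
% ADL(1,1)
y_t = \gamma_0 + \alpha_1 x_t + \alpha_2 x_{t-1} + \alpha_3 y_{t-1} + \varepsilon_t
% Step 1: subtract y_{t-1} from both sides
\Delta y_t = \gamma_0 + \alpha_1 x_t + \alpha_2 x_{t-1} + (\alpha_3 - 1)\, y_{t-1} + \varepsilon_t
% Step 2: add and subtract \alpha_1 x_{t-1} on the RHS
\Delta y_t = \gamma_0 + \alpha_1 \Delta x_t + (\alpha_1 + \alpha_2)\, x_{t-1} + (\alpha_3 - 1)\, y_{t-1} + \varepsilon_t
% With \gamma_1 = \alpha_1, \; \gamma_2 = \alpha_1 + \alpha_2, \; \gamma_3 = \alpha_3 - 1:
\Delta y_t = \gamma_0 + \gamma_1 \Delta x_t + \gamma_2 x_{t-1} + \gamma_3 y_{t-1} + \varepsilon_t
           = \gamma_0 + \gamma_1 \Delta x_t + \gamma_3 \left( y_{t-1} - \beta x_{t-1} \right) + \varepsilon_t,
\qquad \beta = -\gamma_2 / \gamma_3
```

Since yt-1 - βxt-1 is I(0) when the series are cointegrated, every regressor in the final line is stationary.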
How else can we identify the long-run relationship between the variables, i.e. the cointegrating linear combination, with a formula?
Where:
yt-1 - βxt-1 is integrated of order 0
This reparameterization of the ECM generates a model where all variables are I(0)
How can we test for cointegration when there is a possibility of generating ECMs?
γ3 is the important value: it is the coefficient on the error-correction term, and a significantly negative γ3 is evidence of cointegration
What is the benefit of using ECM over EG-test?
ECM is a better dynamic specification than the static regression used to generate the residuals in the EG test
This test can also be shown to be more powerful than the EG test
A note when using the alternative approach for ECM discussed before
Note: the coefficient on the lagged residual is the speed of adjustment, i.e. in this model the gap between r3m and r6m closes by 15% each month
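The adjustment arithmetic (an error-correction coefficient of -0.15, taken from the 15% figure in the note above; the starting gap is illustrative) can be checked directly: with no new shocks the disequilibrium shrinks by a factor of 0.85 each month.

```python
# Error-correction coefficient of -0.15: each month 15% of the gap closes
speed = -0.15
gap = 1.0  # initial disequilibrium between r3m and r6m (illustrative)
for month in range(12):
    gap += speed * gap  # gap_t = 0.85 * gap_{t-1}

print(f"gap after 12 months: {gap:.3f}")  # 0.85**12, roughly 0.142
```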
What advantages does the Johansen test have?
- It allows for multiple cointegrating vectors when we have more than two variables in the model
- It has better power than the EG test
- It avoids the normalisation restriction necessary for both the EG and ECM approaches
- for both tests we impose that the coefficient on the current value of y is equal to one
- the result of the respective tests may be sensitive to this restriction
- there is no need to make such an assumption when we use the Johansen test
What are the conclusions we can draw about the rank of the VECM (with two variables) in Johansen test?
- If rank is 0, both variables are I(1) and there is no cointegration
- If rank is 1, both variables are I(1) but there is a cointegrating vector linking them, meaning that there is a linear transformation which is I(0)
- If rank is 2, both variables are I(0) without any further transformation
How does the Johansen test work in theory?
It proceeds sequentially: start by testing H0: rank = 0 (no cointegration) against the alternative of a higher rank; if we reject, move on to H0: rank ≤ 1 against rank = 2, and stop at the first null we cannot reject
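In standard Johansen notation (not taken from the source), the sequence of nulls is tested with the trace statistic, where the λ̂i are the estimated eigenvalues ordered from largest to smallest and k is the number of variables:

```latex
% Trace statistic for H_0: \mathrm{rank} \le r \ \text{vs} \ H_1: \mathrm{rank} > r
\lambda_{\mathrm{trace}}(r) = -T \sum_{i=r+1}^{k} \ln\!\left(1 - \hat{\lambda}_i\right)
```

Small eigenvalues beyond the r-th contribute little to the statistic, which is why a low value of the statistic leads us to accept the corresponding rank.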
How does the Johansen test work in practice?
How many cointegrating relationships can exist in a system of k variables, each of which is I(1)?
At most k-1 (if the rank were k, all k variables would already be I(0))