Time Series Metrics Flashcards
Weak stationarity
nth-order stationarity: all joint moments up to order n are time invariant (1: mean, 2: autocovariance, etc.). Weak (covariance) stationarity is just mean and autocovariance stationary: E(Yt) constant and Cov(Yt,Yt-tau) depending only on tau.
Strict stationarity
F(yt,yt+1…yt+p)=F(ys,ys+1…ys+p) for all t, s and p. Follows the exact same joint distribution at all points in time!
Check for AR stability
Form the lag polynomial (ignore the constant) and check all roots lie outside the unit circle
Serial correlation tests
Durbin-Watson (tests first order),
Breusch-Godfrey (more general)
Durbin Watson hypothesis
Test H0: φ=0, where et=φet-1+ut.
Test if error yesterday and today are related
Unbiasedness condition
Strict exogeneity (often unlikely)
Consistency condition
Just contemporaneous exog
Weak dependence
Time dependence dies out with time: autocovariance goes to 0 as tau goes to infinity
Asymptotic normality requires
Contemporaneous homoskedasticity, no serial correlation, weak dependence, contemporaneous exogeneity
Autocovariance function
Cov(Yt,Yt-tau)
Or just depends on tau if we have covariance stationarity
Autocorrelation function
Corr, so Cov(Yt,Yt-tau)/(root of the variance of each)
This is autocov(tau)/autocov(0)
Draw as a correlogram
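A minimal numpy sketch of the sample ACF (the function name `sample_acf` is my own, not from any library):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample ACF: rho_hat(tau) = gamma_hat(tau) / gamma_hat(0)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    dev = y - y.mean()
    gamma0 = np.sum(dev**2) / T
    return np.array([np.sum(dev[tau:] * dev[:-tau]) / T / gamma0
                     for tau in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
white = rng.standard_normal(5000)
print(sample_acf(white, 5))             # all near 0 for white noise
print(sample_acf(np.cumsum(white), 1))  # near 1 for a highly persistent series
```

Plotting these values against tau gives the correlogram.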
Test for white noise?
Q stats
Box Pierce Q stat
Check if first m autocorrelations are jointly 0.
Under H0 of white noise, T*Sum from tau=1 to m of Autocorrelationhat(tau) squared ~a~ Chi squared m.
Ljung-Box Q stat
T(T+2)*Sum from tau=1 to m of Autocorrelationhat(tau) squared/(T-tau) ~a~ Chi squared m. Optimal m is roughly root T. Trade off of testing enough autocorrelations against including poorly estimated ones!
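The Ljung-Box statistic can be computed by hand from the sample autocorrelations; a sketch (series and m are made up):

```python
import numpy as np

def ljung_box_q(y, m):
    """Ljung-Box: Q = T(T+2) * sum_{tau=1}^m rho_hat(tau)^2 / (T - tau)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    dev = y - y.mean()
    gamma0 = np.sum(dev**2) / T
    q = 0.0
    for tau in range(1, m + 1):
        rho = np.sum(dev[tau:] * dev[:-tau]) / T / gamma0
        q += rho**2 / (T - tau)
    return T * (T + 2) * q

rng = np.random.default_rng(1)
q_white = ljung_box_q(rng.standard_normal(500), m=5)  # small under H0
q_trend = ljung_box_q(np.arange(100.0), m=3)          # huge for a trending series
# Under H0 of white noise, Q ~a~ Chi squared m; 5% crit for m=5 is about 11.07
print(q_white, q_trend)
```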
AR(p)
Some weight on past realisations of Yt
MA(q)
Moving average (weighted) of past errors
Lag polynomial
Important! L^mYt=Yt-m
AR(1) properties
Mean =0
Var = sigma^2 * Sum of eta^2i
= sigma^2/(1-eta^2). This is because it's the sum of the variances of the shocks! Remember the coefficient is squared when it comes out of the variance, which is why eta^2 appears on the bottom!
Check you can derive
AR(1) Autocov
Derive!
Express Yt in terms of current and past errors and you get: autocov(tau) = eta^tau * Var(Yt)
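Both AR(1) formulas can be checked by simulation; a sketch with made-up eta and sigma:

```python
import numpy as np

# Simulate Y_t = eta*Y_{t-1} + eps_t and check Var = sigma^2/(1-eta^2)
# and autocov(1) = eta * Var. Parameter values here are made up.
rng = np.random.default_rng(42)
eta, sigma, T = 0.5, 1.0, 200_000
eps = rng.normal(0.0, sigma, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = eta * y[t - 1] + eps[t]

var_theory = sigma**2 / (1 - eta**2)   # = 4/3
dev = y - y.mean()
cov1 = np.mean(dev[1:] * dev[:-1])     # sample autocovariance at tau = 1
print(y.var(), var_theory)             # both approx 1.333
print(cov1, eta * var_theory)          # both approx 0.667
```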
When is AR(p) stable?
If all roots of the lag polynomial lie outside the unit circle!
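The root check is mechanical with numpy (the same check, applied to the MA lag polynomial, gives invertibility); coefficients below are made up:

```python
import numpy as np

# Stability of an AR(2): solve 1 - phi1*L - phi2*L^2 = 0.
# np.roots wants coefficients from the highest power down: [-phi2, -phi1, 1].
phi1, phi2 = 0.5, 0.3
roots = np.roots([-phi2, -phi1, 1.0])
print(roots, np.all(np.abs(roots) > 1))  # stable: all roots outside the unit circle

# eta = 1.1 gives 1 - 1.1L = 0, root 1/1.1 inside the circle: explosive
print(np.abs(np.roots([-1.1, 1.0])))
```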
Moving average
Past shocks, not past observations
MA(1) properties
Mean=0, Var = (1+theta^2)sigma^2
Derive!
An MA(1) is cov stationary and weakly dependent for ANY theta (a shock only ever affects two periods). Mod theta < 1 matters for invertibility, not stationarity!
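The MA(1) moments can also be checked by simulation; a sketch with made-up theta and sigma:

```python
import numpy as np

# MA(1): Y_t = eps_t + theta*eps_{t-1}. Check Var = (1+theta^2)*sigma^2 and
# autocov(1) = theta*sigma^2 (autocovariances beyond lag 1 are exactly 0).
rng = np.random.default_rng(7)
theta, sigma, T = 0.4, 1.0, 200_000
eps = rng.normal(0.0, sigma, T + 1)
y = eps[1:] + theta * eps[:-1]

dev = y - y.mean()
cov1 = np.mean(dev[1:] * dev[:-1])
print(y.var(), (1 + theta**2) * sigma**2)  # both approx 1.16
print(cov1, theta * sigma**2)              # both approx 0.40
```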
MA: Invertible?
If mod theta < 1. We can work out the values of shocks from observations!
MA(q) invertible?
Again lag polynomial solutions all lie outside unit circle!
Wold’s theorem
If Yt is any zero mean cov stationary process, then it can be represented as Yt = B(L)epsilont (plus a deterministic part).
ARMA
Elements of both AR and MA
Motivation for ARMA
Could arise from aggregation: eg the sum of AR processes is an ARMA.
Often better approximation than AR or MA for same number of parameters!
Deterministic behaviour wrt time
Trends, seasonality. These violate stationarity! If trend stationary, removing the trend leaves a stationary process
Forecasting
Yt+h given t: minimise expected squared forecast error! The feasible forecast uses beta0hat, beta1hat in place of the true parameters!
Why can’t we just min MSE for forecast?
In-sample MSE would be minimised by overfitting the data. Thus use information criteria to trade off a better fit against a less complex model. The term in front of MSE is a penalty for more parameters!
Akaike Info Criterion
Min e^(2k/T) MSE
Bayesian IC
Min e^(kln(T)/T) MSE
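A sketch of order selection with these multiplicative criteria, fitting AR(k) by OLS on simulated data (the data-generating AR(2) and the helper `ar_mse` are made up for illustration):

```python
import numpy as np

# Fit AR(k) for several k and compare AIC: exp(2k/T)*MSE, BIC: exp(k*ln(T)/T)*MSE.
rng = np.random.default_rng(3)
T = 500
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + eps[t]   # true order is 2

def ar_mse(y, k):
    """In-sample MSE from OLS of Y_t on a constant and k lags (hypothetical helper)."""
    Y = y[k:]
    X = np.column_stack([np.ones(len(Y))] +
                        [y[k - j: len(y) - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return np.mean((Y - X @ beta) ** 2)

aic = {k: np.exp(2 * k / T) * ar_mse(y, k) for k in range(1, 5)}
bic = {k: np.exp(k * np.log(T) / T) * ar_mse(y, k) for k in range(1, 5)}
print(min(aic, key=aic.get), min(bic, key=bic.get))  # both should favour order 2 here
```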
Seasonality
Regression on seasonal dummies
Serial correlation
E(epsilont*epsilons given Xt,Xs) =/= 0 for some t =/= s!
Problem of serial correlation
Violates the no serial correlation assumption, thus the usual variance formula for the estimators is wrong and we get artificially small s.e. Similar to hetero
Durbin Watson Test Stat
DW = Sum of (et-et-1)^2 / Sum of et^2, roughly 2(1-corr(et,et-1)), from the fitted residuals
Check in relation to DL, DU, 4-DU, 4-DL. Remember if extreme (below DL or above 4-DL): reject H0. Between DL and DU (or 4-DU and 4-DL): inconclusive.
In the middle (DU to 4-DU): DNR
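A sketch of the statistic on simulated residuals (residual series here are made up):

```python
import numpy as np

# DW = sum (e_t - e_{t-1})^2 / sum e_t^2, approximately 2*(1 - rho_hat(1))
def durbin_watson(e):
    e = np.asarray(e, dtype=float)
    return np.sum(np.diff(e) ** 2) / np.sum(e**2)

rng = np.random.default_rng(5)
e_white = rng.standard_normal(2000)       # no serial correlation: DW near 2
e_pos = np.zeros(2000)
u = rng.standard_normal(2000)
for t in range(1, 2000):
    e_pos[t] = 0.8 * e_pos[t - 1] + u[t]  # positive serial correlation: DW well below 2
print(durbin_watson(e_white), durbin_watson(e_pos))
```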
Breusch Godfrey process
OLS of Yt on the Xit's, keeping the fitted residuals
OLS of the fitted residuals on the regressors and p lags of the residuals.
H0: the lag coefficients are jointly 0.
Use an F test, or the LM stat = (T-p)R^2 ~a~ Chi squared p
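A hand-rolled sketch of the two-step procedure with p = 1 lag (data-generating process and helper name are made up, not library code):

```python
import numpy as np

def ols_resid_r2(X, y):
    """OLS by least squares; returns residuals and R^2 (X must include an intercept)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ beta
    return e, 1 - e.var() / y.var()

rng = np.random.default_rng(9)
T = 1000
x = rng.standard_normal(T)
u = rng.standard_normal(T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.6 * eps[t - 1] + u[t]   # serially correlated errors by construction
y = 1.0 + 2.0 * x + eps

# Step 1: OLS of y on a constant and x, keep the residuals
e, _ = ols_resid_r2(np.column_stack([np.ones(T), x]), y)
# Step 2: OLS of the residuals on the regressors and one lag of the residuals
_, r2 = ols_resid_r2(np.column_stack([np.ones(T - 1), x[1:], e[:-1]]), e[1:])
lm = (T - 1) * r2                      # compare to Chi squared 1 (5% crit 3.84)
print(lm)                              # large here, so reject no serial correlation
```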
BG adv/assumptions
Adv - Relaxes strict exogeneity and normally distributed errors
Assumes - Contemporaneous exogeneity, conditional homoskedasticity.
Positive serial correlation result
Default formulas underestimate true conditional variance!
Eg Var(betahat given X) depends positively on the error covariances!
Correction for serial correlation?
Heteroskedasticity and autocorrelation consistent (HAC) s.e., eg Newey-West
When can we recover epsilont
Stable, invertible
Why is forecast error underestimated?
We use estimates of the true parameters, so there is extra estimation error. Just have to hope the effect of this is small
Forecast AR/ ARMA?
Use chain rule of forecasting! If we go back far enough, MA will disappear!
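For an AR(1) the chain rule collapses to multiplying by eta each step, since every future error has expectation zero; a minimal sketch with made-up numbers:

```python
# Chain rule of forecasting for an AR(1) with known eta:
# yhat_{t+1|t} = eta*y_t, and each further step multiplies by eta again,
# so yhat_{t+h|t} = eta^h * y_t.
def ar1_forecast(y_t, eta, h):
    f = y_t
    for _ in range(h):
        f = eta * f
    return f

print(ar1_forecast(10.0, 0.5, 3))  # 10 * 0.5^3 = 1.25
```

For an ARMA, the MA terms drop out of the forecast once the horizon exceeds q.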
Non stationarity meaning?
E(Yt)=/=E(Yt+h)
Rand walk with drift:
Yt=Yt-1 + gamma + error.
Derive what this implies for Yt, its error and its expectation!
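Iterating back gives Yt = Y0 + gamma*t + sum of shocks; a simulation sketch (gamma, sigma and t are made up) checking the implied mean and variance:

```python
import numpy as np

# Y_t = Y_{t-1} + gamma + eps_t iterates back to Y_t = Y_0 + gamma*t + sum of shocks,
# so E(Y_t) = Y_0 + gamma*t and Var(Y_t) = t*sigma^2 (variance grows: not stationary).
rng = np.random.default_rng(11)
gamma, sigma, t, n_sims = 0.2, 1.0, 100, 20_000
shocks = rng.normal(0.0, sigma, (n_sims, t))
y_t = gamma * t + shocks.sum(axis=1)  # many simulated values of Y_t, with Y_0 = 0
print(y_t.mean(), y_t.var())          # approx gamma*t = 20 and t*sigma^2 = 100
```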
Integrated series
I(d): differencing d times leads to stationarity!! I(0) is already stationary; I(1) needs one difference.
Problem of rand walk
AR process with eta = 1.
E(etahat) is roughly 1-(5.3/T)! Biased (but at least consistent)
Test for unit roots
Dickey Fuller test
DF test
Yt=alpha+(1+beta)Yt-1 + error
Betahat/se(betahat) vs DF crit!
One sided test of H0: beta=0 against H1: beta<0.
Crucially, there are crit values for no alpha, no trend / alpha, no trend / alpha and trend!
Recent research suggests we should consider more than just a linear time trend!
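A hand-rolled sketch of the DF regression (constant, no trend) and its t stat on simulated series (the function `df_tstat` is my own; the DF 5% critical value with a constant is about -2.86):

```python
import numpy as np

def df_tstat(y):
    """DF regression Delta y_t = alpha + beta*y_{t-1} + e_t; returns betahat/se(betahat)."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    e = dy - X @ beta
    s2 = e @ e / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(13)
walk = np.cumsum(rng.standard_normal(1000))  # unit root: t stat should be moderate
ar = np.zeros(1000)
eps = rng.standard_normal(1000)
for t in range(1, 1000):
    ar[t] = 0.5 * ar[t - 1] + eps[t]         # stationary: t stat very negative
print(df_tstat(walk), df_tstat(ar))
```

Remember the t stat is compared to DF critical values, not the usual normal ones.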
Augmented DF
Adjusts for possible serial correlation by including lags of change in dependent variable
Augmented DF set up?
Delta Yt = alpha + gamma*t + beta*Yt-1 + Sum of p lagged differences of Yt, plus error
t stat is betahat / se(betahat)
If we are in AR(p)
Remember we must test whether the sum of ALL the lag coefficients = 1
Augmented DF issue
Sluggish mean reversion is difficult to distinguish from unit root!
To empirically test I(1) we must establish
DNR that Yt has a unit root, and Reject that delta Yt has a unit root
Distributed lag model and assumptions for causal
Considers lags of Xt. We should consider immediate vs long run impacts!
- Xt, Yt stationary and weakly dependent
- Strict exog
ADL(p,q)
Lags of both X,Y.
Check you can derive the immediate vs h period vs LR impact (LR is for a permanent change) of a change in X for eg an ADL(1,1)
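The ADL(1,1) multipliers can be worked out numerically; a sketch with made-up coefficients:

```python
# ADL(1,1): y_t = phi*y_{t-1} + b0*x_t + b1*x_{t-1} + e_t  (coefficients made up)
phi, b0, b1 = 0.5, 1.0, 0.3

impact = b0                       # immediate impact of a unit change in x_t
m = [b0, phi * b0 + b1]           # dynamic multipliers at h = 0 and h = 1
for _ in range(200):
    m.append(phi * m[-1])         # for h >= 2 each multiplier is phi times the last
long_run = (b0 + b1) / (1 - phi)  # long-run impact of a permanent unit change
print(impact, sum(m), long_run)   # cumulative multipliers converge to the long run
```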
What is problem if Yt,Xt both have a time trend?
Spurious regression (Yule 1926). If we just run OLS on trending series, we do not obtain consistent results!
AR(p) autocovariance
Separate Yt into its constituent parts, then take the covariance of each bit. Will be an infinite series!