Time Series Metrics Flashcards

1
Q

Weak stationarity

A

A process is weakly stationary if its first two joint moments are time invariant: constant mean, and autocovariance depending only on the lag. More generally, n-th order stationarity requires all joint moments up to order n to be time invariant (1: mean, 2: autocovariance, etc.).

2
Q

Strict stationarity

A

F(yt, yt+1, …, yt+p) = F(ys, ys+1, …, ys+p) for all t, s and p. The process follows the exact same joint distribution at all points in time!

3
Q

Check for AR stability

A

Write the lag polynomial (ignoring the constant) and check that all roots lie outside the unit circle.
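A minimal numpy sketch of this check (the AR(2) coefficients below are hypothetical): build the lag polynomial 1 - eta1 L - eta2 L^2 and verify every root has modulus greater than 1.

```python
import numpy as np

# Hypothetical AR(2): Yt = 0.5*Yt-1 + 0.3*Yt-2 + epsilon t
eta = [0.5, 0.3]

# Lag polynomial 1 - 0.5L - 0.3L^2; np.roots wants highest power first
roots = np.roots([-eta[1], -eta[0], 1.0])

# Stable iff every root lies outside the unit circle
print(roots, np.all(np.abs(roots) > 1.0))
```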

4
Q

Serial correlation tests

A

Durbin-Watson (tests first order),
Breusch-Godfrey (more general)

5
Q

Durbin Watson hypothesis

A

Test H0: φ = 0, where et = φet-1 + ut.
Tests whether yesterday's error and today's error are related.

6
Q

Unbiasedness condition

A

Strict exogeneity (often implausible).

7
Q

Consistency condition

A

Only contemporaneous exogeneity (weaker than strict exogeneity).

8
Q

Weak dependence

A

Time dependence dies out over time: the autocovariance tends to 0 as tau goes to infinity.

9
Q

Asymptotic normality requires

A

Contemporaneous homoskedasticity, no serial correlation, weak dependence, contemporaneous exogeneity.

10
Q

Autocovariance function

A

Cov(Yt, Yt-tau).
Depends only on tau if we have covariance stationarity.

11
Q

Autocorrelation function

A

Corr, so Cov(Yt, Yt-tau)/(root of the variance of each).
This is autocov(tau)/autocov(0).
Draw as a correlogram.
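A small numpy sketch of the sample version, assuming the usual biased estimator that divides each autocovariance by T:

```python
import numpy as np

def sample_acf(y, max_tau):
    """Sample autocorrelations: autocovhat(tau) / autocovhat(0)."""
    y = np.asarray(y) - np.mean(y)
    T = len(y)
    acov = np.array([y[tau:] @ y[:T - tau] / T for tau in range(max_tau + 1)])
    return acov / acov[0]

# On white noise the ACF should be near 0 at every lag tau >= 1
rng = np.random.default_rng(0)
print(sample_acf(rng.standard_normal(500), max_tau=5))
```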

12
Q

Test for white noise?

A

Q stats

13
Q

Box-Pierce Q stat

A

Check if the first m autocorrelations are jointly 0.
Under the white noise null, T * Sum from tau = 1 to m of Autocorrelationhat(tau) squared ~a~ Chi squared m.

14
Q

Ljung-Box Q stat

A

T(T+2) * Sum from tau = 1 to m of Autocorrelationhat(tau) squared / (T - tau) ~a~ Chi squared m. Optimal m is roughly root T: a trade-off between testing enough autocorrelations and not including poorly estimated ones!
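A sketch computing both Q statistics from the formulas on these two cards (scipy assumed for the chi-squared p-value; statsmodels' acorr_ljungbox provides the same tests ready-made):

```python
import numpy as np
from scipy.stats import chi2

def q_stats(y, m):
    """Box-Pierce and Ljung-Box statistics for the first m autocorrelations."""
    y = np.asarray(y) - np.mean(y)
    T = len(y)
    acov0 = y @ y / T
    rho = np.array([(y[tau:] @ y[:T - tau] / T) / acov0
                    for tau in range(1, m + 1)])
    bp = T * np.sum(rho ** 2)                                        # ~a~ chi2(m)
    lb = T * (T + 2) * np.sum(rho ** 2 / (T - np.arange(1, m + 1)))  # ~a~ chi2(m)
    return bp, lb, chi2.sf(lb, df=m)

rng = np.random.default_rng(0)
y = rng.standard_normal(400)
print(q_stats(y, m=int(np.sqrt(len(y)))))  # m roughly root T
```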

15
Q

AR(p)

A

A weighted sum of the past p realisations of Yt, plus an error.

16
Q

MA(q)

A

Moving average (weighted) of past errors

17
Q

Lag polynomial

A

Important! L^mYt=Yt-m

18
Q

AR(1) properties

A

Mean = 0.
Var = sigma^2 * Sum of eta^2i = sigma^2 / (1 - eta^2). This is because it's the sum of the variances of the shocks! Remember the coefficient is squared on the bottom when we take it out of the variance!
Check you can derive this.
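A quick simulation check of the variance formula (the values of eta and sigma are hypothetical):

```python
import numpy as np

# Stable AR(1): Yt = eta*Yt-1 + epsilon t, with |eta| < 1
rng = np.random.default_rng(1)
eta, sigma, T = 0.7, 1.0, 200_000
eps = rng.normal(0.0, sigma, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = eta * y[t - 1] + eps[t]

# Sample variance should be close to sigma^2 / (1 - eta^2) ~ 1.96
print(y.var(), sigma ** 2 / (1 - eta ** 2))
```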

19
Q

AR(1) Autocov

A

Derive!
Express Yt in terms of past errors to get autocov(tau) = eta^tau * Var(Yt).

20
Q

When is AR(p) stable?

A

If all roots of the lag polynomial lie outside the unit circle!

21
Q

Moving average

A

Past shocks, not past observations

22
Q

MA(1) properties

A

Mean = 0, Var = (1 + theta^2) sigma^2.
Derive!
Autocovariances are zero beyond lag 1, so an MA(1) is covariance stationary and weakly dependent; |theta| < 1 is what makes a shock's influence die out in the inverted representation, i.e. it matters for invertibility.

23
Q

MA: Invertible?

A

If mod theta < 1. Then we can back out the values of the shocks from the observations!

24
Q

MA(q) invertible?

A

Again, all roots of the lag polynomial must lie outside the unit circle!
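A sketch of the check using statsmodels (the MA(2) coefficients are hypothetical); ArmaProcess takes the full lag polynomials including the leading 1:

```python
from statsmodels.tsa.arima_process import ArmaProcess

# Hypothetical MA(2): Yt = epsilon t + 0.4*epsilon t-1 + 0.2*epsilon t-2
proc = ArmaProcess(ar=[1.0], ma=[1.0, 0.4, 0.2])

print(proc.maroots)       # roots of 1 + 0.4L + 0.2L^2
print(proc.isinvertible)  # True iff all roots lie outside the unit circle
```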

25
Q

Wold's theorem

A

If Yt is any zero-mean covariance stationary process, then it can be represented as Yt = B(L)epsilon t, where epsilon t is white noise.

26
Q

ARMA

A

Elements of both AR and MA.

27
Q

Motivation for ARMA

A

Could arise from aggregation: e.g. sums of AR processes are ARMA. Often a better approximation than a pure AR or MA with the same number of parameters!

28
Q

Deterministic behaviour wrt time

A

Trends, seasonality. These violate stationarity! If the process is trend stationary, removing the trend leaves a stationary process.

29
Q

Forecasting

A

Forecast Yt+h given information at t: minimise the forecast error! The feasible forecast uses the estimates beta0hat, beta1hat!

30
Q

Why can't we just min MSE for forecast?

A

Minimising MSE uses the sample errors, so we would see overfitting to the data. Thus use information criteria to weigh a better fit against a less complex model. The terms in front of the MSE are a penalty for more parameters!

31
Q

Akaike Info Criterion

A

Min e^(2k/T) MSE, where k is the number of parameters and T the sample size.

32
Q

Bayesian IC

A

Min e^(k ln(T)/T) MSE. Penalises extra parameters more heavily than the AIC once ln(T) > 2.

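A sketch applying the two criteria exactly as written on these cards to pick an AR order by OLS (the data and maximum order are hypothetical):

```python
import numpy as np

def info_criteria(y, max_p):
    """For each AR order p, fit by OLS and return (AIC, BIC) in the
    e^(2k/T)*MSE and e^(k*ln(T)/T)*MSE forms, with k = p + 1 parameters."""
    T = len(y)
    out = {}
    for p in range(1, max_p + 1):
        X = np.column_stack([np.ones(T - p)] +
                            [y[p - j:T - j] for j in range(1, p + 1)])
        beta = np.linalg.lstsq(X, y[p:], rcond=None)[0]
        resid = y[p:] - X @ beta
        mse, k = resid @ resid / len(resid), p + 1
        out[p] = (np.exp(2 * k / T) * mse, np.exp(k * np.log(T) / T) * mse)
    return out  # choose the p that minimises the chosen criterion

rng = np.random.default_rng(2)
eps = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.6 * y[t - 1] + eps[t]  # true order is 1
print(info_criteria(y, max_p=4))
```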
33
Q

Seasonality

A

Regression on seasonal dummies.

34
Q

Serial correlation

A

E(epsilon t * epsilon s given Xt, Xs) =/= 0 for some t =/= s!

35
Q

Problem of serial correlation

A

Violates the no-serial-correlation assumption, so the variance of the estimators is wrong and we get artificially small standard errors. Similar to heteroskedasticity.

36
Q

Durbin Watson Test Stat

A

DW is approximately 2(1 - corr(epsilon t, epsilon t-1)). Check in relation to DL, DU, 4-DU, 4-DL. Remember: if extreme, reject H0; intermediate, inconclusive; in the middle, do not reject.

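A minimal sketch using statsmodels' ready-made statistic on the residuals of a hypothetical OLS fit:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(3)
x = rng.standard_normal(200)
y = 1.0 + 2.0 * x + rng.standard_normal(200)
res = sm.OLS(y, sm.add_constant(x)).fit()

# DW ~ 2(1 - rhohat): values near 2 suggest no first-order serial correlation
print(durbin_watson(res.resid))
```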
37
Q

Breusch Godfrey process

A

OLS of Yt on the Xt's, giving fitted residuals; then OLS of the fitted residuals on the Xt's and their lags. H0: the lag coefficients are jointly 0. Use an F test, or LM stat = (T - #lags) R^2 ~a~ Chi squared (#lags).

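A sketch of the test via statsmodels (AR(1) errors are built in deliberately so the test has something to detect):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(4)
x = rng.standard_normal(300)
u = rng.standard_normal(300)
e = np.zeros(300)
for t in range(1, 300):
    e[t] = 0.5 * e[t - 1] + u[t]  # serially correlated errors
y = 1.0 + 2.0 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
lm, lm_pval, f, f_pval = acorr_breusch_godfrey(res, nlags=4)
print(lm, lm_pval)  # small p-value: reject H0 of no serial correlation
```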
38
Q

BG adv/assumptions

A

Adv: relaxes strict exogeneity and normally distributed errors. Assumes: contemporaneous exogeneity, conditional homoskedasticity.

39
Q

Positive serial correlation result

A

The default formulas underestimate the true conditional variance! E.g. Var(betahat given X) depends positively on the covariances!

40
Q

Correction for serial correlation?

A

Heteroskedasticity- and autocorrelation-consistent (HAC) standard errors.

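A sketch comparing default and HAC (Newey-West) standard errors in statsmodels; with positively autocorrelated errors the HAC ones are typically larger:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
x = rng.standard_normal(300)
u = rng.standard_normal(300)
e = np.zeros(300)
for t in range(1, 300):
    e[t] = 0.5 * e[t - 1] + u[t]  # serially correlated errors
y = 1.0 + 2.0 * x + e

X = sm.add_constant(x)
naive = sm.OLS(y, X).fit()
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(naive.bse)  # misleadingly small default standard errors
print(hac.bse)    # autocorrelation-robust standard errors
```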
41
Q

When can we recover epsilon t?

A

When the process is stable and invertible.

42
Q

Why is forecast error underestimated?

A

We use estimates of the true parameters. We just have to hope the effect of this is small.

43
Q

Forecast AR/ARMA?

A

Use the chain rule of forecasting! If we forecast far enough ahead, the MA part will disappear!

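A minimal sketch of the chain rule for an AR(1) (all parameter values hypothetical): set future errors to their expectation of zero and feed each forecast into the next step.

```python
def ar1_forecast(y_T, c, eta, h):
    """h-step-ahead chain-rule forecasts for Yt = c + eta*Yt-1 + epsilon t."""
    f, path = y_T, []
    for _ in range(h):
        f = c + eta * f  # E[Yt+j given info at t]; future errors set to 0
        path.append(f)
    return path

# Forecasts decay toward the unconditional mean c / (1 - eta) = 1.0
print(ar1_forecast(y_T=2.0, c=0.3, eta=0.7, h=5))
```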
44
Q

Non stationarity meaning?

A

E(Yt) =/= E(Yt+h): the moments depend on time.

45
Q

Rand walk with drift

A

Yt = Yt-1 + gamma + error. Derive what this means for Yt (back-substitution gives Y0 + gamma*t + the sum of all past errors), and hence its error and expectation! Test with Dickey-Fuller.

46
Q

Integrated series

A

Yt is I(d) if differencing d times leads to stationarity, i.e. an I(0) series!!

47
Q

Problem of rand walk

A

An AR process with eta = 1. E(etahat) is roughly 1 - (5.3/T)! Biased (but consistent at least).

48
Q

Test for unit roots

A

Dickey Fuller test.

49
Q

DF test

A

Yt = alpha + (1 + beta)Yt-1 + error, i.e. Delta Yt = alpha + beta Yt-1 + error. Compare betahat/se(betahat) with the DF critical values! One-sided test with H1: beta < 0. Crucially, there are critical values for: no alpha, no trend / alpha, no trend / alpha and trend! Recent research suggests we should consider more than just a linear time trend!

50
Q

Augmented DF

A

Adjusts for possible serial correlation by including lags of the change in the dependent variable.

51
Q

Augmented DF set up?

A

Delta Yt = alpha + gamma*t + beta Yt-1 + sum of p lagged differences + error. The t stat is betahat / se(betahat).

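A sketch using statsmodels, run on a simulated random walk so the null should not be rejected; regression="c" includes alpha only, "ct" adds the linear trend:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(6)
y = np.cumsum(rng.standard_normal(500))  # random walk: has a unit root

stat, pval, usedlag, nobs, crit, _ = adfuller(y, regression="c")
print(stat, pval, crit)  # expect a large p-value: do not reject the unit root
```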
52
Q

If we are in AR(p)

A

Remember we must test whether the sum of ALL lag coefficients = 1.

53
Q

Augmented DF issue

A

Sluggish mean reversion is difficult to distinguish from a unit root!

54
Q

To empirically test I(1) we must establish

A

Do not reject that Yt has a unit root, and reject that Delta Yt has a unit root.

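The two-step check as a sketch, reusing the ADF test above on a series that is I(1) by construction:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
y = np.cumsum(rng.standard_normal(500))  # I(1) by construction

print(adfuller(y)[1])           # p-value on the level: large, DNR unit root
print(adfuller(np.diff(y))[1])  # p-value on the difference: small, reject
```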
55
Q

Distributed lag model and assumptions for causal

A

Considers lags of Xt; we should distinguish immediate vs long-run impacts! For a causal interpretation: Xt, Yt stationary and weakly dependent; strict exogeneity.

56
Q

ADL(p,q)

A

Lags of both X and Y. Check you can derive the immediate vs h-period vs long-run (this is for a permanent change) impact of a change in X, e.g. for an ADL(1,1).

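As a worked sketch with hypothetical coefficient names, take an ADL(1,1): Yt = a + phi Yt-1 + b0 Xt + b1 Xt-1 + epsilon t. The immediate impact of a unit change in Xt is b0; the impact after h periods is phi^h b0 + phi^(h-1) b1; and for a permanent change the long-run impact is (b0 + b1)/(1 - phi), found by setting Y* = a + phi Y* + (b0 + b1)X* in the steady state.
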
57
Q

What is the problem if Yt, Xt both have a time trend?

A

Spurious correlation (Yule 1926). If we just run OLS, we do not obtain consistent results!

58
Q

AR(p) autocovariance

A

Separate Yt into its constituent parts, then take the covariance of each bit. This will be an infinite series!