Time Series Metrics Flashcards

1
Q

Weak stationarity

A

All joint moments up to order n are time invariant (1st: mean, 2nd: autocovariance, etc.). Weak stationarity requires only that the mean and the autocovariances are time invariant.

2
Q

Strict stationarity

A

F(Yt, Yt+1, …, Yt+p) = F(Ys, Ys+1, …, Ys+p) for all t, s and p. The series follows exactly the same joint distribution at all points in time!

3
Q

Check for AR stability

A

Write out the lag polynomial (ignoring the constant) and solve for its roots; the AR process is stable if all roots lie outside the unit circle.

4
Q

Serial correlation tests

A

Durbin-Watson (tests first order),
Breusch-Godfrey (more general)

5
Q

Durbin Watson hypothesis

A

Test H0: φ=0 in et = φet-1 + ut.
Tests whether yesterday's error and today's error are related.

6
Q

Unbiasedness condition

A

Strict exogeneity (often unlikely)

7
Q

Consistency condition

A

Only contemporaneous exogeneity.

8
Q

Weak dependence

A

Time dependence dies out: the autocovariance at lag tau goes to 0 as tau goes to infinity.

9
Q

Asymptotic normality requires

A

Contemporaneous homoskedasticity, no serial correlation, weak dependence, contemporaneous exogeneity.

10
Q

Autocovariance function

A

autocov(tau) = Cov(Yt, Yt-tau).
Under covariance stationarity it depends only on the lag tau, not on t.

11
Q

Autocorrelation function

A

Corr(Yt, Yt-tau) = Cov(Yt, Yt-tau) / (root of the variance of each).
This is autocov(tau)/autocov(0).
Draw it against tau as a correlogram.
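
An illustrative sketch in Python (assuming numpy, statsmodels and matplotlib are installed; the simulated AR(1) series and the 20-lag choice are just examples, not part of the definition):

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.tsa.stattools import acf
from statsmodels.graphics.tsaplots import plot_acf

# Illustrative data: AR(1) with eta = 0.6, so the ACF should decay roughly like 0.6^tau.
rng = np.random.default_rng(0)
T = 500
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + eps[t]

rho = acf(y, nlags=20)      # rho[tau] = autocov(tau) / autocov(0)
print(rho[:5])

plot_acf(y, lags=20)        # correlogram with approximate confidence bands
plt.show()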

12
Q

Test for white noise?

A

Q stats

13
Q

Box Pierce Q stat

A

Checks whether the first m autocorrelations are jointly 0 (a white noise test).
Q = T * Sum from tau=1 to m of autocorrelationhat(tau) squared ~a~ Chi squared(m).

14
Q

Ljung-Box Q stat

A

Q = T(T+2) * Sum from tau=1 to m of autocorrelationhat(tau) squared / (T-tau) ~a~ Chi squared(m). Optimal m is roughly root T: a trade-off between testing enough autocorrelations and not including poorly estimated ones!
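
An illustrative sketch of both Q statistics using statsmodels (the white-noise "residual" series and the choice of m roughly root T are just examples):

import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
resid = rng.standard_normal(400)      # illustrative residuals that really are white noise
m = int(np.sqrt(len(resid)))          # rule of thumb: m roughly root T

# In recent statsmodels this returns a table with the Ljung-Box and (with boxpierce=True)
# Box-Pierce statistics and their p-values; large p-values mean do not reject white noise.
print(acorr_ljungbox(resid, lags=[m], boxpierce=True))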

15
Q

AR(P)

A

Weights on the past p realisations of Yt, plus a current error term.

16
Q

MA(Q)

A

Moving average (weighted) of past errors

17
Q

Lag polynomial

A

Important! L^mYt=Yt-m

18
Q

AR(1) properties

A

Mean = 0.
Var = sigma^2 * Sum over i of eta^(2i)
= sigma^2 / (1 - eta^2). This is because Yt is a sum of past shocks, so we sum their variances; remember the coefficient is squared when it comes out of the variance!
Check you can derive this.
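
A quick simulation check of the variance formula (the parameter values are illustrative):

import numpy as np

eta, sigma, T = 0.7, 1.0, 200_000       # illustrative AR(1) parameters and sample size
rng = np.random.default_rng(2)

eps = rng.normal(0.0, sigma, T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = eta * y[t - 1] + eps[t]      # Yt = eta*Yt-1 + epsilon_t, Y0 = 0

print(y.var())                          # sample variance
print(sigma**2 / (1 - eta**2))          # theoretical sigma^2 / (1 - eta^2)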

19
Q

AR(1) Autocov

A

Derive!
Express Yt in terms of the errors and you get autocov(tau) = eta^tau * Var(Yt).

20
Q

When is AR(p) stable?

A

If all roots of the lag polynomial lie outside the unit circle!
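
An illustrative numerical check with numpy (the AR(2) coefficients are just an example): solve 1 - phi1*z - phi2*z^2 = 0 and see whether every root has modulus greater than 1.

import numpy as np

phi1, phi2 = 0.5, 0.3                   # illustrative AR(2) coefficients
# np.roots wants coefficients from the highest power down: -phi2*z^2 - phi1*z + 1.
roots = np.roots([-phi2, -phi1, 1.0])
print(roots, np.abs(roots))
print("stable:", np.all(np.abs(roots) > 1))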

21
Q

Moving average

A

Past shocks, not past observations

22
Q

MA(1) properties

A

Mean = 0, Var = (1 + theta^2) sigma^2.
Derive!
If |theta| < 1 the influence of a shock dies out quickly, so the process is covariance stationary and weakly dependent (the autocovariance is zero beyond lag 1).

23
Q

MA: Invertible?

A

If |theta| < 1. Then we can recover the values of the shocks from the observations!

24
Q

MA(q) invertible?

A

Again, if all roots of the lag polynomial lie outside the unit circle!

25
Q

Wold’s theorem

A

If Yt is any zero-mean covariance stationary process, then it can be represented as Yt = B(L)epsilont, a (possibly infinite) moving average in white-noise errors.

26
Q

ARMA

A

Elements of both AR and MA

27
Q

Motivation for ARMA

A

Could arise from aggregation: e.g. the sum of AR processes is ARMA.
Often a better approximation than a pure AR or MA with the same number of parameters!

28
Q

Deterministic behaviour wrt time

A

Trends and seasonality. These violate stationarity! If the series is trend stationary, removing the trend leaves a stationary process.

29
Q

Forecasting

A

Forecast Yt+h given information at time t by minimising expected squared forecast error. The feasible forecast replaces beta0, beta1 with beta0hat, beta1hat!

30
Q

Why can’t we just min MSE for forecast?

A

Minimising the in-sample error leads to overfitting the data. Instead use information criteria, which trade off better fit against model complexity: the term multiplying MSE is the penalty for extra parameters!

31
Q

Akaike Info Criterion

A

Choose the model minimising AIC = e^(2k/T) * MSE, where k is the number of parameters and T the sample size.

32
Q

Bayesian IC

A

Choose the model minimising BIC = e^(k ln(T)/T) * MSE. The penalty grows with ln(T), so BIC penalises extra parameters more heavily than AIC for any reasonable sample size.
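
An illustrative sketch of both criteria in the multiplicative form above (the candidate models, their MSEs and the sample size are made up for the example):

import numpy as np

T = 200                                   # illustrative sample size
candidates = {"AR(1)": (2, 1.31),         # illustrative (k parameters, in-sample MSE)
              "AR(2)": (3, 1.25),
              "AR(3)": (4, 1.24)}

for name, (k, mse) in candidates.items():
    aic = np.exp(2 * k / T) * mse             # e^(2k/T) * MSE
    bic = np.exp(k * np.log(T) / T) * mse     # e^(k ln(T)/T) * MSE
    print(name, round(aic, 4), round(bic, 4))
# Pick the model with the smallest criterion; BIC penalises the extra parameters more.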

33
Q

Seasonality

A

Regression on seasonal dummies
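
An illustrative sketch of deseasonalising by dummy regression (the quarterly series is simulated and the variable names are just examples; pandas and statsmodels assumed):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
T = 120
quarter = np.arange(T) % 4                            # illustrative quarterly indicator
y = 2.0 * (quarter == 3) + rng.standard_normal(T)     # seasonal spike in Q4 plus noise
df = pd.DataFrame({"y": y, "quarter": quarter})

# Regress on seasonal dummies; C() creates the dummies and drops one to avoid the dummy trap.
res = smf.ols("y ~ C(quarter)", data=df).fit()
print(res.params)
deseasonalised = res.resid                             # what is left once seasonality is removed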

34
Q

Serial correlation

A

E(epsilont * epsilons given Xt, Xs) =/= 0 for some t =/= s!

35
Q

Problem of serial correlation

A

Violates the no-serial-correlation assumption, so the usual variance formula for the estimators is wrong and we typically get artificially small standard errors. Similar in spirit to heteroskedasticity.

36
Q

Durbin Watson Test Stat

A

DW is roughly 2(1 - corr(epsilonhatt, epsilonhatt-1)).
Check it against DL, DU, 4-DU, 4-DL. Remember: if extreme, reject H0; in the intermediate regions, inconclusive.
In the middle: do not reject.
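
An illustrative sketch of computing DW from fitted residuals (the data-generating process with AR(1) errors is just an example; statsmodels assumed):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(4)
T = 200
x = rng.standard_normal(T)
u = rng.standard_normal(T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.6 * e[t - 1] + u[t]          # positively serially correlated errors
y = 1.0 + 2.0 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
print(durbin_watson(res.resid))           # roughly 2*(1 - corr), so well below 2 here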

37
Q

Breusch Godfrey process

A

OLS of Yt on the Xt's, saving the fitted residuals.
OLS of the fitted residuals on the Xt's and p lags of the residuals.
H0: the coefficients on the lagged residuals are jointly 0.
Use an F test, or the LM stat = (T-p)R^2 ~a~ Chi squared(p).
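
An illustrative sketch of the test via statsmodels (the fitted OLS model and the choice of 2 lags are just examples):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(5)
T = 300
x = rng.standard_normal(T)
u = rng.standard_normal(T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.5 * e[t - 1] + u[t]          # serially correlated errors
y = 0.5 + 1.5 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=2)
print(lm_stat, lm_pval)                    # LM version, asymptotically Chi squared(2)
print(f_stat, f_pval)                      # F version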

38
Q

BG adv/assumptions

A

Adv - relaxes strict exogeneity and normally distributed errors.
Assumes - contemporaneous exogeneity, conditional homoskedasticity.

39
Q

Positive serial correlation result

A

The default formulas underestimate the true conditional variance!
E.g. Var(betahat given X) depends positively on the error covariances, which the default formula ignores!

40
Q

Correction for serial correlation?

A

Heteroskedasticity- and autocorrelation-consistent (HAC) standard errors, e.g. Newey-West.
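
An illustrative sketch of Newey-West (HAC) standard errors in statsmodels (the regression and the lag truncation of 4 are just examples):

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
T = 300
x = rng.standard_normal(T)
u = rng.standard_normal(T)
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.6 * e[t - 1] + u[t]           # positively serially correlated errors
y = 1.0 + 0.8 * x + e

X = sm.add_constant(x)
ols = sm.OLS(y, X).fit()                                          # default standard errors
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})   # Newey-West standard errors
print(ols.bse)
print(hac.bse)        # typically larger than the default under positive serial correlation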

41
Q

When can we recover epsilont

A

Stable, invertible

42
Q

Why is forecast error underestimated?

A

We use estimates of true parameters. Just have to hope the effect of this is small

43
Q

Forecast AR/ ARMA?

A

Use the chain rule of forecasting: iterate one-step-ahead forecasts, feeding each forecast into the next step. Far enough ahead, the MA part disappears, since future shocks have expectation zero!
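
An illustrative sketch of the chain rule for an AR(1) (the parameter values and the last observation are made up):

# Forecasting Yt+h from Yt = c + eta*Yt-1 + epsilon_t: each step plugs in the previous
# forecast, because future shocks have conditional expectation zero.
c, eta = 0.5, 0.8      # illustrative AR(1) parameters
y_t = 2.0              # illustrative last observed value

forecast = y_t
for h in range(1, 6):
    forecast = c + eta * forecast
    print(h, round(forecast, 4))
# As h grows the forecast converges to the unconditional mean c / (1 - eta) = 2.5.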

44
Q

Non stationarity meaning?

A

E(Yt) =/= E(Yt+h) for some h.

45
Q

Random walk with drift:

A

Yt = Yt-1 + gamma + error.
Derive what this means Yt is (Y0 plus gamma*t plus the accumulated errors), and also its error and expectation!
Test for it with the Dickey Fuller test.
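
A quick simulation check that the expectation grows linearly in t under drift (the drift and sample sizes are illustrative):

import numpy as np

gamma, T, n_sims = 0.2, 500, 2000           # illustrative drift, series length, replications
rng = np.random.default_rng(7)

eps = rng.standard_normal((n_sims, T))
y = np.cumsum(gamma + eps, axis=1)          # Yt = Y0 + gamma*t + accumulated errors, Y0 = 0
print(y[:, -1].mean())                      # roughly gamma*T = 100
print(y[:, -1].var())                       # roughly T = 500: the variance also grows with t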

46
Q

Integrated series

A

Yt is I(1) if differencing once leads to a stationary, I(0), series (more generally, I(d) needs d differences).

47
Q

Problem of rand walk

A

It is an AR(1) process with eta = 1.
E(etahat) is roughly 1 - (5.3/T): biased (but at least consistent).

48
Q

Test for unit roots

A

Dickey Fuller test

49
Q

DF test

A

Yt = alpha + (1 + beta)Yt-1 + error, equivalently Delta Yt = alpha + beta*Yt-1 + error.
Compare betahat/se(betahat) with the DF critical values.
One-sided test of H0: beta = 0 (unit root) against beta < 0.
Crucially, there are different critical values for no alpha, no trend / alpha, no trend / alpha and trend!
Recent research suggests we should consider more than just a linear time trend!

50
Q

Augmented DF

A

Adjusts for possible serial correlation by including lags of change in dependent variable

51
Q

Augmented DF set up?

A

Delta Yt = alpha + gamma*t + beta*Yt-1 + Sum from i=1 to p of deltai * Delta Yt-i + error.

The t stat is betahat / se(betahat), compared with the DF critical values.
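
An illustrative sketch of running the test with statsmodels (the simulated random walk and the choice of deterministic terms are just examples):

import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(8)
y = np.cumsum(rng.standard_normal(500))     # illustrative random walk: should NOT reject the unit root

# regression="c" includes a constant only; "ct" would also include the linear trend gamma*t.
stat, pval, usedlag, nobs, crit, icbest = adfuller(y, regression="c", autolag="AIC")
print(stat, pval, crit)                      # compare the t stat with the DF critical values

stat_d, pval_d = adfuller(np.diff(y), regression="c", autolag="AIC")[:2]
print(stat_d, pval_d)                        # first difference: should reject, so y looks I(1)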

52
Q

If we are in AR(p)

A

Remember we must test whether the sum of ALL the lag coefficients equals 1.

53
Q

Augmented DF issue

A

Sluggish mean reversion is difficult to distinguish from unit root!

54
Q

To empirically test I(1) we must establish

A

Do not reject that Yt has a unit root, and reject that Delta Yt has a unit root.

55
Q

Distributed lag model and assumptions for causal

A

Considers lags of Xt. We should distinguish immediate vs long-run impacts!
- Xt, Yt stationary and weakly dependent
- Strict exogeneity

56
Q

ADL(p,q)

A

Lags of both X and Y.
Check you can derive the immediate vs h-period vs long-run (if the change is permanent) impact of a change in X, e.g. for an ADL(1,1).

57
Q

What is problem if Yt,Xt both have a time trend?

A

Spurious correlation Yule 1926. If we just run OLS, we do not obtain consistent results!

58
Q

AR(p)

A

Separate Yt into its constituent parts, then take the covariance of each bit. It will be an infinite series!