Time Series Flashcards

(46 cards)

1
Q

time series

A

gives us the value of the same variable Y at different time periods

2
Q

lags

A

Yt-1, Yt-2 etc

3
Q

first difference

A

the change in the value of Y between period t-1 and t: ΔYt = Yt - Yt-1

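A minimal Python sketch of computing a first difference, assuming pandas is available and using a made-up example series:

```python
# First difference: delta Y_t = Y_t - Y_{t-1}
# 'gdp' is a placeholder series, purely for illustration.
import pandas as pd

gdp = pd.Series([100.0, 102.0, 101.5, 103.0])
first_diff = gdp.diff()   # first observation is NaN (no Y_{t-1} available)
print(first_diff)
```
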
4
Q

autocorrelated

A

when a series is correlated with its lags

5
Q

volatility clustering

A

when a series has some periods of high volatility and other periods of low volatility, i.e. the volatility comes in clusters

6
Q

breaks

A

changes in the regression relationship that can be abrupt or occur slowly, e.g. due to economic policy or changes in the structure of the economy

7
Q

serial correlation (autocorrelation)

A

correlation between the error terms (at times t, t-1, t-2, etc.) in a regression model

8
Q

exogeneity assumption

A

ut must be uncorrelated with all Xt, i.e. the explanatory variables (X) cannot respond to changes in, or past values of, the dependent variable (Y)

9
Q

no autocorrelation assumption

A

the error terms are uncorrelated over time, e.g. if the interest rate is unexpectedly high in one period it shouldn't be unexpectedly high in the next period too

10
Q

consequences of autocorrelation

A

OLS is no longer BLUE

OLS standard errors are underestimated, so confidence intervals are too narrow, t-ratios are too large, p-values are too small, and we are more likely to incorrectly reject the null

11
Q

testing autocorrelation

A

regress the residuals et on their lagged values et-1 and test whether the coefficient on et-1 is significantly different from zero

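A minimal Python sketch of this residual-on-lagged-residual regression, assuming statsmodels is available; the residuals here are simulated placeholders rather than output of a real model:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
resid = rng.normal(size=200)            # placeholder residuals e_t

e_t = resid[1:]                          # e_t
e_lag = sm.add_constant(resid[:-1])      # constant + e_{t-1}

aux = sm.OLS(e_t, e_lag).fit()           # regress e_t on e_{t-1}
print(aux.params, aux.tvalues)           # significant slope -> autocorrelation
```
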
12
Q

HAC

A

Heteroscedasticity and Autocorrelation Consistent standard errors (they take heteroscedasticity and autocorrelation into account)

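A minimal Python sketch of requesting HAC (Newey-West) standard errors with statsmodels, on simulated placeholder data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.normal(size=200)
y = 1.0 + 0.5 * x + rng.normal(size=200)   # placeholder data

X = sm.add_constant(x)
fit_hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
print(fit_hac.bse)                          # HAC standard errors
```
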
13
Q

conditional heteroscedasticity

A

variance of the error term is autocorrelated (i.e. when it's high in one period it tends to be high in the next)
arises when the dependent variable exhibits volatility clustering

14
Q

AR(p) model

A

autoregressive model that uses the lagged values Yt-1, ..., Yt-p to forecast Yt; p = number of lags

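A minimal Python sketch of fitting an AR(2) with statsmodels' AutoReg on a simulated placeholder series:

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(2)
eps = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):                 # simulate a stationary AR(1) series
    y[t] = 0.6 * y[t - 1] + eps[t]

ar2 = AutoReg(y, lags=2).fit()          # AR(2): regress Y_t on Y_{t-1}, Y_{t-2}
print(ar2.params)
print(ar2.forecast(steps=4))            # forecast the next 4 periods
```
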
15
Q

AR(p) model assumptions

A

conditional expectation of ut = 0 given past values of Yt

errors are serially uncorrelated

16
Q

ADL(p,q) model

A

autoregressive distributed lag model
lagged values of the dependent variable and of an additional predictor are included as regressors
p = number of lags of Yt, q = number of lags of the additional predictor Xt

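A minimal Python sketch of an ADL(1,1), with the lags built using pandas and the model estimated by OLS (statsmodels assumed; the data are simulated placeholders):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
df = pd.DataFrame({"y": rng.normal(size=200), "x": rng.normal(size=200)})

df["y_lag1"] = df["y"].shift(1)          # Y_{t-1}
df["x_lag1"] = df["x"].shift(1)          # X_{t-1}
df = df.dropna()

adl = sm.OLS(df["y"], sm.add_constant(df[["y_lag1", "x_lag1"]])).fit()
print(adl.params)
```
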
17
Q

least squares assumptions for ADL

A

error term has conditional mean 0 given all the lags of the regressors
random variables have a stationary distribution
no large outliers
no perfect multicollinearity

18
Q

stationarity

A

series Yt is stationary if its probability distribution doesn’t change over time

19
Q

types of non-stationarity

A

trends and breaks

both lead to bias and inconsistency

20
Q

deterministic trend

A

the variable is a non-random function of time, e.g. a linear function of time (which would indicate that the growth rate is constant over time)

21
Q

stochastic trend

A

the trend is random and varies over time (e.g. the UK GDP growth rate is not constant)

22
Q

random walk

A

Yt = Yt-1 + ut

if Yt follows an AR(1) with B1=1 then Yt contains a stochastic trend and is non-stationary

23
Q

random walk with drift

A

Yt = B0 + Yt-1 + ut

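A minimal Python sketch simulating the two processes above (numpy assumed; the drift value 0.1 is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
u = rng.normal(size=500)

rw = np.cumsum(u)                # Y_t = Y_{t-1} + u_t        (random walk)
rw_drift = np.cumsum(0.1 + u)    # Y_t = B0 + Y_{t-1} + u_t   (drift B0 = 0.1)
```
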
24
Q

problems with stochastic trends

A

biased coefficient estimates
non-normal distribution of the t-statistic
spurious regressions

25
Q

testing for a unit root

A

Dickey-Fuller test

26
Q

Dickey-Fuller test

A

regress Yt on its lag and test for the existence of a stochastic trend; to test in the presence of a deterministic trend, we include an intercept and a time trend in the specification of the unit root test

27
Q

Augmented Dickey-Fuller test

A

if an AR(1) model doesn't capture all the serial correlation in Yt, the DF test is invalid; the ADF test therefore adds lagged differences of Yt to the regression

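A minimal Python sketch of an (Augmented) Dickey-Fuller test with statsmodels, run on a simulated random walk:

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
y = np.cumsum(rng.normal(size=300))        # random walk: contains a unit root

stat, pvalue, *_ = adfuller(y, regression="ct")  # "ct" = intercept + time trend
print(stat, pvalue)   # large p-value: cannot reject the null of a unit root
```
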
28
Q

differencing

A

eliminates a stochastic trend; transforms a non-stationary time series into a stationary one

29
Q

Chow test

A

used to test for breaks in ADL and DL models, possibly in a subset of the parameters only

30
Q

BIC (lag selection)

A

Bayes Information Criterion - we want to choose the p that minimises the BIC

31
Q

AIC (lag selection)

A

Akaike Information Criterion - we want to choose the p that minimises the AIC

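A minimal Python sketch of choosing the lag length p by minimising an information criterion, using statsmodels' ar_select_order on a simulated series:

```python
import numpy as np
from statsmodels.tsa.ar_model import ar_select_order

rng = np.random.default_rng(6)
eps = rng.normal(size=400)
y = np.zeros(400)
for t in range(2, 400):                     # simulate an AR(2) series
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + eps[t]

sel = ar_select_order(y, maxlag=8, ic="bic")   # use ic="aic" for the AIC
print(sel.ar_lags)                             # lags chosen by minimising BIC
```
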
32
Q

SSR(p)

A

sum of squared residuals of a model estimated with p lags

33
Q

focus of forecasting

A

how good a model is at predicting future values (not at estimating causal effects)

34
Q

to evaluate a forecasting model (and compare different models)

A

adjusted R^2, RMSFE, out-of-sample forecasting performance

35
Q

RMSFE

A

the root mean squared forecast error = the size of the typical mistake made when using the forecasting model (smaller means a better model)

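A minimal Python sketch of computing the RMSFE from a set of forecast errors (the numbers are made up for illustration):

```python
import numpy as np

actual = np.array([2.1, 1.8, 2.5, 2.0])
forecast = np.array([1.9, 2.0, 2.2, 2.3])

errors = actual - forecast
rmsfe = np.sqrt(np.mean(errors ** 2))   # root mean squared forecast error
print(rmsfe)
```
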
36
Q

adjusted R^2

A

how well the model explains the variation in the dependent variable (higher means it explains more of the variation)

37
Q

out-of-sample forecasting performance

A

how well the model performs in real time

38
Q

forecast errors

A

the mistakes made when forecasting (not the same as the predicted residuals)

39
Q

pseudo out-of-sample forecasting

A

method for simulating the real-time performance of a forecasting model using historical data for the series Y up to period T

40
Q

standard deviation of the pseudo out-of-sample forecast errors provides

A

an estimate of the RMSFE

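A minimal Python sketch of pseudo out-of-sample forecasting with an AR(1): re-estimate on data up to each forecast origin, forecast one step ahead, and use the spread of the errors to estimate the RMSFE (statsmodels assumed; simulated placeholder series):

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(7)
eps = rng.normal(size=200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.7 * y[t - 1] + eps[t]

errors = []
for T in range(150, 199):                    # forecast origins
    fit = AutoReg(y[: T + 1], lags=1).fit()  # estimate using data up to T only
    fcast = fit.forecast(steps=1)[0]         # forecast of Y_{T+1}
    errors.append(y[T + 1] - fcast)          # pseudo out-of-sample error

print(np.std(errors))   # standard deviation of the errors: estimate of the RMSFE
```
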
41
Q

95% forecast interval

A

interval that contains the future value of the series 95% of the time

42
Q

Granger causality tests

A

tests of the predictive content of the predictors in a forecasting model; the test statistic is an F-statistic; if it is significant, the predictor "Granger causes" the dependent variable

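A minimal Python sketch of a Granger causality test with statsmodels on simulated placeholder data, testing whether lags of x help predict y:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(8)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * x[t - 1] + rng.normal()     # x leads y by one period

data = np.column_stack([y, x])               # column 0 is the variable predicted
results = grangercausalitytests(data, maxlag=2)   # F-tests at lag lengths 1, 2
```
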
43
Q

dynamic causal effect

A

follows the time path of the effect of a shock over time, e.g. the effect of increasing interest rates on US GDP

44
Q

DL Model

A

distributed lag model - used to estimate dynamic causal effects

45
Q

DL model assumptions

A

exogeneity; random variables have a stationary distribution; no large outliers; no perfect multicollinearity

46
Q

exogeneity

A

condition that guarantees that the estimated coefficients can be interpreted as causal effects - it will not hold if there are omitted variables in the error term that are correlated with past or present values of X - it holds if lags of X have no effect on the dependent variable beyond the last included lag