Time Series. Flashcards

1
Q

Time series

A

A set of observations made at ordered time-points.

2
Q

The index set

A

Time points at which the process is defined

3
Q

State space

A

The set of values that the random variables Xt may take

4
Q

Aims of time series

A
1. Drawing inferences from the series
2. Examining the process generating the data
3. Forecasting
5
Q

What is White Noise

A
1. E(εt) = 0
2. Cov(εt, εs) = σε² if t = s, and 0 otherwise
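The white-noise definition can be checked by simulation; a minimal sketch in Python (all names are illustrative, not from the source):

```python
import numpy as np

# Sketch: simulate Gaussian white noise eps_t ~ iid N(0, sigma^2) and check
# the defining properties E(eps_t) = 0 and Cov(eps_t, eps_s) = 0 for t != s.
rng = np.random.default_rng(0)
sigma = 2.0
eps = rng.normal(0.0, sigma, size=100_000)

sample_mean = eps.mean()                 # should be close to 0
sample_var = eps.var()                   # should be close to sigma^2 = 4
lag1_cov = np.mean(eps[:-1] * eps[1:])   # should be close to 0
```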
6
Q

Gaussian Stochastic process

A

All of its finite-dimensional (joint) distributions are Gaussian

7
Q

What is Gaussian White Noise

A

White noise that is also a Gaussian process

8
Q

Are white noise shocks independent

A

White noise shocks are only uncorrelated, not necessarily independent. There can be some dependence between absolute/squared shocks

9
Q

Why is Gaussian WN independent

A

Jointly normally distributed random variables are uncorrelated if and only if they are independent

10
Q

What is Random Walk

A

Xt=Xt-1+εt

11
Q

Random walk with drift

A

Xt=a0+Xt-1+εt

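A random walk with drift can be simulated in a few lines; a minimal sketch (drift value and seed are illustrative):

```python
import numpy as np

# Sketch: random walk with drift, X_t = a0 + X_{t-1} + eps_t, obtained by
# cumulatively summing the drifted shocks (X_0 = 0 implicitly).
rng = np.random.default_rng(1)
a0, T = 0.5, 10_000
eps = rng.normal(0.0, 1.0, size=T)
X = np.cumsum(a0 + eps)

# E(X_t) = a0 * t grows linearly in t, so the process is not stationary.
drift_estimate = X[-1] / T   # should be close to a0 = 0.5
```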
12
Q

Weak Stationarity

A
1. Xt has finite second moments for all t in Z
2. E(Xt) = E(Xs) for all t, s in Z
3. Cov(Xt, Xs) = Cov(Xt+h, Xs+h) for all t, s in Z and h in N
In words: the first- and second-order moments of the process do not depend on time
13
Q

Strong stationarity

A

The joint distribution does not depend on time: for any time points t1, ..., tm (from Z) and any lag h (from N), the joint distributions of (Xt1, ..., Xtm) and (Xt1+h, ..., Xtm+h) are identical

14
Q

ACVF (AutoCoVariance Function)

A

γ(h)=Cov(Xt,Xt+h)

15
Q

ACF (AutoCorrelation Function)

A

ρ(h)=Corr(Xt,Xt+h)

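The ACVF/ACF definitions can be turned into a small sample-ACF computation; a sketch (the helper name is hypothetical, not from the source):

```python
import numpy as np

# Sketch: sample ACF rho_hat(h) = gamma_hat(h) / gamma_hat(0), where
# gamma_hat(h) = (1/T) * sum_{t=1}^{T-h} (x_t - xbar)(x_{t+h} - xbar).
def sample_acf(x, max_lag):
    x = np.asarray(x, dtype=float)
    T = len(x)
    d = x - x.mean()
    gamma0 = np.sum(d * d) / T
    return np.array([np.sum(d[: T - h] * d[h:]) / T / gamma0
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(2)
acf = sample_acf(rng.normal(size=5000), max_lag=3)
# acf[0] is 1 by construction; higher lags should be near 0 for white noise.
```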
16
Q

Properties of ACF and ACVF

A
1. Positive semidefinite
2. Symmetric
17
Q

What does it mean if an ACF or ACVF matrix is not positive semidefinite

A

Process is not stationary

18
Q

What do ACF and ACVF measure

A

Degree of dependence among the values of a time series at different times

19
Q

What is the general approach to Time Series Modelling

A
1. Plot the series and examine the main features of the graph, checking in particular whether there is: a) a trend, b) a seasonal component, c) any apparent sharp changes in behaviour, d) any outlying observations
2. Remove the trend and seasonal components to get stationary residuals
3. Choose a model to fit the residuals
4. Forecast by forecasting the residuals and then inverting any transformations to arrive at forecasts of the original series
20
Q

Classical decomposition model

A

Xt = mt + st + Yt, where
mt = trend function
st = seasonal component
Yt = zero-mean random noise component

21
Q

What are deterministic component/s (signal) and what are stochastic component/s in the Classical decomposition model

A

mt and st are the signal; Yt is the noise

22
Q

What should one do if the variance increases over time

A

Apply preliminary transformations e.g. Log

23
Q

For a model with trend but no seasonality (Xt = mt + Yt), how can the trend be estimated or removed

A

Method 1: Trend estimation
a) Nonparametric: 1. moving average, 2. exponential smoothing
b) Model based: fitting a polynomial trend
Method 2: Trend elimination by differencing

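The nonparametric moving-average option from Method 1 can be sketched as follows (window length and data are made up for illustration):

```python
import numpy as np

# Sketch: two-sided simple moving average of window 2q+1 as a trend estimate,
# m_hat_t = (1/(2q+1)) * sum_{j=-q}^{q} X_{t+j}.
q = 2
t = np.arange(50, dtype=float)
x = 0.5 * t + np.sin(t)                       # linear trend plus oscillation
kernel = np.ones(2 * q + 1) / (2 * q + 1)
trend = np.convolve(x, kernel, mode="valid")  # estimate for t = q, ..., T-1-q
# `trend` tracks the 0.5 * t line: the average smooths out the oscillation.
```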
24
Q

Constant Mean Model (CMM)

A

Xt = m + Yt, where m is a constant. The big problem is that all observations are assigned equal weights

25
What is the usual trade-off between choosing a large or small q
If small: faster reaction. If large: smaller variability
26
What method is used to estimate parameters in the polynomial trend model
Method of least squares
27
The lag-1 difference operator
∇Xt=Xt-Xt-1=(1-B)Xt
28
Backward shift operator
BXt=Xt-1
29
What is the problem of applying the difference operator when the sample size is small
Each differencing reduces the sample size by 1
30
∇^2Xt
Xt-2Xt-1+Xt-2
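The difference operators on the last few cards can be checked with `numpy.diff`; a minimal sketch:

```python
import numpy as np

# Sketch: ∇ and ∇² via numpy.diff; each application shortens the series by one.
x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])  # x_t = t^2, a quadratic trend
d1 = np.diff(x)                            # ∇x_t = x_t - x_{t-1}
d2 = np.diff(x, n=2)                       # ∇²x_t = x_t - 2x_{t-1} + x_{t-2}
# ∇² of a quadratic trend is constant, and the sample shrinks from 5 to 4 to 3.
```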
31
Two methods to deal with Trend and Seasonality
Method 1: Trend and seasonality estimation
a) Estimate the trend using a centered simple moving average
b) Estimate the seasonal component
c) (Optional) Re-estimate the trend from the deseasonalised data
d) Estimate the noise using the (re)estimated trend and the estimated seasonal component
Method 2: Differencing
32
What if seasonal effect d is odd
q=(d-1)/2
33
What if seasonal effect d is even
q=d/2 but assign 0.5 weight to the first and last Xt
34
∇dXt
Xt-Xt-d
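The lag-d difference can be illustrated on a purely seasonal series (period and values are made up):

```python
import numpy as np

# Sketch: ∇_d X_t = X_t - X_{t-d} with d = 4 removes a period-4 seasonal pattern.
d = 4
season = np.array([10.0, 0.0, -5.0, 2.0])
x = np.tile(season, 3)          # 12 observations of a pure seasonal pattern
seasonal_diff = x[d:] - x[:-d]  # identically zero once the season is removed
```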
35
Is ∇d == ∇^d
No
36
What is the problem if h is close to T in the sample ACF/ACVF
The estimator is not very reliable (too few pairs of observations enter the sum)
37
Rule of thumb for T and h in (sample ACF/ACVF)
T>50 and h<=T/4
38
For large T, the sample autocorrelations of an iid sequence with finite variance are approximately
~N(0,1/T)
39
CI for ACF
estimate +- 1.96/sqrt(T)
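The ±1.96/sqrt(T) band can be checked empirically on simulated iid data; a sketch:

```python
import numpy as np

# Sketch: for iid data, roughly 95% of sample autocorrelations at lags
# h = 1..20 should fall inside the band ±1.96/sqrt(T).
rng = np.random.default_rng(3)
T = 2000
x = rng.normal(size=T)
x = x - x.mean()
gamma0 = np.sum(x * x) / T
acf = np.array([np.sum(x[: T - h] * x[h:]) / T / gamma0 for h in range(1, 21)])

band = 1.96 / np.sqrt(T)
inside = np.mean(np.abs(acf) < band)   # should be roughly 0.95
```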
40
H:0 and H:1 for Portmanteau Test
H0: ρ(1)=ρ(2)=...=ρ(m)=0; H1: at least one ρ(h) is not equal to 0
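One concrete portmanteau statistic is the Box-Pierce form Q = T·Σρ̂(h)²; a simulation sketch (seed and sizes are illustrative):

```python
import numpy as np

# Sketch: Box-Pierce portmanteau statistic on a white-noise sample. Under H0
# (all rho(h) = 0 up to lag m), Q is approximately chi-squared with m df.
rng = np.random.default_rng(4)
T, m = 1000, 10
x = rng.normal(size=T)
x = x - x.mean()
gamma0 = np.sum(x * x) / T
acf = np.array([np.sum(x[: T - h] * x[h:]) / T / gamma0 for h in range(1, m + 1)])
Q = T * np.sum(acf ** 2)
# E(Q) ≈ m = 10 under H0; values far above the chi-square(10) 95% critical
# value (about 18.3) would reject H0.
```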
41
How m is chosen for Portmanteau Test
Ad hoc
42
Power of a test
P(alternative | alternative is true)
43
How can one test normality
Jarque-Bera test, Q-Q plot
44
How does a Q-Q plot work
It plots empirical quantiles against theoretical ones. In particular, the values in the sample of data, ordered from smallest to largest, are plotted against F^-1((i-0.5)/T), i=1,2,...,T, where F is the cumulative distribution function of the assumed distribution. If the assumed distribution is accurate, we expect a 45-degree line through (0,0)
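The Q-Q construction can be reproduced by hand with the standard library's `statistics.NormalDist`; a sketch on simulated normal data:

```python
import numpy as np
from statistics import NormalDist

# Sketch: normal Q-Q plot coordinates. Ordered sample values are paired with
# theoretical quantiles F^{-1}((i - 0.5) / T); for N(0,1) data the points lie
# near the 45-degree line through the origin.
rng = np.random.default_rng(7)
T = 2000
sample = np.sort(rng.normal(size=T))
theo = np.array([NormalDist().inv_cdf((i - 0.5) / T) for i in range(1, T + 1)])

slope, intercept = np.polyfit(theo, sample, 1)   # close to 1 and 0
```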
45
For which processes do ACVF and ACF exist
Weakly Stationary
46
For which processes do sample ACVF and ACF exist
Any process, even non-weakly-stationary ones (they are computed directly from the observed data)
47
What does convergence in mean square allow
It allows infinite linear combinations of past values to be well defined
48
Autocorrelation matrix
Rk := (ρ(i-j)) for i,j = 1,...,k
49
α(h) measures what
The leftover correlation between Xt and Xt-h after accounting for their correlation with the intervening values Xt-1, Xt-2, ..., Xt-h+1
50
Markov process
E(Xt | Xt-1,Xt-2,...) = E(Xt | Xt-1)
51
Martingale process
E(Xt | Xt-1,Xt-2,...) = Xt-1
52
Example of Markov process
AR(1)
53
Example of Martingale process
RW without drift
54
Define MA(1)
Xt=εt+θεt-1
55
5 Properties of MA(1)
1. Xt is weakly stationary
2. γ(h) = (1+θ²)σ² for h=0; θσ² for |h|=1; 0 otherwise
3. ρ(h) = 1 for h=0; θ/(1+θ²) for |h|=1; 0 otherwise
4. The PACF decays (does not cut off)
5. Invertibility:
|θ|<1: X is invertible with respect to ε: εt=(Xt-μ)-θ(Xt-1-μ)+θ²(Xt-2-μ)-+...
|θ|>=1: X is not invertible with respect to ε
|θ|>1: X is invertible with respect to the white noise zt:=(Xt-μ)-(1/θ)(Xt-1-μ)+(1/θ²)(Xt-2-μ)-+...
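The MA(1) ACF can be checked by simulation; a minimal sketch (parameter values are illustrative):

```python
import numpy as np

# Sketch: simulate MA(1) X_t = eps_t + theta * eps_{t-1}; the sample lag-1
# autocorrelation should be close to theta / (1 + theta^2), and the ACF
# should cut off after lag 1.
rng = np.random.default_rng(5)
theta, T = 0.6, 100_000
eps = rng.normal(size=T + 1)
x = eps[1:] + theta * eps[:-1]

x = x - x.mean()
gamma0 = np.sum(x * x) / T
rho1 = np.sum(x[:-1] * x[1:]) / T / gamma0   # near theta / (1 + theta^2)
rho2 = np.sum(x[:-2] * x[2:]) / T / gamma0   # near 0

target = theta / (1 + theta ** 2)   # 0.6 / 1.36 ≈ 0.441
```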
56
What decays and what cuts off for MA
ACF cuts off (after lag q); PACF decays
57
What is the confidence interval for an empirical ACF called
Bartlett's interval (Bartlett's bands)
58
Write an equation for MA(inf)
Xt=εt+θ1εt-1+θ2εt-2+...
59
What is the other name for MA(inf)
General linear process
60
Define AR(1)
Xt=φXt-1+εt
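A simulation sketch of a stationary AR(1) with |φ| < 1, whose ACF decays geometrically as ρ(h) = φ^h (parameter values are illustrative):

```python
import numpy as np

# Sketch: AR(1) X_t = phi * X_{t-1} + eps_t with |phi| < 1 is weakly
# stationary, and rho(h) = phi^h decays geometrically.
rng = np.random.default_rng(6)
phi, T = 0.7, 100_000
eps = rng.normal(size=T)
x = np.empty(T)
x[0] = eps[0]
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]

x = x - x.mean()
gamma0 = np.sum(x * x) / T
rho = [np.sum(x[: T - h] * x[h:]) / T / gamma0 for h in (1, 2, 3)]
# rho should be close to [0.7, 0.49, 0.343].
```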
61
How is the stationarity property of the MA different from the AR
For the AR it is explicitly postulated in the definition, while for the MA it need not be presupposed (an MA process is automatically weakly stationary)
62
What happens to an AR(1) process if |φ| < 1, = 1, or > 1
|φ|<1: weakly stationary; |φ|>1: one can find a new white noise sequence with respect to which Xt is stationary; |φ|=1: non-stationary, as it is a random walk
63
If all roots of the corresponding auxiliary equation lie inside the unit circle then the process is
Stationary
64
MA(q) is invertible with respect to ε if and only if
Roots of the MA-polynomial lie outside of the unit circle
65
How are this equation and its roots called? ψ(x)=x^p-φ1x^(p-1)-φ2x^(p-2)-...-φp=0
Characteristic equation and characteristic roots
66
What happens when the characteristic roots are complex numbers
A mixture of damped sine and cosine patterns
67
What decays and what cuts off for AR
ACF decays; PACF cuts off (after lag p)
68
Define ARMA(1,1)
Xt=φXt-1+εt+θεt-1
69
What are the restrictions on the φ and θ in the ARMA (1,1)
φ+θ≠0 and both φ and θ are real numbers
70
What is the restriction imposed on the roots of AR and MA parts of the ARMA process
The AR and MA polynomials cannot have any common roots
71
When does ARMA have an infinite AR/MA representation
If the AR operator φ is stationary (i.e. all the roots of φ(x)=0 lie outside the unit circle) the ARMA process given by φ(B)Xt=θ(B)εt is indeed stationary and can be represented as an MA(inf) process. If the MA operator θ is invertible (i.e. all the roots of θ(x)=0 lie outside the unit circle) the ARMA process φ(B)Xt=θ(B)εt has an infinite AR representation.
72
For ARMA(p,q), what does the ACF mimic when q ≥ p, after lag q-p
AR(p)
73
For ARMA(p,q), what does the PACF mimic when p ≥ q, after lag p-q
MA(q)
74
Which selection criteria are consistent and which are asymptotically efficient
AIC and AICc are asymptotically efficient; BIC is consistent
75
How do we choose a model based on AIC/BIC
Choose the one which has the lowest value
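A toy comparison under one common Gaussian-likelihood convention (the exact additive constant varies across references, so treat these formulas as illustrative):

```python
import numpy as np

# Sketch: Gaussian-likelihood AIC = T*log(RSS/T) + 2k and
# BIC = T*log(RSS/T) + k*log(T), ignoring additive constants. The model with
# the lowest value is chosen; BIC penalizes extra parameters more heavily.
def aic(T, rss, k):
    return T * np.log(rss / T) + 2 * k

def bic(T, rss, k):
    return T * np.log(rss / T) + k * np.log(T)

# A larger model with only a slightly smaller RSS loses under both criteria:
T = 200
smaller_wins_aic = aic(T, 100.0, 2) < aic(T, 99.5, 3)
smaller_wins_bic = bic(T, 100.0, 2) < bic(T, 99.5, 3)
```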