Time Series. Flashcards

1
Q

Time series

A

A set of observations made at ordered time-points.

2
Q

The index set

A

Time points at which the process is defined

3
Q

State space

A

The set of values that the random variables Xt may take

4
Q

Aims of time series

A
1. Draw inferences from the observed time series
2. Examine the process generating the data
3. Forecast future values
5
Q

What is White Noise

A
1. E(εt) = 0 for all t
2. Cov(εt, εs) = σε^2 if t = s, and 0 otherwise
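
A minimal numerical sketch of the white-noise definition above, assuming NumPy is available; the names (sigma_eps, T) and the seed are illustrative choices, not part of the course material. It simulates Gaussian white noise and checks that the sample mean is near 0, the sample variance is near σε^2, and the lag-1 sample autocovariance is near 0.

    import numpy as np

    rng = np.random.default_rng(0)
    sigma_eps = 2.0
    T = 10_000
    eps = rng.normal(0.0, sigma_eps, size=T)   # Gaussian white noise: iid N(0, sigma_eps^2)

    # E(eps_t) = 0 (approximately, in the sample)
    print(eps.mean())

    # Cov(eps_t, eps_s): sigma_eps^2 when t == s, roughly 0 otherwise
    print(eps.var())                           # close to sigma_eps^2 = 4
    print(np.mean(eps[:-1] * eps[1:]))         # lag-1 sample autocovariance, close to 0
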
6
Q

Gaussian Stochastic process

A

All of its finite-dimensional distributions (the joint distribution of any finite collection of the Xt) are Gaussian

7
Q

What is Gaussian White Noise

A

White noise that is also Gaussian, i.e. εt ~ iid N(0, σε^2)

8
Q

Are white noise shocks independent

A

White noise shocks are only required to be uncorrelated, not independent. There can be dependence between, e.g., absolute or squared shocks

9
Q

Why is Gaussian WN independent

A

Jointly normally distributed random variables are uncorrelated if and only if they are independent

10
Q

What is Random Walk

A

Xt=Xt-1+εt

11
Q

Random walk with drift

A

Xt=a0+Xt-1+εt

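A small simulation sketch of the two recursions above, assuming NumPy; the drift a0 = 0.2 and the seed are arbitrary illustrative values. Starting from X0 = 0, each process is just a cumulative sum of its increments.

    import numpy as np

    rng = np.random.default_rng(1)
    T = 500
    eps = rng.normal(size=T)

    # Random walk: X_t = X_{t-1} + eps_t, with X_0 = 0, is the cumulative sum of the shocks
    rw = np.cumsum(eps)

    # Random walk with drift: X_t = a0 + X_{t-1} + eps_t adds a deterministic trend a0 * t
    a0 = 0.2
    rw_drift = np.cumsum(a0 + eps)
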
12
Q

Weak Stationarity

A
1. Xt has finite second-order moments for all t in Z
2. E(Xt) = E(Xs) for all t, s in Z
3. Cov(Xt, Xs) = Cov(Xt+h, Xs+h) for all t, s and all h in N
In words: the first- and second-order moments of the process do not depend on time
13
Q

Strong stationarity

A

The joint distribution does not depend on time: for any m time points t1,…,tm (in Z) and any lag h (in N), the joint distributions of (Xt1,…,Xtm) and (Xt1+h,…,Xtm+h) are identical

14
Q

ACVF (AutoCoVariance Function)

A

γ(h)=Cov(Xt,Xt+h)

15
Q

ACF (AutoCorrelation Function)

A

ρ(h)=Corr(Xt,Xt+h)

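A sketch of the corresponding sample estimators, assuming NumPy and the common 1/T normalisation for the sample autocovariance (an assumption; course conventions may differ). The helper names sample_acvf and sample_acf are mine.

    import numpy as np

    def sample_acvf(x, h):
        # Sample autocovariance at lag h, using the 1/T normalisation
        x = np.asarray(x, dtype=float)
        T = len(x)
        xbar = x.mean()
        return np.sum((x[:T - h] - xbar) * (x[h:] - xbar)) / T

    def sample_acf(x, h):
        # Sample autocorrelation at lag h: gamma_hat(h) / gamma_hat(0)
        return sample_acvf(x, h) / sample_acvf(x, 0)

    rng = np.random.default_rng(2)
    x = rng.normal(size=200)
    print([round(sample_acf(x, h), 3) for h in range(4)])
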
16
Q

Properties of ACF and ACVF

A
1. Positive semidefinite
2. Symmetric
17
Q

What does it mean if an ACF or ACVF matrix is not positive semidefinite

A

Process is not stationary

18
Q

What do ACF and ACVF measure

A

Degree of dependence among the values of a time series at different times

19
Q

What is the general approach to Time Series Modelling

A
1. Plot the series and examine the main features of the graph, checking in particular whether there is: a) a trend, b) a seasonal component, c) any apparent sharp changes in behaviour, d) any outlying observations
2. Remove the trend and seasonal components to get stationary residuals
3. Choose a model to fit the residuals
4. Forecast by forecasting the residuals and then inverting any transformations to arrive at forecasts of the original series
20
Q

Classical decomposition model

A

Xt = mt + st + Yt, where
mt = trend function
st = seasonal component
Yt = zero-mean random noise component

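As an illustration of the decomposition Xt = mt + st + Yt, the sketch below builds a toy monthly series from a known trend, seasonal component and noise, and recovers estimates of them with statsmodels' seasonal_decompose (assuming statsmodels is installed; all numbers are made up).

    import numpy as np
    from statsmodels.tsa.seasonal import seasonal_decompose

    rng = np.random.default_rng(3)
    t = np.arange(120)                          # 10 years of monthly data (toy example)
    m = 0.05 * t                                # trend m_t
    s = 2.0 * np.sin(2 * np.pi * t / 12)        # seasonal component s_t with period 12
    y = rng.normal(scale=0.5, size=t.size)      # zero-mean noise Y_t
    x = m + s + y                               # X_t = m_t + s_t + Y_t

    res = seasonal_decompose(x, model="additive", period=12)
    # res.trend, res.seasonal and res.resid are estimates of m_t, s_t and Y_t
    print(np.round(res.seasonal[:12], 2))
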
21
Q

What are deterministic component/s (signal) and what are stochastic component/s in the Classical decomposition model

A

mt and st are the signal; Yt is the noise

22
Q

What should we do if the variance increases over time

A

Apply a preliminary transformation, e.g. a log transformation

23
Q

For a model with trend but no seasonality (Xt = mt + Yt), how can the trend be estimated or removed

A

Method 1: Trend estimation
a) Nonparametric: 1. moving average, 2. exponential smoothing
b) Model based: fitting a polynomial trend
Method 2: Trend elimination by differencing

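A sketch of Method 1a (centred moving average) and Method 2 (differencing) on a toy linear-trend series, assuming NumPy; the window half-width q = 5 and the slope 0.1 are illustrative choices.

    import numpy as np

    rng = np.random.default_rng(4)
    t = np.arange(200)
    x = 0.1 * t + rng.normal(size=t.size)          # X_t = m_t + Y_t with a linear trend

    # Method 1a: centred moving average with window 2q+1 (here q = 5)
    q = 5
    weights = np.ones(2 * q + 1) / (2 * q + 1)
    trend_hat = np.convolve(x, weights, mode="valid")   # estimate of m_t (2q points shorter)

    # Method 2: trend elimination by differencing removes a linear trend
    dx = np.diff(x)                                # nabla X_t = X_t - X_{t-1}
    print(trend_hat[:3].round(2), dx.mean().round(2))    # mean of dx is close to the slope 0.1
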
24
Q

Constant Mean Model (CMM)

A

Xt = m + Yt, where m is a constant. Its big problem is that all observations are assigned equal weight

25
Q

What is the usual trade-off between choosing a large or small q

A

If q is small: faster reaction
If q is large: smaller variability

26
Q

What method is used to estimate parameters in the polynomial trend model

A

Method of least squares

27
Q

The lag-1 difference operator

A

∇Xt=Xt-Xt-1=(1-B)Xt

28
Q

Backward shift operator

A

BXt=Xt-1

29
Q

What is the problem with applying the difference operator when the sample size is small

A

The sample size is reduced by 1 with each differencing

30
Q

∇^2Xt

A

Xt-2Xt-1+Xt-2

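A quick numerical check, assuming NumPy, that ∇^2Xt = Xt - 2Xt-1 + Xt-2 agrees with applying ∇ twice, and that each differencing costs one observation.

    import numpy as np

    x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])   # X_t = t^2, t = 1..5

    d1 = np.diff(x)                 # nabla X_t = X_t - X_{t-1}
    d2 = np.diff(x, n=2)            # nabla^2 X_t = X_t - 2*X_{t-1} + X_{t-2}
    manual = x[2:] - 2 * x[1:-1] + x[:-2]

    print(np.allclose(d2, manual))      # True
    print(len(x), len(d1), len(d2))     # 5 4 3: one observation lost per differencing
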
31
Q

Two methods to deal with Trend and Seasonality

A

Method 1: Trend and seasonality estimation
a) Estimate the trend using a centred simple moving average
b) Estimate the seasonal component
c) (Optional) Re-estimate the trend from the deseasonalised data
d) Estimate the noise using the (re)estimated trend and the estimated seasonal component
Method 2: Differencing

32
Q

What if seasonal effect d is odd

A

q=(d-1)/2

33
Q

What if seasonal effect d is even

A

q=d/2 but assign 0.5 weight to the first and last Xt

34
Q

∇dXt

A

Xt-Xt-d

35
Q

Is ∇d == ∇^d

A

No. ∇d is the lag-d difference Xt-Xt-d, whereas ∇^d applies the lag-1 difference operator d times

36
Q

What is the problem if h is close to T in the sample ACF/ACVF

A

The estimator is not very reliable, because only a few pairs of observations are available at such large lags

37
Q

Rule of thumb for T and h in (sample ACF/ACVF)

A

T > 50 and h ≤ T/4

38
Q

For large T, the sample ACF values of an iid sequence with finite variance are approximately distributed as

A

~N(0,1/T)

39
Q

CI for ACF

A

estimate ± 1.96/√T
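
A sketch, assuming NumPy, of the ±1.96/√T bands on simulated iid data: under the approximate N(0, 1/T) distribution from the previous card, roughly 95% of the sample autocorrelations at non-zero lags should fall inside the bands.

    import numpy as np

    rng = np.random.default_rng(5)
    T = 400
    x = rng.normal(size=T)
    xbar = x.mean()

    def acf_hat(h):
        # Sample autocorrelation at lag h
        return np.sum((x[:T - h] - xbar) * (x[h:] - xbar)) / np.sum((x - xbar) ** 2)

    band = 1.96 / np.sqrt(T)
    rhos = np.array([acf_hat(h) for h in range(1, 21)])
    print(band.round(3), np.mean(np.abs(rhos) <= band))   # fraction inside the bands, about 0.95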

40
Q

H0 and H1 for the Portmanteau test

A

H0: ρ(1) = ρ(2) = … = ρ(m) = 0
H1: at least one ρ(h), h = 1,…,m, is not equal to 0
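
If statsmodels is available, its acorr_ljungbox function carries out a Ljung-Box version of the Portmanteau test of this H0; the lag choice m = 10 is ad hoc (see the next card), and the DataFrame return format assumes a recent statsmodels version.

    import numpy as np
    from statsmodels.stats.diagnostic import acorr_ljungbox

    rng = np.random.default_rng(6)
    x = rng.normal(size=300)            # iid data: H0 should not be rejected

    # Ljung-Box test up to m = 10 lags
    result = acorr_ljungbox(x, lags=[10])
    print(result)                       # columns: lb_stat, lb_pvalue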

41
Q

How is m chosen for the Portmanteau test

A

Ad hoc

42
Q

Power of a test

A

P(reject H0 | the alternative is true)

43
Q

How can we test for normality

A

Jarque-Bera test
Q-Q plot

44
Q

How does a Q-Q plot work

A

It plots empirical quantiles against theoretical ones. In particular, the values in the sample, ordered from smallest to largest, are plotted against F^(-1)((i-0.5)/T), i=1,2,…,T, where F is the cumulative distribution function of the assumed distribution. If the assumed distribution is accurate, we expect a 45-degree line through (0,0)
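
A sketch of both normality checks using SciPy (assumed available): jarque_bera returns the test statistic and p-value, and probplot returns the Q-Q coordinates against the normal distribution, together with a fitted line that should be close to the 45-degree line for standard normal data.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    x = rng.normal(size=500)

    jb_stat, jb_pvalue = stats.jarque_bera(x)        # H0: data are normally distributed
    print(round(jb_stat, 2), round(jb_pvalue, 2))    # large p-value: no evidence against normality

    # Q-Q plot coordinates: theoretical N(0,1) quantiles vs ordered sample values
    (theoretical_q, ordered_x), (slope, intercept, r) = stats.probplot(x, dist="norm")
    print(round(slope, 2), round(intercept, 2))      # roughly (1, 0) for a standard normal sample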

45
Q

For which processes do ACVF and ACF exist

A

Weakly Stationary

46
Q

For which processes do sample ACVF and ACF exist

A

Any process - the sample versions can be computed from the data even when the process is not weakly stationary

47
Q

What does convergence in mean square allow

A

It allows infinite linear combinations of past values (e.g. MA(∞) representations) to be well defined

48
Q

Autocorrelation matrix

A

Rk := (ρ(i-j)) for i, j = 1,…,k (sketch)

49
Q

α(h) measures what

A

The leftover correlation between Xt and Xt-h after accounting for their correlation with the intermediate variables Xt-1, Xt-2,…,Xt-h+1

50
Q

Markov process

A

E(Xt | Xt-1,Xt-2,…) = E(Xt | Xt-1)

51
Q

Martingale process

A

E(Xt | Xt-1,Xt-2,…) = Xt-1

52
Q

Example of Markov process

A

AR(1)

53
Q

Example of Martingale process

A

RW without drift

54
Q

Define MA(1)

A

Xt=εt+θεt-1

55
Q

5 Properties of MA(1)

A
1. Xt is weakly stationary
2. γ(h) = (1+θ^2)σ^2 for h=0; θσ^2 for |h|=1; 0 otherwise
3. ρ(h) = 1 for h=0; θ/(1+θ^2) for |h|=1; 0 otherwise
4. PACF: decays (does not cut off)
5. Invertibility:
|θ|<1: X is invertible with respect to ε: εt = (Xt-μ) - θ(Xt-1-μ) + θ^2(Xt-2-μ) - + …
|θ|≥1: X is not invertible with respect to ε
|θ|>1: X is invertible with respect to a white noise z: zt = (Xt-μ) - (1/θ)(Xt-1-μ) + (1/θ^2)(Xt-2-μ) - + …
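
A simulation sketch, assuming NumPy, that checks property 3 above: for θ = 0.6 the lag-1 sample autocorrelation should be close to θ/(1+θ^2) ≈ 0.441, and the lag-2 value close to 0.

    import numpy as np

    rng = np.random.default_rng(8)
    theta = 0.6
    T = 20_000
    eps = rng.normal(size=T + 1)
    x = eps[1:] + theta * eps[:-1]        # X_t = eps_t + theta * eps_{t-1}

    xbar = x.mean()
    def rho(h):
        # Sample autocorrelation at lag h
        return np.sum((x[:T - h] - xbar) * (x[h:] - xbar)) / np.sum((x - xbar) ** 2)

    print(round(theta / (1 + theta**2), 3))      # theoretical rho(1)
    print(round(rho(1), 3), round(rho(2), 3))    # sample rho(1) near 0.44, rho(2) near 0
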
56
Q

What decays and what cuts off for an MA process

A

ACF - cuts off (after lag q)
PACF - decays

57
Q

What is the confidence interval for an empirical ACF called

A

Bartlett

58
Q

Write an equation for MA(inf)

A

Xt=εt+θ1εt-1+θ2εt-2+…

59
Q

What is the other name for MA(inf)

A

General linear process

60
Q

Define AR(1)

A

Xt=φXt-1+εt
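
A simulation sketch, assuming NumPy, of a stationary AR(1) with φ = 0.7: the sample ACF decays roughly like φ^h rather than cutting off, consistent with the AR cards further below.

    import numpy as np

    rng = np.random.default_rng(9)
    phi = 0.7
    T = 20_000
    eps = rng.normal(size=T)

    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + eps[t]    # X_t = phi * X_{t-1} + eps_t

    xbar = x.mean()
    def rho(h):
        # Sample autocorrelation at lag h
        return np.sum((x[:T - h] - xbar) * (x[h:] - xbar)) / np.sum((x - xbar) ** 2)

    print([round(rho(h), 2) for h in range(1, 4)])   # roughly phi, phi^2, phi^3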

61
Q

How is the stationarity property for MA different from that for AR

A

In the AR model it is explicitly postulated in the definition, while in the MA model it is not presupposed (it holds automatically)

62
Q

What happens to an AR(1) process if |φ| > 1, |φ| = 1 or |φ| < 1

A

|φ| < 1: weakly stationary
|φ| > 1: one can find a new white noise sequence for which Xt is stationary
|φ| = 1: non-stationary, as it is a random walk

63
Q

If all roots of the corresponding auxiliary equation lie inside the unit circle then the process is

A

Stationary

64
Q

MA(q) is invertible with respect to ε if and only if

A

Roots of the MA-polynomial lie outside of the unit circle

65
Q

What are this equation and its roots called? ψ(x) = x^p - φ1x^(p-1) - φ2x^(p-2) - … - φp = 0

A

Characteristic equation and characteristic roots

66
Q

What happens when the characteristic roots are complex numbers

A

A mixture of damped sine and cosine patterns

67
Q

What decays and what cuts off for an AR process

A

ACF - decays
PACF - cuts off (after lag p)

68
Q

Define ARMA(1,1)

A

Xt=φXt-1+εt+θεt-1
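
If statsmodels is available, arma_generate_sample can simulate from φ(B)Xt = θ(B)εt; note statsmodels' polynomial convention: the AR coefficients are passed with a leading 1 and flipped sign, i.e. (1, -φ), and the MA coefficients as (1, θ). The parameter values and seed are illustrative.

    import numpy as np
    from statsmodels.tsa.arima_process import arma_generate_sample

    phi, theta = 0.5, 0.4
    ar = np.array([1.0, -phi])     # coefficients of (1 - phi*B)
    ma = np.array([1.0, theta])    # coefficients of (1 + theta*B)

    np.random.seed(10)             # arma_generate_sample uses NumPy's global RNG by default
    x = arma_generate_sample(ar, ma, nsample=500)
    print(x[:5].round(2))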

69
Q

What are the restrictions on the φ and θ in the ARMA (1,1)

A

φ+θ≠0 and both φ and θ are real numbers

70
Q

What is the restriction imposed on the roots of AR and MA parts of the ARMA process

A

The AR and MA polynomials must not have any common roots

71
Q

When does ARMA have an infinite AR/MA representation

A

If the AR operator φ is stationary (i.e. all the roots of φ(x)=0 lie outside the unit circle) the ARMA process given by φ(B)Xt=θ(B)εt is indeed stationary and can be represented as an MA(inf) process. If the MA operator θ is invertible (i.e. all the roots of θ(x)=0 lie outside the unit circle) the ARMA process φ(B)Xt=θ(B)εt has an infinite AR representation.

72
Q

For ARMA(p,q), when q ≥ p, what does the ACF mimic after lag q-p

A

AR(p)

73
Q

For ARMA(p,q), when p ≥ q, what does the PACF mimic after lag p-q

A

MA(q)

74
Q

Which selection criteria are consistent and which are asymptotically efficient

A

AIC and AICc are asymptotically efficient
BIC is consistent

75
Q

How do we choose a model based on AIC/BIC

A

Choose the one which has the lowest value
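
A sketch of order selection along these lines with statsmodels' ARIMA class (assumed available): fit a few candidate ARMA(p,q) models to simulated ARMA(1,1) data and pick the order with the lowest AIC or BIC. The candidate set is an arbitrary illustration.

    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA
    from statsmodels.tsa.arima_process import arma_generate_sample

    np.random.seed(11)
    # True model: ARMA(1,1) with phi = 0.5, theta = 0.4 (statsmodels polynomial convention)
    x = arma_generate_sample(ar=[1.0, -0.5], ma=[1.0, 0.4], nsample=500)

    candidates = [(1, 0), (0, 1), (1, 1), (2, 1)]
    for p, q in candidates:
        fit = ARIMA(x, order=(p, 0, q)).fit()
        print(f"ARMA({p},{q}): AIC={fit.aic:.1f}  BIC={fit.bic:.1f}")
    # Choose the (p, q) with the lowest AIC (asymptotically efficient) or BIC (consistent)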