Time Series. Flashcards
Time series
A set of observations made at ordered time-points.
The index set
Time points at which the process is defined
State space
The set of values that the random variables Xt may take
Aims of time series
1. Draw inferences from the time series. 2. Examine the process generating the data. 3. Forecasting.
What is White Noise
1. E(εt) = 0. 2. Cov(εt, εs) = σε^2 if t = s, and 0 otherwise.
Gaussian Stochastic process
All of its finite-dimensional (marginal) distributions are Gaussian
What is Gaussian White Noise
If WN is also Gaussian
Are white noise shocks independent
White noise shocks are only uncorrelated, not necessarily independent. There can be dependence between absolute or squared shocks.
Why is Gaussian WN independent
Jointly normally distributed random variables are uncorrelated if and only if they are independent
What is Random Walk
Xt=Xt-1+εt
Random walk with drift
Xt=a0+Xt-1+εt
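The two random-walk cards above can be illustrated with a short NumPy sketch (function name, drift value and seed are my own choices):

```python
import numpy as np

def random_walk(T, a0=0.0, sigma=1.0, seed=0):
    """Simulate X_t = a0 + X_{t-1} + eps_t with X_0 = 0.
    a0 = 0 gives a plain random walk; a0 != 0 adds drift."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(0.0, sigma, T)  # white-noise shocks
    return np.cumsum(a0 + eps)       # cumulative sum of drift + shocks

x = random_walk(1000, a0=0.5)  # with drift, E(X_t) = 0.5 * t grows linearly
```

With a0 = 0.5 the level at t = 1000 is near 500; without drift the walk wanders around 0 with standard deviation sqrt(t).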
Weak Stationarity
1. Xt has finite second-order moments for all t in Z. 2. E(Xt) = E(Xs) for all t, s in Z. 3. Cov(Xt, Xs) = Cov(Xt+h, Xs+h) for all h in N. In words: the first- and second-order moments of the process do not depend on time.
Strong stationarity
The joint distribution does not depend on time: for any m time points t1, …, tm (from Z) and any lag h (from N), the joint distributions of (Xt1, …, Xtm) and (Xt1+h, …, Xtm+h) are identical
ACVF (AutoCoVariance Function)
γ(h)=Cov(Xt,Xt+h)
ACF (AutoCorrelation Function)
ρ(h)=Corr(Xt,Xt+h)
Properties of ACF and ACVF
1. Positive semidefinite. 2. Symmetric.
What does it mean if an ACF or ACVF matrix is not positive semidefinite
Process is not stationary
What do ACF and ACVF measure
Degree of dependence among the values of a time series at different times
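The sample versions of these functions can be sketched directly from the definitions γ(h) = Cov(Xt, Xt+h) and ρ(h) = γ(h)/γ(0) (function name is my own):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF: rho_hat(h) = gamma_hat(h) / gamma_hat(0), where
    gamma_hat(h) = (1/T) * sum_{t=1}^{T-h} (x_t - xbar)(x_{t+h} - xbar)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    xc = x - x.mean()
    gamma = np.array([xc[:T - h] @ xc[h:] for h in range(max_lag + 1)]) / T
    return gamma / gamma[0]
```

For an iid sequence, the values at h >= 1 should all be small, consistent with the approximate N(0, 1/T) distribution on a later card.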
What is the general approach to Time Series Modelling
1. Plot the series and examine the main features of the graph, checking in particular whether there is: a) a trend, b) a seasonal component, c) any apparent sharp changes in behaviour, d) any outlying observations. 2. Remove the trend and seasonal components to get stationary residuals. 3. Choose a model to fit the residuals. 4. Forecasting is achieved by forecasting the residuals and then inverting any transformations to arrive at forecasts of the original series.
Classical decomposition model
Xt = mt + st + Yt, where mt = trend function, st = seasonal component, Yt = zero-mean random noise component
What are deterministic component/s (signal) and what are stochastic component/s in the Classical decomposition model
mt and st are the signal; Yt is the noise
What should be done if the variance increases over time
Apply preliminary transformations, e.g. log
For models with a trend but no seasonality (Xt = mt + Yt), how can the trend be handled
Method 1: Trend estimation. a) Nonparametric: 1. moving average, 2. exponential smoothing. b) Model-based: fitting a polynomial trend. Method 2: Trend elimination by differencing.
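Method 1a (moving average) can be sketched as a two-sided window of size 2q+1 (function name is my own):

```python
import numpy as np

def moving_average_trend(x, q):
    """Estimate m_t as the average of the 2q+1 values x_{t-q}, ..., x_{t+q}.
    Only the interior points (where the full window fits) are returned."""
    w = np.full(2 * q + 1, 1.0 / (2 * q + 1))
    return np.convolve(x, w, mode="valid")
```

Applied to a noiseless linear trend, the estimator recovers the trend exactly on the interior points.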
Constant Mean Model (CMM)
Xt = m + Yt, where m is a constant. The main problem is that all observations are assigned equal weights.
What is the usual trade-off between choosing a large or small q
If q is small: faster reaction. If q is large: smaller variability.
What method is used to estimate parameters in the polynomial trend model
Method of least squares
The lag-1 difference operator
∇Xt=Xt-Xt-1=(1-B)Xt
Backward shift operator
BXt=Xt-1
What is the problem of applying difference operator for a small sample size
Each application of the difference operator reduces the sample size by 1
∇^2Xt
Xt-2Xt-1+Xt-2
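In NumPy, repeated lag-1 differencing is `np.diff`; a quadratic trend shows both the trend removal and the loss of one observation per pass:

```python
import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])  # x_t = t^2: a quadratic trend
d1 = np.diff(x)        # lag-1 difference X_t - X_{t-1}: still trending
d2 = np.diff(x, n=2)   # second difference X_t - 2X_{t-1} + X_{t-2}: constant
```

The second difference of a quadratic trend is constant, and each pass shortens the series by one observation.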
Two methods to deal with Trend and Seasonality
Method 1: Trend and seasonality estimation. a) Estimate the trend using a centred simple moving average. b) Estimate the seasonal component. c) (Optional) Re-estimate the trend of the deseasonalised data. d) Estimate the noise using the (re)estimated trend and the estimated seasonal component. Method 2: Differencing.
What if seasonal effect d is odd
q=(d-1)/2
What if seasonal effect d is even
q=d/2 but assign 0.5 weight to the first and last Xt
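The odd and even cases above can be combined in one centred moving-average sketch (half weights at the two window ends when d is even; function name is my own):

```python
import numpy as np

def centered_ma(x, d):
    """Centred moving average for seasonal period d.
    Odd d: plain average over d values.
    Even d: window of d+1 values with weight 1/2 on the two endpoints,
    all weights divided by d."""
    if d % 2 == 1:
        w = np.full(d, 1.0 / d)
    else:
        w = np.ones(d + 1)
        w[0] = w[-1] = 0.5
        w /= d
    return np.convolve(x, w, mode="valid")
```

On a pure linear trend both variants reproduce the centre values exactly, which is why this filter passes a linear trend through while averaging out a period-d seasonal pattern.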
∇dXt
Xt-Xt-d
Is ∇d == ∇^d
No. ∇d is the lag-d difference (Xt - Xt-d), while ∇^d applies the lag-1 difference operator d times.
What is the problem if h is close to T in the sample ACF/ACVF
Estimator is not very reliable
Rule of thumb for T and h in (sample ACF/ACVF)
T > 50 and h ≤ T/4
For Large T the sample ACF of an iid sequence with finite variance are approximately
~N(0,1/T)
CI for ACF
estimate ± 1.96/sqrt(T)
H:0 and H:1 for Portmanteau Test
H0: ρ(1) = ρ(2) = … = ρ(m) = 0. H1: at least one ρ(h) is not equal to 0.
How m is chosen for Portmanteau Test
Ad hoc
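One common portmanteau statistic is the Ljung-Box Q; a sketch under the hypotheses above, assuming SciPy for the chi-squared tail probability (function name is my own):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(x, m):
    """Ljung-Box statistic Q = T(T+2) * sum_{h=1}^m rho_hat(h)^2 / (T-h).
    Under H0 (no autocorrelation up to lag m), Q is approximately chi2(m)."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    xc = x - x.mean()
    gamma0 = xc @ xc / T
    rho = np.array([(xc[:T - h] @ xc[h:]) / T / gamma0 for h in range(1, m + 1)])
    Q = T * (T + 2) * np.sum(rho**2 / (T - np.arange(1, m + 1)))
    return Q, chi2.sf(Q, df=m)  # statistic and p-value
```

A small p-value rejects H0, i.e. indicates autocorrelation at some lag up to m.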
Power of a test
P(reject H0 | the alternative is true)
How can test normality
Jarque-Bera test; Q-Q plot
How does Q-Q plot works
It plots empirical quantiles against theoretical ones. In particular, the values in the sample, ordered from smallest to largest, are plotted against F^-1((i-0.5)/T), i = 1, 2, …, T, where F is the cumulative distribution function of the assumed distribution. If the assumed distribution is accurate, we expect an approximate 45-degree line through (0, 0).
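The plotting positions described above can be computed directly, assuming SciPy for the inverse CDF (function name is my own):

```python
import numpy as np
from scipy.stats import norm

def qq_points(x, dist=norm):
    """Return (theoretical, empirical) quantile pairs for a Q-Q plot:
    sorted data against F^{-1}((i - 0.5)/T), i = 1, ..., T."""
    emp = np.sort(np.asarray(x, dtype=float))
    T = len(emp)
    theo = dist.ppf((np.arange(1, T + 1) - 0.5) / T)  # F^{-1} plotting positions
    return theo, emp
```

If the sample really follows the assumed distribution, the returned pairs lie close to the 45-degree line.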
For which processes do ACVF and ACF exist
Weakly Stationary
For which processes do sample ACVF and ACF exist
Any observed series, including non-weakly-stationary ones, since they are computed directly from the data
What does convergence in mean square allows
It allows infinite linear combinations of past values (e.g. MA(inf) representations) to be well defined
Autocorrelation matrix
Rk := (ρ(i-j)), i, j = 1, …, k
α(h) (the PACF) measures what
The leftover correlation between Xt and Xt-h after accounting for their correlation with the intermediate values Xt-1, Xt-2, …, Xt-h+1
Markov process
E(Xt | Xt-1,Xt-2,…) = E(Xt | Xt-1)
Martingale process
E(Xt | Xt-1,Xt-2,…) = Xt-1
Example of Markov process
AR(1)
Example of Martingale process
RW without drift
Define MA(1)
Xt=εt+θεt-1
5 Properties of MA(1)
1. Xt is weakly stationary. 2. γ(h) = (1+θ^2)σ^2 if h = 0; θσ^2 if |h| = 1; 0 otherwise. 3. ρ(h) = 1 if h = 0; θ/(1+θ^2) if |h| = 1; 0 otherwise. 4. The PACF decays and does not cut off. 5. If |θ| < 1, X is invertible with respect to ε: εt = (Xt-μ) - θ(Xt-1-μ) + θ^2(Xt-2-μ) - + … If |θ| >= 1, X is not invertible with respect to ε. If |θ| > 1, X is invertible with respect to a new white noise z: zt = (Xt-μ) - (1/θ)(Xt-1-μ) + (1/θ^2)(Xt-2-μ) - + …
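The ACF property ρ(1) = θ/(1+θ^2) can be checked by simulation (a sketch; θ = 0.5 gives ρ(1) = 0.5/1.25 = 0.4):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, T = 0.5, 200_000
eps = rng.normal(size=T + 1)
x = eps[1:] + theta * eps[:-1]   # X_t = eps_t + theta * eps_{t-1}

xc = x - x.mean()
rho1 = (xc[:-1] @ xc[1:]) / (xc @ xc)  # sample ACF at lag 1
# should be close to theta / (1 + theta^2) = 0.4
```

With T this large the sampling error of rho1 is of order 1/sqrt(T), so the match to 0.4 is tight.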
What decays and what cuts off for MA
ACF - cuts off (after lag q). PACF - decays.
What is the confidence interval for an empirical ACF called
Bartlett's (based on Bartlett's formula)
Write an equation for MA(inf)
Xt=εt+θ1εt-1+θ2εt-2+…
What is the other name for MA(inf)
General linear process
Define AR(1)
Xt=φXt-1+εt
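A stationary AR(1) can be simulated recursively; for |φ| < 1 its ACF is ρ(h) = φ^h (a sketch, with φ = 0.8 and the starting value drawn from the stationary variance):

```python
import numpy as np

rng = np.random.default_rng(2)
phi, T = 0.8, 100_000
eps = rng.normal(size=T)
x = np.empty(T)
x[0] = eps[0] / np.sqrt(1 - phi**2)  # start at the stationary scale
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]   # X_t = phi * X_{t-1} + eps_t

xc = x - x.mean()
rho1 = (xc[:-1] @ xc[1:]) / (xc @ xc)  # sample ACF at lag 1, close to phi
```

Unlike the MA(1), the autocorrelation does not cut off: ρ(2) ≈ φ^2, ρ(3) ≈ φ^3, and so on.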
How does the stationarity property of an MA process differ from an AR process
For an AR process stationarity must be explicitly postulated (it holds only under conditions on φ), whereas for a finite MA process it is not presupposed: it holds automatically.
What happens to an AR(1) process if |φ| < 1, |φ| > 1, or |φ| = 1
|φ| < 1: weakly stationary. |φ| > 1: one can find a new white noise sequence with respect to which Xt is stationary. |φ| = 1: non-stationary, as it is a random walk.
If all roots of the corresponding auxiliary equation lie inside the unit circle then the process is
Stationary
MA(q) is invertible with respect to ε if and only if
Roots of the MA-polynomial lie outside of the unit circle
What are this equation and its roots called? ψ(x) = x^p - φ1x^(p-1) - φ2x^(p-2) - … - φp = 0
Characteristic equation and characteristic roots
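The root condition can be checked numerically with `np.roots` (helper name is my own):

```python
import numpy as np

def is_stationary(phis):
    """AR(p) stationarity check: all roots of the characteristic equation
    psi(x) = x^p - phi_1 x^{p-1} - ... - phi_p = 0 must lie inside
    the unit circle."""
    coeffs = np.concatenate(([1.0], -np.asarray(phis, dtype=float)))
    return bool(np.all(np.abs(np.roots(coeffs)) < 1))
```

For example, φ = 0.5 gives a root at 0.5 (stationary), φ = 1 gives a root at 1 (the random walk, non-stationary), and (φ1, φ2) = (1.5, -0.75) gives complex roots of modulus sqrt(0.75) < 1 - the complex-root case behind the damped sine/cosine ACF pattern on the next card.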
What happens when the characteristic roots are complex numbers
The ACF shows a mixture of damped sine and cosine patterns
What decays and what cuts off for AR
ACF - decays. PACF - cuts off (after lag p).
Define ARMA(1,1)
Xt=φXt-1+εt+θεt-1
What are the restrictions on the φ and θ in the ARMA (1,1)
φ+θ≠0 and both φ and θ are real numbers
What is the restriction imposed on the roots of AR and MA parts of the ARMA process
The AR and MA polynomials must not have any common roots
When does ARMA have an infinite AR/MA representation
If the AR operator φ is stationary (i.e. all roots of φ(x) = 0 lie outside the unit circle), the ARMA process φ(B)Xt = θ(B)εt is stationary and can be represented as an MA(inf) process. If the MA operator θ is invertible (i.e. all roots of θ(x) = 0 lie outside the unit circle), the ARMA process φ(B)Xt = θ(B)εt has an AR(inf) representation.
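For ARMA(1,1) the MA(inf) weights have a simple closed form, obtained by expanding (1+θB)/(1-φB): ψ0 = 1 and ψj = (φ+θ)φ^(j-1) for j >= 1. A sketch (function name is my own):

```python
import numpy as np

def arma11_psi_weights(phi, theta, n):
    """First n MA(inf) weights of a stationary ARMA(1,1):
    X_t = sum_j psi_j * eps_{t-j}, with psi_0 = 1 and
    psi_j = (phi + theta) * phi**(j-1) for j >= 1."""
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = (phi + theta) * phi ** (j - 1)
    return psi
```

The geometric decay in φ after the first weight is what makes the ACF of an ARMA(1,1) mimic an AR(1) beyond lag 1.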
For ARMA(p, q) with q >= p, what does the ACF mimic after lag q - p
AR(p)
For ARMA(p, q) with p >= q, what does the PACF mimic after lag p - q
MA(q)
Which selection criteria are consistent and which are asymptotically efficient
AIC and AICc are asymptotically efficient; BIC is consistent.
How do we choose a model based on AIC/BIC
Choose the one which has the lowest value