Models for Time Series Flashcards
white noise, moving average models, autoregressive models, mixed autoregressive moving average (ARMA) models, integrated models
White Noise
Definition
-a time series {Xt} is called white noise if the Xt are i.i.d. with E(Xt)=0 for all t and a finite, constant variance:
Var(Xt) = σ² < ∞
-to check this in practice, we compare the residuals of our model with the behaviour we expect from white noise
White Noise
Mean
μ(t) = E(Xt) = 0
White Noise
Autocovariance
γk = cov(Xt, Xt+k) = {σ², k=0 and 0, else
-since for white noise we model the Xt as independent
White Noise
Autocorrelation
ρk = {1, k=0 and 0, else
-since Xt are independent
White Noise
Remarks
1) often we assume Xt~N(0,σ²)
2) white noise is often used to model the residuals of more complicated time series
3) we usually denote a white noise process as {εt} with variance σε²
4) we can use the correlogram to distinguish between white noise and processes with dependence
Bartlett’s Theorem
-if {Xt} is white noise, then for large n the sample autocorrelations ρk^, k≠0, are approximately N(0,1/n) distributed
Identifying White Noise on a Correlogram
- if a time series is white noise, 95% of the lines on the correlogram should lie between -1.96/√n and +1.96/√n
- i.e. values |ρk^| > 1.96/√n are significant at the 5% level
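-a minimal NumPy sketch of this check (the simulated series x, its length and the lag cutoff are illustrative assumptions):

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelations rho_hat_k for k = 1..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.sum(x**2)
    return np.array([np.sum(x[:-k] * x[k:]) / denom for k in range(1, max_lag + 1)])

rng = np.random.default_rng(0)
x = rng.normal(size=500)             # simulated white noise, for illustration
rho_hat = sample_acf(x, max_lag=20)

band = 1.96 / np.sqrt(len(x))        # Bartlett band at the 5% level
print("band:", band)
print("share of lags outside the band:", np.mean(np.abs(rho_hat) > band))  # expect ~5%
```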
Moving Average Model
Definition
-a stochastic process {Xt} is called a moving average process of order q (or an MA(q) process) if:
Xt = Σ βk * εt-k
-sum from k=0 to k=q
-where β0,…,βq ∈ R and {εt} is white noise
-i.e. Xt can be written as a weighted average of near-past white noise
-near because we expect q to be small, since distant past events are unlikely to affect current ones
Moving Average Model
Remarks
1) without loss of generality, we can assume β0=1 (since we are free to choose σε² = Var(εt))
2) since {εt} is stationary, {Xt} is stationary if q is finite (q < ∞)
MA Process
Mean
μ(t) = E(Xt) = 0
MA Process
γ0
γ0 = (β0²+…+βq²) σε²
MA Process
γk
γk = {Σ βi βi+k σε² , 0≤k≤q and 0, else
-sum between i=0 and i=q-k
MA(0)
-an MA(0) process is white noise
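-a small NumPy sketch illustrating the MA(q) autocovariance formula above (the coefficients beta and the sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
beta = np.array([1.0, 0.6, -0.3])       # beta_0..beta_q with q = 2 (illustrative)
q, n = len(beta) - 1, 100_000
eps = rng.normal(size=n + q)            # white noise with sigma_eps^2 = 1

# X_t = sum_{k=0}^{q} beta_k * eps_{t-k}
x = np.convolve(eps, beta, mode="valid")

# theoretical gamma_k = sum_{i=0}^{q-k} beta_i * beta_{i+k} * sigma_eps^2
gamma = [sum(beta[i] * beta[i + k] for i in range(q - k + 1)) for k in range(q + 1)]

xc = x - x.mean()
emp = [np.mean(xc[: len(x) - k] * xc[k:]) for k in range(q + 2)]
print("theoretical:", np.round(gamma, 3))   # gamma_0..gamma_q
print("empirical:  ", np.round(emp, 3))     # last entry (k = q+1) should be near 0
```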
Moving Average Model
Finding β
- in practice, when applying a moving average model to data we don’t know what the β values should be
- to estimate the βs we can express them in terms of the autocorrelations, β = β(ρ); e.g. for MA(1) with β0=1, ρ1 = β1/(1+β1²)
- since we can estimate ρ from the data, this allows us to estimate β as well
- if solving gives two values of β (e.g. the two roots of a quadratic), choose the root of smallest magnitude, since we don’t expect the data to depend too strongly on the past (this is also the invertible choice; see the sketch below)
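-a sketch of the MA(1) case (assuming β0=1, so ρ1 = β1/(1+β1²); the function name and example value are illustrative):

```python
import numpy as np

def ma1_beta_from_rho1(rho1):
    """Solve rho1 = beta / (1 + beta^2); returns the smaller-magnitude root.
    Requires |rho1| <= 0.5 (otherwise no real solution exists)."""
    if rho1 == 0:
        return 0.0
    disc = 1.0 - 4.0 * rho1**2
    if disc < 0:
        raise ValueError("|rho1| > 0.5: no real MA(1) solution")
    roots = ((1 + np.sqrt(disc)) / (2 * rho1), (1 - np.sqrt(disc)) / (2 * rho1))
    return min(roots, key=abs)

print(ma1_beta_from_rho1(0.4))   # 0.5 rather than the larger root 2.0
```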
Autoregressive Model
Definition
-a stochastic process {Xt} is an autoregressive process of order p, AR(p), if:
Xt = Σ αk*Xt-k + εt, for all t
-sum from k=1 to k=p
-where α1,…,αp∈R, {εt} is white noise and εt is independent of Xs for all s<t
Autoregressive Model
Remarks
- when constructing the process {Xt}, the first p values need to be specified as initial conditions
- we shall see that whether or not {Xt} is stationary depends on α1,…,αp and the initial conditions
Random Walk
-an AR(1) process with α=1:
Xt = Xt-1 + εt
Expectation of an AR(1) Process
Xt = α*Xt-1 + εt
- since E(εt)=0 :
E(Xt) = α E(Xt-1) = α^t E(X0)
Variance of an AR(1) Process
Xt = α*Xt-1 + εt
Var(Xt) = α²Var(Xt-1) + σε²
When is a general AR(1) process weakly stationary?
Xt = α*Xt-1 + εt
- a general AR(1) process is weakly stationary when:
1) |α|<1
2) E(Xt) = 0 for all t
3) Var(Xt) = σε² / [1-α²] for all t including t=0
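-a quick NumPy simulation consistent with these conditions (α, σε and the sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, sigma_eps, n = 0.7, 1.0, 100_000     # illustrative, with |alpha| < 1

x = np.empty(n)
x[0] = rng.normal(scale=sigma_eps / np.sqrt(1 - alpha**2))   # stationary start
for t in range(1, n):
    x[t] = alpha * x[t - 1] + rng.normal(scale=sigma_eps)

print("empirical variance:  ", x.var())
print("theoretical variance:", sigma_eps**2 / (1 - alpha**2))   # 1/(1-0.49) ~ 1.96
```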
Stationarity of AR(p) Proposition
-an AR(p) process is stationary if and only if all the roots y1,…,yp of the equation:
α(y) = 1 - α1y - … - αpy^p = 0
-are such that |yi|>1
The Backshift Operator
Definition
-the backshift operator, B, is defined by:
B Xt = Xt-1, for all t
Step 1 - AR(p) Process in Terms of the Backshift Operator
Xt = α1Xt-1 + … + αpXt-p + εt
= α1*B(Xt) + … + αp*B^p(Xt) + εt
=> (1 - α1*B - … - αp*B^p) Xt = εt
=> Xt = [1 - Σ αk*B^k]^(-1) εt
-sum from k=1 to k=p
-apply a binomial expansion to the inverted bracket
Step 2 - Define and Find ck
-we have α(B) Xt = εt, where:
α(B) = 1 - Σ αk*B^k
-sum from k=1 to k=p
-then Xt = α(B)^(-1) εt
-to get rid of the inverse, we shall find numbers ck such that:
1/α(y) = Σ ck*y^k
-sum from k=0 to k=∞
-so that Xt = Σ ck*εt-k
-for an AR(p) process with distinct roots y1,…,yp of α(y)=0, partial fractions give:
ck = A1/y1^k + … + Ap/yp^k
Step 3 - assume Xt is weakly stationary
Var(Xt) = (Σ ck²) σε²
-for this to exist, i.e. be a finite constant, we need:
Σ ck² < ∞
Σ (A1/y1^k + … + Ap/yp^k)² < ∞
-an AR(p) process is stationary <=> |yi|>1 for all i=1,…,p
Stationarity of AR(2)
-for an AR(2) process, we have:
α(y) = 1 - α1y - α2y²
-roots given by the quadratic formula
-if α1²+4α2>0 we have two real roots
-if α1²+4α2<0, we have two complex roots
-the process is stationary if and only if α2>-1 AND α2<1-α1 AND α2<1+α1 (the stationarity triangle)
Stationarity of AR(p), p>2
-for AR(p) with p>2, use a computer to find the roots numerically, as sketched below
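-a sketch of this root check with NumPy (function name and example coefficients are illustrative):

```python
import numpy as np

def is_stationary_ar(alphas):
    """True iff every root of alpha(y) = 1 - alpha_1*y - ... - alpha_p*y^p
    satisfies |y| > 1."""
    # np.roots takes coefficients from the highest power down to the constant
    coeffs = [-a for a in reversed(alphas)] + [1.0]
    return bool(np.all(np.abs(np.roots(coeffs)) > 1))

print(is_stationary_ar([0.5, 0.3]))   # True: inside the AR(2) stationarity triangle
print(is_stationary_ar([1.2]))        # False: AR(1) with |alpha| > 1
```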
Autocovariance of AR(p)
γk = Σ αj * γk-j
- for all k≥1
- sum from j=1 to j=p
Autocorrelation of AR(p)
ρk = Σ αj * ρk-j
-for all k≥1
-sum from j=1 to j=p
Yule Walker Equations
-used to determine α1,α2,…,αp from ρ1,ρ2,…,ρp for AR(p) models
ρ1 = 1*α1 + ρ1*α2 + … + ρp-1*αp
ρ2 = ρ1*α1 + 1*α2 + … + ρp-2*αp
…
ρp = ρp-1*α1 + ρp-2*α2 + … + 1*αp
-from the data we can estimate the ρs and then use these equations to calculate the αs
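-a NumPy sketch of solving the Yule-Walker equations (function name and the AR(2) example values are illustrative):

```python
import numpy as np

def yule_walker_alphas(rho):
    """Solve the Yule-Walker equations for alpha_1..alpha_p,
    given rho = [rho_1, ..., rho_p]; rho_0 = 1 is implicit."""
    p = len(rho)
    full = np.concatenate(([1.0], rho))                  # rho_0, rho_1, ..., rho_p
    R = np.array([[full[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, np.asarray(rho))

# check on an AR(2) with alpha = (0.5, 0.3):
# rho_1 = alpha_1/(1 - alpha_2), rho_2 = alpha_1*rho_1 + alpha_2
rho1 = 0.5 / (1 - 0.3)
print(yule_walker_alphas([rho1, 0.5 * rho1 + 0.3]))      # -> [0.5, 0.3]
```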
AR(p) Model Fitting Steps
- to fit an AR(p) model to data {Xt}=(X1,…,Xn):
1) subtract the trend and the seasonal effects from {Xt} to obtain residuals {Yt}
2) estimate the acf of Y to obtain ρ1^,ρ2^,…,ρp^
3) solve Yule-Walker equations for α1^,…,αp^
4) compute the residuals Zt = Yt - α1^Yt-1 - … - αp^Yt-p; use Bartlett bands to check whether {Zt} is (approximately) white noise; if not, the model is not a good fit for the data
5) use sample variance of {Zt} to estimate σε²
6) add the trend and seasonal effects back on to translate conclusions about {Yt} into conclusions about {Xt} (steps 2-5 are sketched below)
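-a NumPy sketch of steps 2-5 for p=2 (the simulated residuals stand in for the detrended data of step 1; all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# step 1 assumed done: y plays the role of the detrended, deseasonalised
# residuals (here simulated from a known AR(2) for illustration)
n, a1, a2 = 5_000, 0.5, 0.3
y = np.zeros(n)
for t in range(2, n):
    y[t] = a1 * y[t - 1] + a2 * y[t - 2] + rng.normal()

def acf(v, lags):
    vc = v - v.mean()
    return np.array([np.sum(vc[: len(v) - k] * vc[k:]) / np.sum(vc**2) for k in lags])

# step 2: sample autocorrelations; step 3: solve Yule-Walker for p = 2
rho = acf(y, (1, 2))
R = np.array([[1.0, rho[0]], [rho[0], 1.0]])
a_hat = np.linalg.solve(R, rho)
print("alpha_hat:", a_hat)                       # close to (0.5, 0.3)

# step 4: residuals Z_t, checked against Bartlett bands
z = y[2:] - a_hat[0] * y[1:-1] - a_hat[1] * y[:-2]
rho_z = acf(z, range(1, 21))
print("outside bands:", np.mean(np.abs(rho_z) > 1.96 / np.sqrt(len(z))))  # ~0.05

# step 5: estimate the white noise variance
print("sigma_eps^2 estimate:", z.var())          # close to 1
```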
ARMA Model
Definition
-an ARMA(p,q) model satisfies:
Xt = Σ αiXt-i + εt + Σ βjεt-j
-with εt independent of Xt-1,Xt-2,…
-sum from i=1 to i=p and j=1 to j=q
ARMA Model
In Terms of the Backshift Operator
α(B) Xt = β(B) εt
-where:
α(y) = 1 - Σ αi*y^i
-sum from i=1 to i=p
-and:
β(y) = 1 + Σ βj*y^j
-sum from j=1 to j=q
ARMA Model
Weak Stationarity
-as for AR(p), we can write:
Xt = α(B)^(-1) β(B) εt
-weakly stationary if:
Xt = Σ λk*β(B)*εt-k = Σ λk~*εt-k
-with Σ (λk~)² < ∞
-sums from k=0 to k=∞
-equivalent to the roots of α(y) lying outside the complex unit circle
ARMA Model
Stationarity and Expectation
-if stationary, we have:
E(Xt) = Σ λk~*E(εt-k) = 0
-sum from k=0 to k=∞
ARMA Model
Reconstruct Noise as a Function of the Data and Invertibility
εt = β(B)^(-1) * α(B) * Xt = Σ δk*Xt-k
- sum from k=0 to k=∞
- if Σ δk² < ∞, then the process is invertible
- equivalently, {Xt} is invertible if the roots of β lie outside the complex unit circle
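-a sketch checking both conditions via the roots of α and β (helper name and coefficients are illustrative):

```python
import numpy as np

def roots_outside_unit_circle(coefs):
    """True iff all roots of 1 + c_1*y + ... + c_m*y^m satisfy |y| > 1."""
    poly = [c for c in reversed(coefs)] + [1.0]   # highest power first, for np.roots
    return bool(np.all(np.abs(np.roots(poly)) > 1))

alphas = [0.5, 0.3]    # alpha(y) = 1 - 0.5y - 0.3y^2
betas = [0.4]          # beta(y) = 1 + 0.4y

print("stationary:", roots_outside_unit_circle([-a for a in alphas]))   # True
print("invertible:", roots_outside_unit_circle(betas))                  # True
```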
ARMA Model
Autocovariance
-autocovariance of an ARMA(p,q) process is given by:
γk = Σ αi * γk-i , k>q
-sum from i=1 to i=p
-recall that an AR(p) process satisfies the same relation but for all k≥1, so AR and ARMA models show the same behaviour for k>q
Difference Operator
Definition
∇(Xt) = Xt - Xt-1 , for all t
-where ∇ is the difference operator
Difference Operator
Backshift Operator and Stationarity
∇ = 1 - B
-if {Xt} has stationary increments then ∇X is stationary
Difference Operator
Constant Mean and Linear Trend
-applying the difference operator to a series with constant mean removes the mean:
∇(Xt + μ) = Xt + μ - (Xt-1 + μ) = Xt - Xt-1 = ∇Xt
-applying the difference operator to a linear trend converts it to a constant mean:
∇(Xt + a + bt) = Xt + a + bt - (Xt-1 + a + b(t-1)) = Xt - Xt-1 + b
= ∇Xt + b
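-a short NumPy illustration of differencing a linear trend (the intercept a and slope b are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.arange(200)
a, b = 3.0, 0.5                              # illustrative intercept and slope
x = a + b * t + rng.normal(size=t.size)      # noise around a linear trend

dx = np.diff(x)                              # nabla X_t = X_t - X_{t-1}
print("mean of the differenced series:", dx.mean())   # close to the slope b = 0.5
```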
ARIMA Model
Definition
-autoregressive integrated moving average process
{Xt} is an ARIMA(p,d,q) process if ∇^d Xt is a stationary ARMA(p,q) process
ARIMA Model
In Terms of ARMA Model
-an ARIMA(p,d,q) process {Xt} can be written as an ARMA(p+d,q) process whose AR polynomial has d unit roots, and hence is non-stationary for d>0
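-a minimal illustration: a random walk is ARIMA(0,1,0), and one difference recovers stationary white noise:

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.cumsum(rng.normal(size=5_000))   # random walk: ARIMA(0,1,0), non-stationary

dx = np.diff(x)                         # one application of nabla recovers white noise
print("variance before differencing:", x.var())    # large, grows with n
print("variance after differencing: ", dx.var())   # close to sigma_eps^2 = 1
```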