Model Fitting Flashcards
model selection, parameter estimation (method of moments, least squares, MLE), MOM and invertibility for MA
Model Selection -ρk and ACF
MA(q)
ρk = 0 for all k>q
-so the sticks drop to 0 after lag q
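A quick numerical check of the cutoff (a minimal NumPy sketch; the coefficient 0.8, the seed and the helper sample_acf are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate an MA(1) process X_t = eps_t + 0.8*eps_{t-1}  (0.8 is illustrative)
n = 5000
eps = rng.standard_normal(n + 1)
x = eps[1:] + 0.8 * eps[:-1]

def sample_acf(x, max_lag):
    """Sample autocorrelations r_1..r_{max_lag}."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return np.array([np.dot(x[:-k], x[k:]) / len(x) / c0
                     for k in range(1, max_lag + 1)])

r = sample_acf(x, 5)
# theoretical rho_1 = 0.8/(1 + 0.8^2) ≈ 0.49, and rho_k ≈ 0 for k > 1,
# so the sticks beyond lag 1 should be close to zero
```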
Model Selection - ρk
AR(p)
ρk = α1ρk-1 + … + αpρk-p
Model Selection - ρk
ARMA(p,q)
-as for AR(p) except for the first q values
Model Selection - ACF
ARIMA
ρk decays very slowly, so we apply differencing until ρk indicates an AR, MA, or ARMA process
Spotting AR Processes Lemma
-let y1,…,yp be roots of α(y) = 1 - α1y - … - αpy^p
-then for AR(p):
ρk = Σ cj/yj^k
-sum from j=1 to j=p
-for suitable constants c1,…,cp
AR(1), ACF
ρk = α^k
- for α>0, exponential decay
- for α<0, exponential decay with alternating signs
AR(p), p>1, ACF
ρk = c^k cos(φk)
- damped oscillations for some constants φ, c with |c|<1 (this form arises when α(y) has complex roots)
- looks like a sine/cosine wave of sticks whose peak amplitude decreases exponentially
Partial Autocorrelation Function
Definition
- for k=1,2,… (however many you think are necessary) use the Yule-Walker equations to fit an AR(k) model (using ρ1,…,ρk) obtaining coefficients αk1,…,αkk
- then αkk = lag-k partial autocorrelation function (pacf) of {Xt}
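The definition above can be computed directly: fit successively larger AR(k) models via the Yule-Walker equations and keep the last coefficient each time (a NumPy sketch; the helper name and the example AR(1) ACF with α = 0.6 are illustrative):

```python
import numpy as np

def pacf_from_acf(rho, max_lag):
    """lag-k pacf alpha_kk: fit AR(k) by Yule-Walker for k = 1..max_lag.
    rho is a sequence with rho[0] = 1 and rho[k] the lag-k autocorrelation."""
    out = []
    for k in range(1, max_lag + 1):
        # Yule-Walker system: R alpha = r, with R Toeplitz in rho_0..rho_{k-1}
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        r = np.array(rho[1:k + 1])
        alpha = np.linalg.solve(R, r)
        out.append(alpha[-1])          # alpha_kk = lag-k pacf
    return np.array(out)

# AR(1) with alpha = 0.6 has rho_k = 0.6**k, so the pacf should be
# 0.6 at lag 1 and 0 at every later lag (the cutoff after lag p below)
rho = [0.6 ** k for k in range(6)]
pacf = pacf_from_acf(rho, 5)
```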
AR(p) - PACF
-cuts off after lag p
MA(q) - PACF
-exponential decay / damped oscillations
ARMA(p,q) - PACF
-exponential decay / damped oscillations
MA(q), ρk^ Distribution
-if X1,...,Xn is MA(q), then for k>q:
ρk^ ~ N(0, 1/n [1 + 2Σρi^²])
-sum from i=1 to i=q
=> 95% of sticks after lag q fall in range: [-1.96·sd, 1.96·sd]
AR(p), αkk^ Distribution
-if X1,...,Xn is AR(p), then for k>p:
αkk^ ~ N(0, 1/n)
=> 95% of sticks after lag p fall in range: [-1.96/√n, 1.96/√n]
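The two 95% bands can be evaluated as a small sketch (the values q = 1, ρ1 = 0.4, n = 400 and the helper acf_band are illustrative, not from the notes):

```python
import math

def acf_band(rho, n):
    """Half-width of the 95% band for ACF sticks beyond lag q of an MA(q):
    1.96 * sqrt((1 + 2 * sum rho_i^2) / n), rho = [rho_1, ..., rho_q]."""
    sd = math.sqrt((1 + 2 * sum(r * r for r in rho)) / n)
    return 1.96 * sd

# MA(1) with rho_1 = 0.4 observed over n = 400 points (illustrative numbers)
half = acf_band([0.4], 400)

# for AR(p), alpha_kk_hat ~ N(0, 1/n) beyond lag p, so the band is simply:
pacf_half = 1.96 / math.sqrt(400)
```

Sticks outside these bands suggest the corresponding lag is genuinely non-zero rather than sampling noise.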
Parameter Estimation - Non-Zero Mean
-so far we have assumed that stationary processes have zero mean
-we can cope with non-zero mean μ for {Xt} by modelling:
Yt = Xt - μ
-we estimate μ by μ^ = 1/n Σ Xt
-sum from t=1 to t=n
-subtract μ^ from observations to estimate {Yt} and proceed with fitting model to {Yt}
Method of Moments
Definition
- choose parameters such that the means and autocorrelations of the model and the sample coincide
- for AR(p) models this gives the Yule-Walker equations
Method of Moments
Steps
-can estimate σε² via method of moments assuming {Xt} is stationary
-write Xt in terms of the model, e.g. for AR(1):
Xt = α1Xt-1 + εt
-take variance of Xt
σx² = Var(Xt) = Var(α1Xt-1 + εt)
-and write in terms of σε²
-rewrite in terms of σε² for an estimate of σε²
-in the estimate, σx² can be replaced by the sample variance
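The steps above for AR(1) can be sketched numerically: Var(Xt) = α1²Var(Xt) + σε² gives σε² = (1 − α1²)σx², with α1 estimated from the lag-1 sample autocorrelation and σx² from the sample variance (simulation values α1 = 0.7, σε = 1 and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate AR(1): X_t = 0.7 X_{t-1} + eps_t with sigma_eps = 1 (illustrative)
n = 20000
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + eps[t]

# method of moments: match lag-1 autocorrelation and variance
xc = x - x.mean()
alpha_hat = np.dot(xc[:-1], xc[1:]) / np.dot(xc, xc)   # Yule-Walker: alpha1_hat = r_1
s2 = xc.var()                                          # sample variance replaces sigma_x^2
# Var(X_t) = alpha^2 Var(X_t) + sigma_eps^2  =>  sigma_eps^2 = (1 - alpha^2) sigma_x^2
sigma_eps2_hat = (1 - alpha_hat ** 2) * s2
```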
Least Squares Estimation
Definition
-choose parameters to minimise the residual sum of squares
Least Squares Estimation
Steps
-rearrange model to write εt = …
-calculate the residual sum of squares:
S(α) = Σ εt²
-find α to minimise S(α) using ∂S(α)/∂α = 0
-can then estimate σε² using the sample variance equation:
σε²^ = 1/(n-1) Σ εt²
-sum from t=1 to t=n
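A minimal least-squares sketch for AR(1) (the value α = −0.5 and the seed are illustrative): writing εt = Xt − αXt−1 and setting ∂S(α)/∂α = 0 gives the closed form used below.

```python
import numpy as np

rng = np.random.default_rng(2)

# simulate AR(1) with alpha = -0.5 (illustrative)
n = 10000
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = -0.5 * x[t - 1] + eps[t]

# S(alpha) = sum (x_t - alpha x_{t-1})^2; dS/dalpha = 0 gives:
alpha_hat = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])

# estimate sigma_eps^2 from the residuals via the sample-variance formula
resid = x[1:] - alpha_hat * x[:-1]
sigma_eps2_hat = np.sum(resid ** 2) / (len(resid) - 1)
```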
Likelihood
Definition
L(α; X) = f(X; α)
Maximum Likelihood Estimation
Definition
-to use MLE, we need to assume a distribution for εt, then maximise L(α; X) = f(X; α)
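Assuming Gaussian εt, the conditional likelihood of an AR(1) can be maximised numerically (a sketch; the grid search, seed and true value α = 0.4 are illustrative — any numerical optimiser would do the same job):

```python
import numpy as np

rng = np.random.default_rng(3)

# simulate AR(1) with alpha = 0.4 and Gaussian errors (illustrative values)
n = 5000
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.4 * x[t - 1] + eps[t]

def cond_loglik(alpha, x):
    """Gaussian conditional log-likelihood of AR(1), sigma_eps^2 profiled out."""
    resid = x[1:] - alpha * x[:-1]
    s2 = np.mean(resid ** 2)            # MLE of sigma_eps^2 for this alpha
    m = len(resid)
    return -0.5 * m * (np.log(2 * np.pi * s2) + 1)

# crude grid search for the maximiser (spacing 0.001)
grid = np.linspace(-0.99, 0.99, 1981)
alpha_mle = grid[np.argmax([cond_loglik(a, x) for a in grid])]
```

With Gaussian errors the conditional MLE of α coincides with the least-squares estimate.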
Invertibility
Definition
-a process is invertible if εt can be written in terms of present and past values of Xt, with the influence of the past decaying exponentially so that more distant events have a smaller effect on the present than the recent past
Invertibility for MA(1) Processes
-for an MA(1) process:
Xt = β εt-1 + εt
-with εt ~ N(0,σε²)
-equivalently: Xt = (1 + βB) εt
=> Xt = εt + βXt-1 - β²Xt-2 + β³Xt-3 - ...
-if |β|>1 the distant past has more influence than the recent past
-so for invertibility we need |β|<1
Invertibility for MA(q) Processes
-an MA(q) process is invertible if all roots of β(y) are outside the unit circle
-where:
β(y) = 1 + β1y + β2y² + … + βqy^q
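The root condition can be checked numerically with numpy.roots (the helper is_invertible and the example coefficients are illustrative):

```python
import numpy as np

def is_invertible(betas):
    """Check MA(q) invertibility for X_t = eps_t + beta_1 eps_{t-1} + ... + beta_q eps_{t-q}:
    all roots of beta(y) = 1 + beta_1 y + ... + beta_q y^q must lie outside the unit circle."""
    coeffs = list(betas[::-1]) + [1.0]   # numpy.roots wants highest degree first
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1))

# MA(1): invertible iff |beta| < 1 (the root of 1 + beta*y is y = -1/beta)
inv_a = is_invertible([0.5])    # root at -2, outside the unit circle
inv_b = is_invertible([2.0])    # root at -0.5, inside -> not invertible
```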