Model Fitting Flashcards

model selection, parameter estimation (method of moments, least squares, MLE), MOM and invertibility for MA

1
Q

Model Selection - ρk and ACF

MA(q)

A

ρk = 0 for all k > q

-so the sticks drop to 0 after lag q

2
Q

Model Selection - ρk

AR(p)

A

ρk = α1ρk-1 + … + αpρk-p

3
Q

Model Selection - ρk

ARMA(p,q)

A

-as for AR(p), except for the first q values

4
Q

Model Selection - ACF

ARIMA

A

ρk decays very slowly, so we apply differencing until ρk indicates an AR, MA, or ARMA model

5
Q

Spotting AR Processes Lemma

A

-let y1,…,yp be roots of α(y) = 1 - α1y - … - αpy^p
-then for AR(p):
ρk = Σ cj/yj^k
-sum from j=1 to j=p
-for suitable constants c1,…,cp

6
Q

AR(1), ACF

A

ρk = α^k

  • for α>0, exponential decay
  • for α<0, exponential decay with alternating signs
7
Q

AR(p), p>1, ACF

A

ρk = c^k cos(φk)

  • damped oscillations for some constants φ, c with |c|<1
  • looks like sine/cos wave of sticks with peak amplitude decreasing exponentially
8
Q

Partial Autocorrelation Function

Definition

A
  • for k=1,2,… (however many you think are necessary) use the Yule-Walker equations to fit an AR(k) model (using ρ1,…,ρk) obtaining coefficients αk1,…,αkk
  • then αkk = lag-k partial autocorrelation function (pacf) of {Xt}
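The definition above can be sketched in Python. The Durbin-Levinson recursion below is one standard way of solving the Yule-Walker system for each successive AR(k) fit; the function names are illustrative, not from the course.

```python
def sample_acf(x, max_lag):
    """Sample autocorrelations rho_1, ..., rho_max_lag of a series x."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    denom = sum(v * v for v in d)
    return [sum(d[t] * d[t + k] for t in range(n - k)) / denom
            for k in range(1, max_lag + 1)]

def pacf(x, max_lag):
    """Lag-k PACF alpha_kk for k = 1..max_lag, via the Durbin-Levinson
    recursion (which solves the Yule-Walker equations for each AR(k) fit)."""
    rho = [1.0] + sample_acf(x, max_lag)  # rho[0] = 1
    phi_prev = []   # AR(k-1) coefficients alpha_{k-1,1..k-1}
    out = []
    for k in range(1, max_lag + 1):
        num = rho[k] - sum(phi_prev[j] * rho[k - 1 - j] for j in range(k - 1))
        den = 1.0 - sum(phi_prev[j] * rho[j + 1] for j in range(k - 1))
        a_kk = num / den   # lag-k partial autocorrelation
        # update to the full AR(k) coefficient vector alpha_{k,1..k}
        phi_prev = [phi_prev[j] - a_kk * phi_prev[k - 2 - j]
                    for j in range(k - 1)] + [a_kk]
        out.append(a_kk)
    return out
```

For an AR(1) series with coefficient α, the lag-1 PACF should be close to α and later lags close to 0, matching card 9.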
9
Q

AR(p) - PACF

A

-cuts off after lag p

10
Q

MA(q) - PACF

A

-exponential decay / damped oscillations

11
Q

ARMA(p,q) - PACF

A

-exponential decay / damped oscillations

12
Q

MA(q), ρk^ Distribution

A
-if X1,...,Xn is MA(q), then for k>q:
ρk^ ~ N(0, 1/n [1 + 2Σρi^²])
-sum from i=1 to i=q
=>
95% of sticks after lag q fall in the range:
[-1.96 * sd, 1.96 * sd]
13
Q

AR(p), αkk^ Distribution

A
-if X1,...,Xn is AR(p), then for k>p:
αkk^ ~ N(0, 1/n)
=>
95% of sticks after lag p fall in the range:
[-1.96 * 1/√n, 1.96 * 1/√n]
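These two distributional results give the bands usually drawn on sample ACF/PACF plots. A minimal sketch (helper names are my own, not from the course):

```python
import math

def pacf_band(n):
    """AR(p) PACF bound (card 13): alpha_kk^ ~ N(0, 1/n) for k > p,
    so ~95% of sticks beyond lag p should fall inside +/- 1.96/sqrt(n)."""
    return 1.96 / math.sqrt(n)

def acf_band(n, rho_hat, q):
    """MA(q) ACF bound (card 12): for k > q,
    rho_k^ ~ N(0, (1/n) * (1 + 2 * sum_{i=1}^{q} rho_i^2)),
    with rho_i replaced by the sample estimates rho_hat."""
    var = (1.0 + 2.0 * sum(r * r for r in rho_hat[:q])) / n
    return 1.96 * math.sqrt(var)
```

For example, with n = 400 observations the PACF band is ±1.96/20 = ±0.098; the ACF band is always at least as wide as ±1.96/√n because of the Σρi² term.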
14
Q

Parameter Estimation - Non-Zero Mean

A

-so far we have assumed that stationary processes have zero mean
-we can cope with non-zero mean μ for {Xt} by modelling:
Yt = Xt - μ
-we estimate μ by μ^ = 1/n Σ Xt
-sum from t=1 to t=n
-subtract μ^ from observations to estimate {Yt} and proceed with fitting model to {Yt}

15
Q

Method of Moments

Definition

A
  • choose parameters such that means and correlation for model and sample coincide
  • for AR(p) models this gives the Yule-Walker equations
16
Q

Method of Moments

Steps

A

-can estimate σε² via method of moments assuming {Xt} is stationary
-write Xt in terms of the model, e.g. for AR(1):
Xt = α1Xt-1 + εt
-take variance of Xt
σx² = Var(Xt) = Var(α1Xt-1 + εt) = α1²σx² + σε²
-using stationarity and independence of Xt-1 and εt
-rearrange for an estimate of σε²: σε² = (1 - α1²)σx²
-in the estimate, α1 and σx² can be replaced by α1^ and the sample variance
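The AR(1) steps above can be sketched directly; this is a hypothetical helper (α1^ = ρ1^ comes from the Yule-Walker equations of card 15):

```python
def mom_ar1(x):
    """Method-of-moments fit of AR(1): alpha1^ = rho1^ (Yule-Walker),
    then sigma_eps^2 estimated from
    Var(Xt) = Var(alpha1*X_{t-1} + eps_t) = alpha1^2 * Var(Xt) + sigma_eps^2."""
    n = len(x)
    mean = sum(x) / n
    d = [v - mean for v in x]
    s2x = sum(v * v for v in d) / n                     # sample variance of Xt
    rho1 = sum(d[t] * d[t + 1] for t in range(n - 1)) / (n * s2x)
    alpha1 = rho1                                       # Yule-Walker for AR(1)
    sigma_eps2 = (1.0 - alpha1 ** 2) * s2x              # rearranged variance equation
    return alpha1, sigma_eps2
```

On a long simulated AR(1) series with α1 = 0.5 and unit innovation variance, both estimates should land near the true values.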

17
Q

Least Squares Estimation

Definition

A

-choose parameters to minimise the residual sum of squares

18
Q

Least Squares Estimation

Steps

A

-rearrange model to write εt = …
-calculate the residual sum of squares:
S(α) = Σ εt²
-find α to minimise S(α) using ∂S(α)/∂α = 0
-can then estimate σε² using the sample variance equation:
σε²^ = 1/(n-1) Σ εt²
-sum from t=1 to t=n
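For AR(1) these steps have a closed form: with εt = xt - αxt-1, setting ∂S(α)/∂α = 0 gives α^ = Σ xt-1xt / Σ xt-1². A sketch (assumes the mean has already been removed; the function name is illustrative):

```python
def ls_ar1(x):
    """Conditional least squares for AR(1): eps_t = x_t - alpha * x_{t-1}.
    Minimising S(alpha) = sum eps_t^2 via dS/dalpha = 0 gives the
    closed form alpha^ = sum(x_{t-1} * x_t) / sum(x_{t-1}^2)."""
    n = len(x)
    num = sum(x[t - 1] * x[t] for t in range(1, n))
    den = sum(x[t - 1] ** 2 for t in range(1, n))
    alpha = num / den
    # estimate sigma_eps^2 from the residual sum of squares
    rss = sum((x[t] - alpha * x[t - 1]) ** 2 for t in range(1, n))
    sigma_eps2 = rss / (n - 1)
    return alpha, sigma_eps2
```

On simulated AR(1) data this should agree closely with the method-of-moments fit from card 16.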

19
Q

Likelihood

Definition

A

L(α; X) = f(X; α)

20
Q

Maximum Likelihood Estimation

Definition

A

-to use MLE, we need to assume a distribution for εt, then maximise L(α; X) = f(X; α)

21
Q

Invertibility

Definition

A

-the influence of the past decays exponentially, so that more distant observations have a smaller effect on the present than the recent past

22
Q

Invertibility for MA(1) Processes

A
-for an MA(1) process:
Xt = β εt-1 + εt
-with εt ~ N(0,σε²)
-equivalently:
Xt = (1 + βB) εt
=> 
Xt = εt + βXt-1 - β²Xt-2 + β³Xt-3 - …
-if |β|>1 the distant past has more influence than the recent past
-so for invertibility we need |β|<1
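The recursion behind this expansion, εt = Xt - β εt-1, can be checked numerically: with |β| < 1 an arbitrary (wrong) guess for the initial innovation is damped by a factor β at every step, so the recovered εt converge to the true ones. A toy sketch (not course code):

```python
def invert_ma1(x, beta, eps0=0.0):
    """Recover the innovations of an MA(1) X_t = eps_t + beta * eps_{t-1}
    via eps_t = x_t - beta * eps_{t-1}. For |beta| < 1 an error in the
    starting value eps0 is multiplied by -beta each step, so it dies out;
    for |beta| > 1 it would blow up (non-invertible)."""
    eps = eps0
    out = []
    for v in x:
        eps = v - beta * eps
        out.append(eps)
    return out
```

Starting the recursion from a deliberately wrong eps0 still recovers the later innovations almost exactly when |β| < 1.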
23
Q

Invertibility for MA(q) Processes

A

-an MA(q) process is invertible if all roots of β(y) are outside the unit circle
-where:
β(y) = 1 + β1y + β2y² + … + βqy^q