Module 5: Time series Flashcards

1
Q

Weak stationarity requires

A

Constant mean.
Cov(Xt, Xt-h) depends only on the lag h (i.e. the time difference is all that matters).

Neither depends on the time t.

2
Q

White noise Zt has

A

Zero mean, constant variance.

All observations are uncorrelated.

3
Q

Important result for

Cov(aX+bY,cZ)

A

= Cov(aX, cZ) + Cov(bY, cZ)

= ac Cov(X,Z) + bc Cov(Y,Z)
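A quick numeric sketch of this result (my own illustration in Python/NumPy, not from the cards; the constants a, b, c and the simulated variables are arbitrary). Sample covariance is bilinear in the same way, so both sides agree on simulated data:

import numpy as np

rng = np.random.default_rng(0)
X, Y, Z = rng.normal(size=(3, 100_000))
Z = Z + 0.5 * X                                   # give Z some correlation with X
a, b, c = 2.0, -1.5, 3.0
lhs = np.cov(a * X + b * Y, c * Z)[0, 1]          # Cov(aX + bY, cZ)
rhs = a * c * np.cov(X, Z)[0, 1] + b * c * np.cov(Y, Z)[0, 1]
print(lhs, rhs)                                   # the two numbers match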

4
Q

What does the backshift operator B do

A

Takes Zt back one step: B Zt = Zt-1.
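A tiny illustration (my own sketch, plain NumPy): on a finite sample, applying B is just shifting the index by one.

import numpy as np

Z = np.arange(1, 7)          # Z_1, ..., Z_6
print(Z[1:])                 # Z_t           for t = 2, ..., 6
print(Z[:-1])                # (B Z)_t = Z_{t-1}, aligned with the values above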

5
Q

AR(p) is

A

Autoregressive model of order p:
Xt = φ1 Xt-1 + φ2 Xt-2 + … + φp Xt-p + Zt
where the φ are constants.
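A minimal simulation sketch (my own example, plain NumPy; the coefficients 0.5 and -0.3 are illustrative) of an AR(2):

import numpy as np

rng = np.random.default_rng(1)
phi = [0.5, -0.3]                      # phi_1, phi_2 (chosen so the process is stationary)
n, p = 500, len(phi)
Z = rng.normal(size=n)                 # white noise
X = np.zeros(n)
for t in range(p, n):
    X[t] = sum(phi[j] * X[t - 1 - j] for j in range(p)) + Z[t]   # X_t = phi_1 X_{t-1} + phi_2 X_{t-2} + Z_t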

6
Q

MA(q) is

A

Moving average model of order q:

Xt = Zt + θ1 Zt-1 + … + θq Zt-q

where the θ are constants.
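A matching simulation sketch for an MA(2) (again my own illustration; the theta values are arbitrary):

import numpy as np

rng = np.random.default_rng(2)
theta = [0.6, 0.2]                     # theta_1, theta_2
n, q = 500, len(theta)
Z = rng.normal(size=n)                 # white noise
X = Z.copy()                           # start from X_t = Z_t
for t in range(q, n):
    X[t] += sum(theta[j] * Z[t - 1 - j] for j in range(q))       # + theta_1 Z_{t-1} + theta_2 Z_{t-2}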

7
Q

ARMA(p,q) in words is

A

A model in which Xt is influenced both by its own past values (the AR part) and by current and past noise terms (the MA part).

8
Q

Auto-covariance function γ(τ)

A

Cov( X(t+τ) , X(t) )

9
Q

Auto Correlation Function ρ(τ)

A

γ(τ) / γ(0) = Corr( X(t+τ) , X(t) )
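A sketch of the usual sample versions of γ and ρ (my own plain-NumPy implementation of the standard estimators, dividing by n):

import numpy as np

def sample_acf(x, tau):
    x = np.asarray(x, dtype=float)
    n, xbar = len(x), x.mean()
    gamma = lambda h: np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n   # gamma_hat(h)
    return gamma(tau) / gamma(0)                                        # rho_hat(tau) = gamma_hat(tau) / gamma_hat(0)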

10
Q

∇^j Xt (j is a superscript): difference operator

A

(1-B)^j * Xt

11
Q

∇_d Xt (d is a subscript): lag-d difference operator

A

(1-B^d)* Xt
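A short sketch (my own example, plain NumPy) contrasting the two operators on a toy series:

import numpy as np

x = np.array([3., 5., 9., 4., 7., 12., 6., 9.])
first_diff  = np.diff(x)            # (1 - B) x_t   = x_t - x_{t-1}
second_diff = np.diff(x, n=2)       # (1 - B)^2 x_t
d = 4
lag_d_diff  = x[d:] - x[:-d]        # (1 - B^d) x_t = x_t - x_{t-d}  (e.g. seasonal differencing)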

12
Q

White noise is

A

A sequence of uncorrelated random variables, each with mean 0 and variance σ^2.

13
Q

IID noise

A

A sequence of independent and identically distributed random variables, each with mean 0 and variance σ^2.

14
Q

IID is/is not white noise

A

Yes, it is: IID noise is a subset of white noise, since independence implies uncorrelatedness (but not the other way round).

15
Q

What is a linear process

A

Xt can be represented as

Xt = ∑j νj Z(t-j)

where the νj are constants with ∑ |νj| < ∞ (absolutely summable).
So Xt is a linear combination of white-noise terms.

16
Q

Compact form of Linear process

A

Xt = ν(B) Zt

where ν(B) = ∑j νj B^j, i.e. the power series in the backshift operator B that shifts each Zt back by the right lag.
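A concrete case (my own sketch, plain NumPy): a causal AR(1) with |φ| < 1 is a linear process with νj = φ^j, truncated here at J terms since φ^J is negligible:

import numpy as np

rng = np.random.default_rng(3)
phi, n, J = 0.7, 1000, 50
Z = rng.normal(size=n + J)             # white noise
nu = phi ** np.arange(J + 1)           # nu_j = phi^j, absolutely summable because |phi| < 1
X = np.convolve(Z, nu, mode="valid")   # X_t = sum_{j=0..J} nu_j * Z_{t-j}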

17
Q

An ARMA is causal if

A

Xt is expressible in terms of the current and past Zt.

Equivalently, all roots of the AR polynomial φ(z) = 0 lie outside the unit circle.

18
Q

ARMA is invertible if

A

Zt is expressible in terms of the current and past Xt.

Equivalently, all roots of the MA polynomial θ(z) = 0 lie outside the unit circle.
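A sketch of how both root conditions can be checked numerically (my own illustration; the polynomial coefficients are made up). np.roots takes coefficients from the highest power of z down:

import numpy as np

phi_poly   = [0.3, -0.5, 1.0]    # phi(z)   = 1 - 0.5 z + 0.3 z^2
theta_poly = [0.4, 1.0]          # theta(z) = 1 + 0.4 z
causal     = np.all(np.abs(np.roots(phi_poly)) > 1)     # all roots of phi(z) = 0 outside the unit circle
invertible = np.all(np.abs(np.roots(theta_poly)) > 1)   # all roots of theta(z) = 0 outside the unit circle
print(causal, invertible)        # True, True for these coefficients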

19
Q

ACF of AR(p)

A

Tails off (decays gradually rather than cutting off).

20
Q

ACF of MA(q)

A

Cuts off after lag q (is zero for lags greater than q).

21
Q

2 methods of estimating parameters

A

Maximum likelihood estimation (MLE) and the method of moments (MOM).
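As one concrete MOM example (my own sketch, plain NumPy): the Yule-Walker equations match the model's autocorrelations to the sample ACF; for an AR(2) this gives a 2x2 linear system for (φ1, φ2):

import numpy as np

def sample_rho(x, h):
    x = np.asarray(x, float); n, m = len(x), x.mean()
    g = lambda k: np.sum((x[k:] - m) * (x[:n - k] - m)) / n
    return g(h) / g(0)

def yule_walker_ar2(x):
    r1, r2 = sample_rho(x, 1), sample_rho(x, 2)
    R = np.array([[1.0, r1], [r1, 1.0]])                # autocorrelation matrix
    return np.linalg.solve(R, np.array([r1, r2]))       # (phi1_hat, phi2_hat)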

22
Q

Easiest model selection method

A

Plot the sample ACF (SACF) and sample partial ACF (SPACF) and match their cut-off / tail-off patterns to a candidate model.
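A sketch of the plotting step (assuming the statsmodels library is available; the series here is a stand-in):

import numpy as np
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(4)
x = rng.normal(size=300).cumsum()     # replace with the series being modelled
plot_acf(np.diff(x), lags=20)         # SACF: a cut-off after lag q suggests MA(q)
plot_pacf(np.diff(x), lags=20)        # SPACF: a cut-off after lag p suggests AR(p)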

23
Q

What is the portmanteau statistic

A

A function of the SACF of the residuals; under a correctly specified model it is approximately chi-squared distributed, so it is used to test whether the residuals behave like white noise.
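A sketch of the most common portmanteau test, the Ljung-Box statistic (assuming statsmodels; the residuals here are simulated stand-ins). Small p-values reject the hypothesis that the residuals are white noise:

import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(5)
residuals = rng.normal(size=500)                  # stand-in for fitted-model residuals
print(acorr_ljungbox(residuals, lags=[10, 20]))   # Q statistics with chi-squared p-values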

24
Q

What is the Box-Jenkins method of forecasting

A

An iterative (recursive) method of producing forecasts one step ahead at a time.
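A sketch of the forecasting step (assuming statsmodels; the series and the (1, 1, 1) order are purely illustrative). The fitted model computes h-step-ahead forecasts recursively, one step at a time:

import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
y = rng.normal(size=300).cumsum()          # stand-in series; replace with real data
fit = ARIMA(y, order=(1, 1, 1)).fit()      # estimate the chosen ARIMA(p, d, q) model
print(fit.forecast(steps=5))               # 5-step-ahead forecasts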