Time Series Analysis Flashcards

1
Q

Define ACVF.

A

The autocovariance function (ACVF) of {X_t}_t at lag h is defined as:
gamma_X(h)=Cov(X_t,X_{t+h}).

2
Q

Define ACF.

A

The autocorrelation function (ACF) of {X_t}_t at lag h is defined as:
rho_X(h)=gamma_X(h)/gamma_X(0), where gamma_X(h)=Cov(X_t,X_{t+h}) is the ACVF of {X_t}_t.

3
Q

Is X_t=a+bZ_t+Z_{t-2} stationary?
Calculate the mean and the ACVF of X_t=a+b*Z_t+Z_{t-2}, where {Z_t}_t is an iid N(0,sig^2) sequence and a, b in R.
Is {X_t}_t strictly stationary?

A

S1E1a

4
Q

Define stationary.

A

A TS {X_t}_t is (weakly) stationary if:

i) E[X_t^2] is finite,
ii) mu_X(t) does not depend on t,
iii) gamma_X(t,t+h) does not depend on t (we then simply write gamma_X(h)).

5
Q

Define strictly stationary.

A

A TS {X_t}_t is strictly stationary if for all n in N and h in Z:
(X_1,…,X_n) =^d (X_{1+h},…,X_{n+h}).

6
Q

Is X_t=Z_tZ_{t-1} stationary?
Calculate the mean and the ACVF of X_t=Z_t*Z_{t-1}, where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?

A

S1E1d

7
Q

Is X_t=Z_t if t leq t0 and X_t=Z_t+Z_{t-1} otherwise stationary, where {Z_t}_t is an iid N(0,sig^2) sequence and t0 is some integer?
Calculate the mean and the ACVF of {X_t}_t.
Is {X_t}_t strictly stationary?

A

S1E1e

8
Q

Define an MA(q) process.

A

{X_t}_t ~ MA(q) if:
X_t=Z_t+theta_1*Z_{t-1}+…+theta_q*Z_{t-q} for {Z_t}_t~WN(0,sig^2) and theta_1,…,theta_q in R.

BTW: MA(q)=moving average process of order q geq 1.

9
Q

Define WN.

Define i.i.d. noise.

A

{X_t}_t is white noise if it is a sequence of uncorrelated, centered r.v.s with finite variance, i.e. E[X_t]=0, Var(X_t)=sig^2 and Cov(X_r,X_s)=0 (r!=s). We write {X_t}_t~WN(0,sig^2).

{X_t}_t is i.i.d. noise if it is a sequence of independent and identically distributed r.v.s s.t. E[X_t]=0 for all t.

10
Q

Let {X_t}_t be the MA(2) process: X_t=Z_t+theta*Z_{t-2}, where {Z_t}_t is WN(0,sig^2).

Compute the mean, the ACVF and ACF for this time series.

A

S1E2a

11
Q

Let {X_t}_t be the MA(2) process: X_t=Z_t+theta*Z_{t-2}, where {Z_t}_t is WN(0,sig^2).

Compute the variance of the sample mean (X_1+X_2+X_3+X_4)/4 for theta=0.8.
Do the same for theta=-0.8. Note the difference.

A

S1E2b

S1E2c

12
Q

Of which process is k(h):=1_{h=0}, h in Z, the ACF?

A

{Z_t}_t~WN(0,sig^2)

13
Q

What is the ACVF of an MA(1) process?

A

gamma_X(h)=sig^2*(1+theta^2)*1_{h=0} + sig^2*theta*1_{|h|=1}

14
Q

What is the ACF of an MA(1) process?

A

rho(h)=1_{h=0}+theta/(1+theta^2)*1_{|h|=1}
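
A quick numerical sanity check of the MA(1) ACVF/ACF formulas above (a minimal sketch, not from the lecture; theta, sig and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sig, n = 0.6, 1.5, 200_000

Z = rng.normal(0.0, sig, n + 1)
X = Z[1:] + theta * Z[:-1]          # MA(1): X_t = Z_t + theta * Z_{t-1}

def sample_acvf(x, h):
    xc = x - x.mean()
    return np.sum(xc[h:] * xc[:len(x) - h]) / len(x)   # 1/n convention as in this deck

print(sample_acvf(X, 0), sig**2 * (1 + theta**2))                     # gamma(0)
print(sample_acvf(X, 1), sig**2 * theta)                              # gamma(1)
print(sample_acvf(X, 1) / sample_acvf(X, 0), theta / (1 + theta**2))  # rho(1)
```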

15
Q

Define linear process.

A

A TS {X_t}_t is said to be a linear process if it has the following representation:
X_t=Sum[ psi_j*Z_{t-j} ; j=-inf,…,inf ], where {Z_t}_t~WN(0,sig^2) and {psi_j}_j is an absolutely summable real sequence, i.e. Sum[ |psi_j| ; j=-inf,…,inf ] is finite.

BTW: the last condition ensures that X_t is finite a.s.

16
Q

Define an AR(p) process.

A

{X_t}_t~AR(p) iff {X_t}_t is stationary and X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t with {Z_t}_t~WN(0,sig^2) and phi_1,…,phi_p in R.

17
Q

Define an ARMA(p,q) process.

A

A time series {X_t}_t is an ARMA(p,q) process if it is stationary and satisfies the equations:
phi(B)X_t=X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t+theta_1*Z_{t-1}+…+theta_q*Z_{t-q}=theta(B)Z_t, where {Z_t}_t~WN(0,sig^2), phi_p!=0!=theta_q, and the polynomials (w. real coeffs) phi and theta have no common factors/roots.

18
Q

Define causality for linear processes and ARMA(p,q) models

A

A linear process {X_t}_t is called causal if psi_j=0 for all j less than 0, i.e. X_t=Sum[ psi_j*Z_{t-j} ; j=0,…,inf ] depends only on present and past noise terms.
(This yields the same definition as for ARMA(p,q) models.)

{X_t}_t~ARMA(p,q) is causal if there is an absolutely summable sequence {psi_j}_j s.t.
X_t=Sum[ psi_j*Z_{t-j} ; j=0,…,inf ] for all t.

BTW: Was defined after AR(1), MA(q) and linear processes but before ARMA(p,q) processes.

19
Q

Define invertibility of ARMA(p,q) models.

A
An ARMA(p,q) process {X_t}_t is invertible if there exists an absolutely summable sequence {pi_j}_j s.t.
Z_t=Sum[ pi_j*X_{t-j} ; j=0,...,inf ] for all t in Z.

An ARMA(1,1) model is said to be invertible if Z_t can be expressed with current and past values of {X_t}_t

20
Q

Characterize causality for {X_t}_t~ARMA(p,q).

A

Causality is equivalent to the condition:

phi(z)=1-phi_1*z-…-phi_p*z^p != 0 for all |z| leq 1.

21
Q

Characterize invertibility for {X_t}_t~ARMA(p,q).

A

Invertibility is equivalent to the condition that:

theta(z)=1+theta_1*z+…+theta_q*z^q != 0 for all |z| leq 1.

22
Q

Is X_t=Z_1cos(ct)+Z_2sin(ct) stationary?
Calculate the mean and the ACVF of X_t=Z_1cos(ct)+Z_2sin(ct), where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?

A

S1E1b

23
Q

State the trig identities cos(x-y) and sin(x-y)

A

cos(a+/-b)=cos(a)cos(b)-/+sin(a)sin(b)

sin(a+/-b)=sin(a)cos(b)+/-cos(a)sin(b)

24
Q

Is X_t=Z_tcos(ct)+Z_{t-1}sin(ct) stationary?
Calculate the mean and the ACVF of X_t=Z_tcos(ct)+Z_{t-1}sin(ct), where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?

A

S1E1c

25
Q

Show that k(h):=(-1)^{|h|} for h in Z is the ACF of a stationary time series.

A

S2E1a

X_t=(-1)^t*X, where X has mean 0 and variance 1.

26
Q

Show that k(h):=1+cos(pi*h/2)+cos(pi*h/4) for h in Z is the ACF of a stationary time series.

A

S2E1b

27
Q

Show that k(h):=1_{h=0}+0.4*1_{|h|=1} for h in Z is the ACF of a stationary time series.

A

S2E1c

28
Q

(State a necessary condition if there is one)
What is the ACVF of an AR(1) process?
What is the ACF of an AR(1) process?

A

If |phi| less than 1 (the causal case; for |phi| greater than 1 the stationary solution is non-causal and these formulas do not apply), then:
gamma_X(h)=sig^2/(1-phi^2)*phi^{|h|} for h in Z
rho_X(h)=phi^{|h|} for h in Z
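
In the causal case both formulas follow in one line from the MA(inf) representation of the AR(1) (see the AR(1) card further down); a sketch of the computation in LaTeX notation, writing sigma for sig:

```latex
\gamma_X(h)
  = \operatorname{Cov}\Big(\sum_{j\ge 0}\phi^{j}Z_{t+|h|-j},\ \sum_{k\ge 0}\phi^{k}Z_{t-k}\Big)
  = \sigma^{2}\sum_{k\ge 0}\phi^{k+|h|}\phi^{k}
  = \frac{\sigma^{2}\,\phi^{|h|}}{1-\phi^{2}},
\qquad
\rho_X(h)=\frac{\gamma_X(h)}{\gamma_X(0)}=\phi^{|h|}.
```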

29
Q

Define q-correlated.

A

A time series {X_t}_t is said to be q-correlated if it is stationary and gamma_X(h)=0 for all h s.t. |h| is larger than q.

30
Q

Let {X_t}_t be a causal AR(1) process satisfying X_t=phi*X_{t-1}+Z_t for t in Z with phi in (-1,1) and {Z_t}_t~WN(0,sig_z^2). Consider {Y_t}_t defined by the equations Y_t=X_t+W_t, where {W_t}_t~WN(0,sig_w^2) s.t. Cov(Z_t,W_s)=E[Z_t*W_s]=0 for all t, s in Z.

Show that the time series is stationary and find its ACVF

A

S2E2a

31
Q

Define sample ACVF.

A

Let X_1,…,X_n be n observations of a time series.
The sample ACVF is:
gamma^(h)=1/n*Sum[ (X_{t+|h|}-bar(X))*(X_t-bar(X)) ; t=1,…,n-|h| ].

32
Q

Let {X_t}_t be a causal AR(1) process satisfying X_t=phi*X_{t-1}+Z_t for t in Z with phi in (-1,1) and {Z_t}_t~WN(0,sig_z^2). Consider {Y_t}_t defined by the equations Y_t=X_t+W_t, where {W_t}_t~WN(0,sig_w^2) s.t. Cov(Z_t,W_s)=E[Z_t*W_s]=0 for all t, s in Z.

Show that {U_t}_t defined as U_t=Y_t-phi*Y_{t-1} is 1-correlated and deduce that it is an MA(1).

Hint: gamma_Y(h)=(sig_W)^2*1_{h=0}+(sig_Z)^2*phi^{|h|}/(1-phi^2)

A

S2E2b

33
Q

Define sample ACF.

A

Let X_1,…,X_n be n observations of a time series.
The sample ACF is:
rho^(h)=gamma^(h)/gamma^(0), where gamma^ is the sample ACVF.

34
Q

State the density of multivariate Gaussian distributions.

A

f(x)=(2pi)^{-k/2}*det(Sig)^{-1/2}*exp( -1/2*(x-mu)^T*Sig^{-1}*(x-mu) )

35
Q

Define a causal AR(1) with mean mu.

A

A TS {X_t}_t is a causal AR(1) with mean mu if X_t satisfies:
X_t - mu = phi*(X_{t-1} - mu) + Z_t for t in Z, phi in (-1,1) and {Z_t}_t~WN(0,sig^2).

36
Q

State the asymptotic result relating to ARMA and its centered sum.

A

If {X_t}_t is an ARMA time series with mean mu, it can be shown that sqrt(n)*(bar(X)_n - mu) converges in distribution to N(0, Sum[ gamma_X(h) ; h=-inf,…,inf ]).

37
Q

Suppose X_n converges in distribution to N(mu,sig^2); how does one get a confidence interval for mu?

A

It follows that:
(X_n-mu)/sig converges in distribution to N(0,1).
{
It follows that:
lim_n P( (X_n-mu)/sig leq t )=Phi(t)
It follows that:
lim_n P( t0 leq (X_n-mu)/sig leq t1 )=Phi(t1)-Phi(t0)
}
It follows that:
lim_n P( -Phi^{-1}(1-alpha/2) leq (X_n-mu)/sig leq Phi^{-1}(1-alpha/2) ) = 1-alpha,
or equivalently
lim_n P( X_n - sig*Phi^{-1}(1-alpha/2) leq mu leq X_n + sig*Phi^{-1}(1-alpha/2) ) = 1-alpha.
From which it follows that:
The probability of mu being in:
[X_n - sig*Phi^{-1}(1-alpha/2), X_n + sig*Phi^{-1}(1-alpha/2)]
is approximately 1-alpha for large n.

If necessary: Check with: S2E3 once next ex is complete.

38
Q

Let {X_t}_t be a causal AR(1) process with mean mu. Based on n=100 observations from such a model with phi=0.6, sig^2=2 and unknown mu, we compute bar(X)_n=0.271. Construct an asymptotic 95% confidence interval for mu and indicate the result you are using.
Do you think that the observed data is compatible with the hypothesis that mu=0?

A

S2E3
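
A sketch of one way to produce the interval (not the official S2E3 solution): combine the ARMA CLT card above, sqrt(n)*(bar(X)_n - mu) -> N(0, Sum_h gamma_X(h)), with the causal AR(1) ACVF, whose sum has the closed form sig^2/(1-phi)^2:

```python
import math
from statistics import NormalDist

n, phi, sig2, xbar = 100, 0.6, 2.0, 0.271

v = sig2 / (1 - phi) ** 2                # sum_h gamma_X(h) for a causal AR(1)
z = NormalDist().inv_cdf(0.975)          # 95% two-sided quantile
half = z * math.sqrt(v / n)
print((xbar - half, xbar + half))        # contains 0 => compatible with mu = 0
```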

39
Q

State Bartlett’s formula and the context of its use (state the asymptotic).

A

S3E1a

40
Q

Apply Bartlett’s formula to WN(0,sig^2).

Also use it to calculate W_{ii} of an MA(1) process {X_t}_t with parameter rho=rho_X(1).

A

S3E1b

41
Q

Define PACF of an ARMA process.

Sample PACF?

A

Let {X_t}_t be an ARMA process. Its PACF is the function:
alpha: h in {0,1,2,…} to alpha(h), where
alpha(0)=1 and alpha(h)=phi_{hh} else,
where phi_{hh} is the last component of the vector phi_h=Gamma_h^{-1}*gamma_h with Gamma_h=[gamma_X(i-j); i,j=1,…,h] and gamma_h=(gamma_X(1),…,gamma_X(h))^T.
NOTE: Gamma_h is assumed to be invertible.
For the sample PACF, replace the underlying gamma_X by the sample ACVF gamma^.
Source: S3E2a (or L13S10)

42
Q

Let {X_t}_t~MA(1).

Show that alpha(2)=-theta^2/(theta^4+theta^2+1), where alpha=PACF.

A

S3E2b

43
Q

Let {X_t}_t~MA(1), i.e. X_t=Z_t+theta*Z_{t-1}.
For which regions of theta does the innovations algorithm converge?
|theta| smaller than 1
|theta| = 0
|theta| larger than 1

A

|theta| smaller than 1: ??? check the lecture notes and update
|theta| = 0: think or check the lecture notes ???
|theta| larger than 1: No, according to S3E3c

44
Q

Define spectral density.

When is some function a spectral density of a stationary time series?

A

Let {X_t}_t be some 0-mean stationary time series with ACVF gamma=gamma_X satisfying:
Sum[ |gamma(h)| ; h=-inf,…,inf ] is finite.
The spectral density of {X_t}_t is the function:
f(lambda)=1/(2pi)*Sum[ exp(-ihlambda)*gamma(h) ; h=-inf,…,inf ].

NOTE:

  • We can restrict the domain to [-pi,pi] since f is 2pi-periodic
  • Is well defined since Sum[ |gamma(h)| ; h in Z] is finite

A function f is THE spectral density of some stationary time series {X_t}_t with ACVF gamma_X=gamma if:

i) f(lambda) geq 0 for all lambda in [-pi,pi]
ii) gamma(h) = Int[ exp(ihlambda) * f(lambda) ; lambda in [-pi,pi] ]

45
Q

Define spectral distribution function.

A

The generalized distribution function F (i.e. F(lambda)/F(pi) is a distribution function) s.t.
gamma(h)=Int[ exp(ihlambda) ; dF(lambda) for lambda in [-pi,pi] ].
(Riemann-Stieltjes integral:
Int[ f ; dg(x) for x in [a,b] ]=lim_n Sum[ f(x_i)*( g(x_{i+1}) - g(x_i) ) ; for i=1,..,n ] )

46
Q

How does one test if a sample of observations {Y_1,…,Y_n} is sampled from i.i.d. noise?
(ACF version)

A

Assuming they are, then for diverging n:
sqrt(n)*(rho^(1),…,rho^(h))^T converges to N(0,Id_h) for any fixed h in N.

N_h:=Sum[ 1_{|rho^(i)| geq z_{1-alpha/2}/sqrt(n)} ; i=1,…,h ],
where z_{1-alpha/2}=Phi^{-1}(1-alpha/2)=(1-alpha/2)-quantile of N(0,1) with Phi(x)=1/sqrt(2pi)*Int[ exp(-t^2/2) ; -inf to x ],
behaves approx. like Bin(h,alpha) in the limit.

P( |rho^(1)| geq z_{1-alpha/2}/sqrt(n) ) =
P( sqrt(n)*|rho^(1)| geq z_{1-alpha/2} ) converges to P(|Z| geq z_{1-alpha/2})=alpha, where Z is a N(0,1)-variable.
Hence, E[N_h] approx. = h*alpha and Var(N_h) approx. = h*alpha*(1-alpha).
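
A minimal sketch of this check in Python (numpy only; the simulated data and the choices h=20, alpha=0.05 are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
Y = rng.normal(size=500)          # stand-in for the observations
n, h, z = len(Y), 20, 1.96        # z_{0.975} for alpha = 0.05

Yc = Y - Y.mean()
acvf = np.array([np.sum(Yc[k:] * Yc[:n - k]) / n for k in range(h + 1)])
rho = acvf[1:] / acvf[0]

N_h = np.sum(np.abs(rho) > z / np.sqrt(n))
print(N_h, "exceedances; under H0 roughly Bin(h, alpha) with mean", h * 0.05)
```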

47
Q

Define the lag-1 difference operator.

A

nabla=1-B, i.e. nabla*X_t=X_t-X_{t-1}.

48
Q

How does one test if a sample of observations {Y_1,…,Y_n} is sampled from i.i.d. noise?
(Portmanteau test)

A

Q:=n*Sum[ rho^(j)^2 ; j=1,…,h ]
Under H0,
Q converges in distribution to chi^2(h).
Let x=chi^2_{1-alpha}(h)=(1-alpha)-quantile of chi^2(h); then we reject the null hypothesis if Q is larger than x.
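
A sketch of the Portmanteau statistic on simulated noise (the critical value 31.410 is the chi^2(20) 0.95-quantile; all choices illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
Y = rng.normal(size=500)          # stand-in for the observations
n, h = len(Y), 20

Yc = Y - Y.mean()
acvf = np.array([np.sum(Yc[k:] * Yc[:n - k]) / n for k in range(h + 1)])
rho = acvf[1:] / acvf[0]

Q = n * np.sum(rho ** 2)          # Portmanteau statistic
print(Q, "reject H0" if Q > 31.410 else "no evidence against iid")
```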

49
Q

How does one test if a sample of observations {Y_1,…,Y_n} is sampled from i.i.d. noise?
(Difference sign test)

A

This test is based on the total count S of indices i=2,…,n s.t. Y_i greater than Y_{i-1}:
S=Sum[ 1_{Y_i greater than Y_{i-1}} ; i=2,…,n ]
As n diverges, it can be shown that:
(S-(n-1)/2)/sqrt( (n+1)/12 ) converges in distribution to N(0,1).
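
The same test in a few lines of Python (sketch on simulated iid data):

```python
import numpy as np

rng = np.random.default_rng(3)
Y = rng.normal(size=500)          # stand-in for the observations
n = len(Y)

S = np.sum(Y[1:] > Y[:-1])                         # count of increases
T = (S - (n - 1) / 2) / np.sqrt((n + 1) / 12)      # approx. N(0,1) under H0
print(S, T)   # |T| > 1.96 would suggest a trend at the 5% level
```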

50
Q

How does one test if a sample of observations {Y_1,…,Y_n} is sampled from i.i.d. Gaussian noise?
(normality)

A

H0: Y_1,…,Y_n ~ iid N(0,sig^2).
Let Y_{(1)},…,Y_{(n)} be the order statistics of Y_1,…,Y_n.
If X_{(1)},…,X_{(n)} are the order statistics of X_1,…,X_n ~ iid N(0,1), then:
E[Y_{(i)}]=sig*E[X_{(i)}]=:sig*m_i for i in {1,…,n}.
Under H0, the graph of the points (m_1,Y_{(1)}),…,(m_n,Y_{(n)}) should look approx. linear (slope=sig).
For i=1,…,n, m_i approx.= Phi^{-1}((i-0.5)/n)=:z_i.
Define the squared correlation:
R^2 = ( Sum[ (Y_{(i)}-bar(Y))*z_i ] )^2 / ( Sum[ (Y_{(i)}-bar(Y))^2 ]*Sum[ z_i^2 ] )
Under H0, R^2 should be close to 1.
We reject H0 at level alpha if R^2 leq r_{n,alpha}^2, the alpha-quantile of R^2 under H0 (R^2=R^2(n)).
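
A sketch of the R^2 computation (note it pairs the order statistics Y_{(i)} with the z_i; the simulated data is illustrative):

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(4)
Y = np.sort(rng.normal(size=200))   # order statistics Y_(1) <= ... <= Y_(n)
n = len(Y)

z = np.array([NormalDist().inv_cdf((i - 0.5) / n) for i in range(1, n + 1)])
num = np.sum((Y - Y.mean()) * z) ** 2
den = np.sum((Y - Y.mean()) ** 2) * np.sum(z ** 2)
print(num / den)   # close to 1 under H0; the cutoff r^2_{n,alpha} is tabulated
```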

51
Q

What is the best linear prediction of X_{n+h} given X_n in an L^2-sense i.e. a^ and b^ s.t.
E[ ( X_{n+h}-a^ * X_n - b^ )^2 ]=min_{a,b} E[ ( X_{n+h}-a * X_n - b )^2 ] ?
Notation: P(X_{n+h}|X_n)

When is E[X_{n+h}|X_n]=P(X_{n+h}|X_n)?

A

a^ = rho_X(h), b^ = mu*(1-rho_X(h))
P(X_{n+h} | X_n) = mu + rho_X(h)*(X_n - mu)

If {X_t}_t is a Gaussian time series, then E[X_{n+h}|X_n]=P(X_{n+h}|X_n).

52
Q

State the properties of gamma_X(h).

A
  • gamma_X(0) geq 0
  • |gamma_X(h)| leq gamma_X(0)
  • gamma_X(h)=gamma_X(-h) (even)
53
Q

Define non negative definiteness of a real-valued function K on Z.

A

If for all n in N and a=(a_1,…,a_n)^T in R^n:
Sum[ a_i*a_j*K(i-j) ; i,j=1,…,n ] geq 0

BTW: In the exercises they work with the matrix vector formulation anyway

54
Q

State the characterisation of ACVFs.

A

A real-valued function K on Z is the ACVF of some stationary time series if and only if:

  • K is even
  • K is non-negative definite
55
Q

If Z~N(mu,Sig), C in R^{m x n}, d in R^m,

then what is the distribution of C*Z+d ?

A

C*Z+d ~ N(C*mu+d, C*Sig*C^T)

BTW: C*Sig*C^T is pos. def. iff C has full rank m.

56
Q

Is K(h)=1_{h=0}+beta*1_{|h|=1} an ACVF of a stationary TS?

A

Iff |beta| leq 1/2

57
Q

Is K(h)=cos(c*h) an ACVF of a stationary TS?

A

Yes: it is the ACVF of X_t=A*cos(ct)+B*sin(ct) with E[A]=E[B]=0, E[A^2]=E[B^2]=1, Cov(A,B)=0.

  • E[X_t]=0
  • Cov(X_{t+h},X_t)=gamma_X(h)=K(h) (finite)
58
Q

State the properties of strictly stationary time series {X_t}_t.

A

a) X_t are identically distributed
b) (X_t,X_{t+h}) =^d (X_1,X_{1+h}) for all t (and fixed h)
c) {X_t}_t is weakly stationary provided that E[X_t^2] is finite
d) Weak stationarity does NOT imply strict stationarity
e) An iid sequence is strictly stationary

59
Q

How does one construct a strictly stationary time series from some iid sequence {Z_t}_t?

A

Take some measurable function g on R^{q+1}, q geq 1, and define
X_t=g(Z_t,…,Z_{t-q}).
Then:
(X_{1+h},…,X_{n+h}) = (g(Z_{1+h},…,Z_{1+h-q}),…,g(Z_{n+h},…,Z_{n+h-q})) =^d (g(Z_1,…,Z_{1-q}),…,g(Z_n,…,Z_{n-q})) = (X_1,…,X_n)

60
Q

Define q-dependent

A

A TS {X_t}_t is q-dependent if:

|s-t| greater than q for s,t in Z implies X_t and X_s are independent.

61
Q
  • q-dependence implies q-correlated, provided:______.
  • An iid sequence is_______
  • A WN(0,sig^2) sequence is________
  • An MA(q) is ________.
  • (X_t)_t q-correlated implies_________.
A
  • E[X_t^2] is finite
  • 0-dependent
  • 0-correlated
  • q-correlated
  • that (X_t)_t can be represented as an MA(q) (given mean 0)
62
Q

State the theorem, of how one stationary time series {Y_t}_t gives rise to another one {X_t}_t and state the expectation and ACVF of {X_t}_t

A

If {Y_t}_t is a stationary time series with E[Y_t]=0 and ACVF=gamma_Y, and {psi_j}_{j in Z} is a real absolutely summable sequence, then
X_t=Sum[ psi_j * Y_{t-j} ; j=-inf,…,inf ] is a stationary time series with E[X_t]=0 and
gamma_X(h)=Sum[ Sum[ psi_j*psi_k*gamma_Y(h+k-j) ; k=-inf,…,inf ] ; j=-inf,…,inf ]

BTW: If {Y_t}_t~WN(0,sig^2) then gamma_X(h)=sig^2*Sum[ psi_j * psi_{j+h} ; j=-inf,…,inf ]

63
Q

State the interesting properties of AR(1) depending on phi (regarding its linear process representation).

A

If |phi| less than 1, then {X_t}_t is causal:
X_t=Sum[ phi^j * Z_{t-j} ; j=0,…,inf ],
so {X_t}_t~MA(inf).
If |phi| greater than 1, then {X_t}_t is non-causal, hence not ~MA(q) for any q.
If |phi|=1, then there exists no stationary solution.

BTW: In the |phi| greater than 1 case:
X_t=-Sum[ phi^{-j}*Z_{t+j} ; j=1,…,inf ]

64
Q

State the properties of an {X_t}_t~ARMA(1,1) depending on phi and theta concerning stationarity.

A
If |phi| less than 1, then:
There exists an (a.s.) unique stationary solution, which is causal:
( X_t=Z_t+(theta+phi)*Sum[ phi^{j-1}*Z_{t-j} ; j=1,...,inf ] )

If |phi|=1, then:
No stationary solution exists.

If |phi| greater than 1, then:
There exists an (a.s.) unique stationary solution, which is non-causal:
( X_t= -theta/phi*Z_t - (theta+phi)*Sum[ phi^{-j-1}*Z_{t+j} ; j=1,...,inf ] )

If |theta| less than 1, then:
The stationary solution is invertible:
( Z_t=X_t-(theta+phi)*Sum[ (-theta)^{j-1}*X_{t-j} ; j=1,…,inf ] )
If |theta| greater than 1, then:
The stationary solution is non-invertible, but Z_t can be expressed via future values:
( Z_t= -phi/theta*X_t+(theta+phi)*Sum[ (-theta)^{-j-1}*X_{t+j} ; j=1,…,inf ] )
If |theta|=1, then:
Sum[ |theta|^{j-1} ; j=1,…,inf ]=Sum[ |theta|^{-j-1} ; j=1,…,inf ]=inf, so there is no representation as in the other cases (i.e. the solution is non-invertible).

65
Q

E[(bar(X)_n-mu)^2]=??

A

E[(bar(X)_n-mu)^2]=Var(bar(X)_n)=1/n*Sum[ (1-|h|/n)*gamma_X(h) ; h=-n,…,n ]

BTW: equivalently,
Var(bar(X)_n)=1/n^2*Sum[ gamma_X(i-j) ; i,j=1,…,n ]

66
Q

Under which assumption:

lim_n n*Var(bar(X)_n)=Sum[ gamma_X(h) ; h=-inf,…,inf ]

A

Sum[ |gamma_X(h)| ; h=-inf,…,inf ] finite

BTW: the assumption above implies that Var(bar(X)_n) converges to 0 at rate 1/n.

67
Q

State the asymptotics of bar(X)_n if:

  • {X_t}_t is a stationary Gaussian time series
  • {X_t}_t is an ARMA time series
A
  • sqrt(n)*(bar(X)_n-mu) is exactly
    N(0, n*Var(bar(X)_n))-distributed, remembering that n*Var(bar(X)_n)=Sum[ (1-|h|/n)*gamma_X(h) ; h=-n,…,n ]
  • sqrt(n)*(bar(X)_n-mu) converges in distribution towards
    N(0, Sum[ gamma_X(h) ; h=-inf,…,inf ])

i.e. bar(X)_n is approx. normal

BTW: Gaussian time series is a time series whose joint distributions are always normal

68
Q

Consistent estimator of gamma_X(h)?

State the asymptotic (1-alpha)-confidence interval for mu.

A

v^_n:=Sum[ (1-|h|/sqrt(n))*gamma^(h) ; |h| leq sqrt(n) ], where gamma^ is the sample ACVF.

( bar(X)_n - z_{1-alpha/2}*sqrt(v^_n / n), bar(X)_n + z_{1-alpha/2}*sqrt(v^_n / n) )
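
Putting the two pieces together as a sketch (simulated stand-in data; weights and truncation follow the v^_n above):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=400)                  # stand-in for the observed series
n, z = len(X), 1.96                       # z_{0.975} for alpha = 0.05

Xc = X - X.mean()
acvf = lambda h: np.sum(Xc[h:] * Xc[:n - h]) / n
H = int(np.sqrt(n))
v_hat = sum((1 - abs(h) / np.sqrt(n)) * acvf(abs(h)) for h in range(-H, H + 1))

half = z * np.sqrt(v_hat / n)
print((X.mean() - half, X.mean() + half))
```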

69
Q

State the asymptotic result on the estimation of the ACF via the sample ACF.

A

Let h in N, rho_h=(rho(1),…,rho(h))^T and rho^_h=(rho^(1),…,rho^(h))^T.
Then as n diverges: sqrt(n)*(rho^_h-rho_h) converges in distribution to N(0,W) with W the hxh-covariance matrix defined by Bartlett's formula:
W_{ij}=Sum[ (rho(k+i)+rho(k-i)-2*rho(i)*rho(k))*(rho(k+j)+rho(k-j)-2*rho(j)*rho(k)) ; k=1,…,inf ]

70
Q

What is the best linear prediction of X_{n+h} given {X_n,…,X_1} in an L^2-sense, i.e. a^_0,…,a^_n s.t.
E[ ( X_{n+h} - a^_0 - a^_1*X_n -…- a^_n*X_1 )^2 ]=min_{a_0,…,a_n} E[ ( X_{n+h} - a_0 - a_1*X_n -…- a_n*X_1 )^2 ] ?
Notation: P(X_{n+h}|X_n,…,X_1)

A

(Performing standard optimization, one gets:)

a^_0=mu*(1-Sum[ a^_j ; j=1,…,n ])

gamma_X(h+j-1)=Sum[ a^_i*gamma_X(i-j) ; i=1,…,n ] for j=1,…,n, or equivalently
Gamma_n*a^_n=gamma_n(h), in which a^_n=(a^_1,…,a^_n)^T, Gamma_n=[ gamma_X(i-j) ; i,j=1,…,n ] an (nxn)-matrix and gamma_n(h)=(gamma_X(h),…,gamma_X(h+n-1))^T.

71
Q

State the properties of the best linear predictor: P(X_{n+h}|X_n,…,X_1)=:P_n

A
  • E[P_n]=mü
  • E[(X_{n+h} - P_n)^2]=gamma_X(0)-a^_n^T*gamma_n(h), in which a^_n=(a^_1,…,a^_n)^T and gamma_n(h)=(gamma_X(h),…,gamma_X(h+n-1))^T
  • P_n is a.s. unique (i.e. the definite equation only has one solution a.s.)
72
Q

State the properties of P(Y|W).

A
  • P(Y|W)=mu_Y+(a^)^T*(W - E[W]), where a^ solves Gamma*a^=gamma with gamma=Cov(Y,W), Gamma=Cov(W,W)
  • E[(Y-P(Y|W))^2]=Var(Y)-(a^)^T*gamma
  • Properties of the operator U to P(U|W) (given E[U^2] and E[W^2] are finite):
  • - E[(U-P(U|W))*W]=0
  • - E[P(U|W)]=E[U]
  • - E[(U-P(U|W))^2]=Var(U)-a^T*Cov(U,W)
  • - the operator is linear
  • - P(W_i|W)=W_i
  • - If Cov(U,W)=0, then P(U|W)=E[U]
  • - P( P(U|W,N) | W)=P(U|W) for any random vector N on Omega s.t. E[N*N^T] is finite

Notation:
P(Y|W) is the best linear prediction of Y given the r.v.s {W_1,…,W_n} s.t. Cov(W_i,W_j) and Cov(W_i,Y) exist.
mu_Y=E[Y] and mu_i=E[W_i].
W=(W_n,…,W_1)^T, mu_W=(mu_n,…,mu_1)^T, gamma=Cov(Y,W)=(Cov(Y,W_n),…,Cov(Y,W_1))^T and Gamma=Cov(W,W), i.e. Gamma_{ij}=Cov(W_{n+1-i},W_{n+1-j}).

73
Q

What is P(X_{n+1}|X_n,…,X_1) for {X_t}_t~AR(p)?

A

If {X_t}_t~AR(p), i.e. X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t, then for n geq p:
P(X_{n+1}|X_n,…,X_1)=phi_1*X_n+…+phi_p*X_{n+1-p}+P(Z_{n+1}|X_n,…,X_1)=phi_1*X_n+…+phi_p*X_{n+1-p}

74
Q

X_t+0.25X_{t-1}-3/8X_{t-2}=Z_t+0.5Z_{t-1}-0.5Z_{t-2} is an ARMA(2,2) model.

State the explicit solution

A

False, actually ARMA(1,1): (1+0.75B)X_t=(1+B)Z_t
slides 3&4 in lecture 11

X_t=Z_t+0.25Sum[ (-0.75)^{j-1}Z_{t-j} ; j=1,…,inf ]

75
Q

State the theorem concerning ARMA processes, uniqueness of their solutions and causality.

A

Let {X_t}_t be an ARMA(p,q) model (but not necessarily real coeffs?).
Then, the model admits a (a.s.) unique stationary solution {X_t}_t if and only if phi(z)!=0 for all z in C with |z|=1 (i.e. admits no roots on the unit circle).
Furthermore, {X_t}_t is causal, that is, admits the representation X_t=Sum[ psi_j * Z_{t-j} ; j=0,…,inf ] for some absolutely summable sequence {psi_j}_j,
if and only if phi(z)!=0 for all z in bar(B)_1(0) (i.e. phi has no roots in the closed unit disk).

76
Q

Let {X_t}_t be a causal ARMA(p,q) model

What is the unique solution?

A

X_t=psi(B)Z_t=Sum[ psi_j*Z_{t-j} ; j=0,…,inf ]

with psi_j=theta_j+Sum[ phi_k*psi_{j-k} ; k=1,…,min(p,j) ] for j geq 0, using the convention theta_0=1 and theta_j=0 for j greater than q.
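
The recursion is easy to code; a sketch with that convention (the ARMA(1,1) example checks against the closed form psi_j=(theta+phi)*phi^{j-1} from the ARMA(1,1) card):

```python
def psi_coeffs(phi, theta, n_terms):
    """phi = [phi_1..phi_p], theta = [theta_1..theta_q]; returns psi_0..psi_{n_terms-1}."""
    theta_full = [1.0] + list(theta)      # theta_0 = 1, theta_j = 0 for j > q
    psi = []
    for j in range(n_terms):
        t_j = theta_full[j] if j < len(theta_full) else 0.0
        s = sum(phi[k - 1] * psi[j - k] for k in range(1, min(len(phi), j) + 1))
        psi.append(t_j + s)
    return psi

# ARMA(1,1), phi = 0.5, theta = 0.4: psi_j = 0.9 * 0.5^(j-1) for j >= 1
print(psi_coeffs([0.5], [0.4], 5))  # [1.0, 0.9, 0.45, 0.225, 0.1125]
```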

77
Q

When is an ARMA(p,q) model invertible?

A

An ARMA(p,q) model is invertible iff theta(z)!=0 for all z s.t. |z| leq 1 (i.e theta has no roots in the unit disk)

BTW: If there are {pi_j}_j s.t. Z_t=Sum[ pi_j * X_{t-j} ; j=0,…,inf ], then {X_t}_t~ARMA(p,q) is invertible.

78
Q

Computation of ACVF of a causal ARMA model?

A

gamma_X(h)=sig^2*Sum[ psi_j*psi_{j+|h|} ; j=0,…,inf ]

79
Q

State the proposition about the definition of the PACF of an ARMA model.

A

Suppose that {X_t}_t is some 0-mean stationary time series such that gamma_X(0) greater than 0 and lim_h gamma_X(h)=0.
Then for all h geq 1, Gamma_h is invertible (i.e. the PACF is well defined) and:
alpha(h)=Corr(X_{h+1}-P(X_{h+1}|X_h,…,X_2), X_1-P(X_1|X_2,…,X_h)).

80
Q

State the basic properties of spectral densities

A

a) f is even
b) f is non-negative
c) gamma(k)=Int [ exp(iklambda)f(lambda) ; lambda in [-pi,pi] ]=Int [ cos(klambda)*f(lambda) ; lambda in [-pi,pi] ]
i. e. gamma(h) is the Fourier coefficient of the spectral density ( when {gamma(h)}_h is absolutely summable)

81
Q

Characterize spectral densities

A

A real-valued function f defined on [-pi,pi] is the spectral density of some real-valued stationary process if and only if:

i) f is even
ii) f geq 0
iii) Int[ f(lambda) ; lambda in [-pi,pi] ] is finite

Let gamma be a function defined on Z s.t. Sum[ |gamma(h)| ; h=-inf,…,inf ] is finite.
Then gamma is the ACVF of some stationary time series iff it is even and f(lambda)=1/(2pi)Sum[ exp(-ihlambda)*gamma(h) ; h=-inf,…,inf ] geq 0, in which case f is the spectral density of gamma.

82
Q

All ACVF’s admit a spectral density.

A

False; counterexample:
X_t=A*cos(wt)+B*sin(wt) with Cov(A,B)=0, E[A]=E[B]=0 and Var(A)=Var(B)=1.
(If gamma had a spectral density f, the Riemann-Lebesgue lemma would force lim_h gamma(h)=lim_h cos(w*h)=0, which is false.)

83
Q

State the theorem about the spectral representation of the ACVF

A

A function gamma defined on Z is the ACVF of a stationary time series if and only if:
there exists a right-continuous, non-decreasing bounded function F on [-pi,pi] s.t. F(-pi)=0 and
gamma(h)= Int[ exp(ihlambda) ; F(lambda) measure for lambda in [-pi,pi] ]

BTW: F is a generalized distribution, i.e. F(lambda)/F(pi) is a distribution function, and
gamma(0)=F(pi).
(Riemann-Stieltjes integral:
Int[ f ; dg(x) ]=lim_n Sum[ f(x_i)*( g(x_{i+1}) - g(x_i) ) ; i=1,…,n ] )

84
Q

Let X_t=Sum[ (A_jcos(w_jt)+B_jsin(w_jt)) ; j=1,…,k ] with w_1,…,w_k in [-pi,pi], A_1,…,A_k,B_1,…,B_k are all uncorrelated and E[A_j]=E[B_j]=0, Var(A_j)=Var(B_j)=(sig_j)^2
What is the ACVF of {X_t}_t?
What is the spectral distribution?

A

gamma_X(h)=Sum[ (sig_j)^2*cos(w_j*h) ; j=1,…,k ]

F(lambda)=Sum[ (sig_j)^2*F_j(lambda) ; j=1,…,k ], where
F_j(lambda)=0.5*1_{lambda in [-w_j,w_j)}+1_{lambda geq w_j}.

85
Q

What is the spectral density function of White Noise?

A

sig^2/(2*pi) (see L15S16)

86
Q

Define periodogram of a sample {x_1,…,x_n}.

A

The periodogram of {x_1,…,x_n} is the function:

I_n(lambda)=1/n*|Sum[ x_t*exp(-it*lambda) ; t=1,…,n ]|^2

87
Q

State the proposition about the periodogram and the sample ACVF.

A

If x_1,…,x_n in R and w_k=2*pi*k/n for k in {-floor((n-1)/2),…,floor(n/2)} - {0}, then:
I_n(w_k)=Sum[ gamma^(h)*exp(-ih*w_k) ; |h|=0,…,n-1 ], where gamma^ is the sample ACVF (based on x_1,…,x_n) and I_n is the periodogram.

BTW: given the above it is tempting to take I_n(lambda)/(2*pi) as the estimator for f(lambda) but it is not a consistent estimator
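
A numerical check of the identity at a nonzero Fourier frequency (sketch; sample size and frequency index are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
x = rng.normal(size=64)           # arbitrary data
n = len(x)
k = 5                             # any nonzero Fourier frequency index
w = 2 * np.pi * k / n

I = np.abs(np.sum(x * np.exp(-1j * w * np.arange(1, n + 1)))) ** 2 / n

xc = x - x.mean()
lags = range(-(n - 1), n)
g = [np.sum(xc[abs(h):] * xc[:n - abs(h)]) / n for h in lags]
S = sum(gh * np.exp(-1j * h * w) for gh, h in zip(g, lags))
print(I, S.real)   # the two values should agree up to rounding
```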

88
Q

Define consistency of an estimator

A

lim_n P( |theta^ - theta| greater than eps )=0 for all eps greater than 0.

89
Q

Define the discrete spectral average estimator….

to be updated

A

f^(lambda)=1/(2pi)Sum[ W_n(j)*I_n(

Let I_n be the periodogram

90
Q

Define linear filter.

Define time invariance.

Define causal in this context.

A

We say that a process {Y_t}_t is the output of a linear filter C={c_{t,k} : t,k in Z} applied to an input process {X_t}_t if:
Y_t=Sum[ c_{t,k}*X_k ; k=-inf,…,inf ]

The filter C is said to be time invariant if c_{t,t-k} is independent of t:
c_{t,t-k}=psi_k.

NOTE: In this case: Y_t=Sum[ psi_k * X_{t-k} ; k=-inf,..,inf]

The time invariant linear filter (TILF) is said to be causal if psi_j=0 for all negative j.

91
Q

State the proposition on time invariant linear filters.

Hint: this is the result about multiplicatively related densities

A

Let {X_t}_t be a stationary time series with mean 0 and spectral density f_X.
Let {psi_j}_{j in Z} be an absolutely summable time invariant linear filter.
Then Y_t=Sum[ psi_j*X_{t-j} ; j=-inf,…,inf ] defines a stationary process with mean 0 and spectral density f_Y(lambda)=|Psi(exp(-ilambda))|^2*f_X(lambda), where the transfer function Psi is defined as Psi(exp(-ilambda))=Sum[ psi_j*exp(-ilambda*j) ; j=-inf,…,inf ].

BTW: lambda onto |Psi(exp(-ilambda))|^2=Psi(exp(-ilambda))*Psi(exp(ilambda)) is the power transfer function.

92
Q

What is the spectral density of an ARMA(p,q)?

A

f_X(lambda)=sig^2/(2pi)|theta(exp(-ilambda))/phi(exp(-ilambda))|^2

93
Q

For which time series does one use the Yule-Walker algorithm?

State the algorithm.

A

For {X_t}_t~AR(p), i.e. X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t.

[Unknown order p, guess order m]
Phi^_m= (R^_m)^{-1} * rho^_m,
(sig^)^2 = gamma^(0) * ( 1 - (rho^_m)^T(R^_m)^{-1}rho^_m),

where Gamma^_m=[gamma^_x(i-j) ; i,j=1,…,m],
R^_m=Gamma^_m/gamma^(0)=[rho^(i-j)]
Phi^_m=(phi^_1,…,phi^_m)^T,
gamma^_m=(gamma^_X(1),…,gamma^_X(m))^T,
rho^_m=gamma^_m/gamma^(0)

Then: Y_t-phi^_1*Y_{t-1}-…-phi^_m*Y_{t-m}=W_t, {W_t}_t~WN(0,(sig^)^2) is our model of {X_t}_t.

NOTE: sig^ depends on m!
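
A sketch of the algorithm on simulated AR(2) data (the true coefficients 0.5 and -0.3 are an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(7)
phi_true = np.array([0.5, -0.3])          # causal AR(2), for illustration
Z = rng.normal(size=5000)
X = np.zeros_like(Z)
for t in range(2, len(Z)):
    X[t] = phi_true[0] * X[t - 1] + phi_true[1] * X[t - 2] + Z[t]

n, m = len(X), 2                          # guessed order m
Xc = X - X.mean()
g = np.array([np.sum(Xc[h:] * Xc[:n - h]) / n for h in range(m + 1)])
R = np.array([[g[abs(i - j)] for j in range(m)] for i in range(m)]) / g[0]
rho = g[1:] / g[0]

phi_hat = np.linalg.solve(R, rho)                       # Phi^_m = R^_m^{-1} rho^_m
sig2_hat = g[0] * (1 - rho @ np.linalg.solve(R, rho))   # (sig^)^2
print(phi_hat, sig2_hat)                  # ~ [0.5, -0.3] and ~ 1
```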

94
Q

State the asymptotic result concerning the Yule-Walker estimator.

A

For any m geq p, it can be shown that:
sqrt(n)*(Phi^_m-Phi_m) converges in distribution to N(0,sig^2*(Gamma_m)^{-1}),
where Phi_m=(Gamma_m)^{-1}*gamma_m=(R_m)^{-1}*rho_m = (phi_1,…,phi_p)^T if p=m and (phi_1,…,phi_p,0,…,0)^T else.

In particular, if m is greater than p, then:
sqrt(n)*phi^_{mm} converges in distribution to N(0,1)

[ Since it converges to N(0,sig^2*[(Gamma_m)^{-1}]_{mm}) and [(Gamma_m)^{-1}]_{mm}=1/sig^2 ]

95
Q

State an order estimator for the Yule-Walker method.

A

p^=min{ k in N : sqrt(n)*|phi^_{(k+1)(k+1)}| leq z_{1-alpha/2} }, where z_{1-alpha/2} is the (1-alpha/2)-quantile of N(0,1).

96
Q

Define ARIMA processes.

When is an ARIMA process stationary?

A

If d geq 0 is an integer, then {X_t}_t is an ARIMA(p,d,q) process if Y_t=(1-B)^d*X_t is a causal ARMA(p,q) process.

This implies that {X_t}_t satisfies the equations:
phi(B)(1-B)^d*X_t=theta(B)Z_t, where {Z_t}_t~WN(0,sig^2).
One rewrites this as:
phi*(B)X_t=theta(B)Z_t with phi*(z)=phi(z)(1-z)^d.
phi* is an AR polynomial of degree p+d.
phi* has 1 as a root of multiplicity d.

It is stationary iff d=0: phi*(B)X_t=theta(B)Z_t admits an (a.s.) unique stationary solution iff phi*(z) admits no roots on the unit circle, and since phi* has the root 1 for d geq 1, this can only be the case when d=0.

97
Q

What can you say about the decay of gamma^(h) in an ARIMA model?

A

It is slow: for large n,

E[gamma^(h)] is approximately sig^2*(n-|h|)/6.

98
Q

What is the unit root problem in ARIMA?

A

The unit root problem arises when either the AR polynomial or the MA polynomial has a root on or near the unit circle.

99
Q

Motivate the (G)ARCH models.

A

P_t = closing price on day t
X_t=log(P_t) (=log asset price)
Z_t=X_t - X_{t-1} (=log return)
Empirical evidence suggests that {Z_t}_t is not independent WN and that the variance of Z_t depends on past realizations.

BTW: If {X_t}_t causal invertible ARMA then Var(X_t|X_s : s leq t-1)=Var(Z_t)=sig^2

100
Q

Define an ARCH(p) process.

What does CH stand for?

A

{Z_t}_t~ARCH(p) if Z_t=sqrt(h_t)*e_t with {e_t}_t ~ IID N(0,1), where h_t=alpha_0+Sum[ alpha_i*Z_{t-i}^2 ; i=1,…,p ] (=volatility) with alpha_0 positive and alpha_i geq 0 for all i=1,…,p.

C=conditional, H=heteroscedasticity

101
Q

Define the GARCH(p,q) model.

A

{Z_t}_t ~ GARCH(p,q) if:
Z_t=sqrt(h_t)*e_t with {e_t}_t ~ IID(0,1) (either N(0,1) or scaled t-distr. = t_nu*sqrt((nu-2)/nu), nu in (2,inf)), where h_t=alpha_0+Sum[ alpha_i*Z_{t-i}^2 ; i=1,…,p ]+Sum[ beta_j*h_{t-j} ; j=1,…,q ]
with alpha_0 positive, alpha_i geq 0 for all i=1,…,p and beta_j geq 0 for all j=1,…,q.

BTW: The scaled t-distribution is taken to ensure that the variance is 1

102
Q

Under what conditions can we derive a solution for ARCH(1)?

What is the stationary solution of ARCH(1)?

A

Causality (i.e. Z_t depends only on e_s for s leq t) and alpha_1 in [0,1) imply an a.s. unique solution:

Z_t = sqrt( alpha_0*( 1 + Sum[ (alpha_1)^j*(e_{t-1})^2*…*(e_{t-j})^2 ; j=1,…,inf ] ) ) * e_t almost surely

103
Q

State the basic properties of the causal ARCH(1) model assuming alpha_1 in [0,1).

A

a) {Z_t}_t is strictly stationary
b) E[Z_t]=0
c) Var(Z_t)=E[Z_t^2]=alpha_0/(1-alpha_1)
d) {Z_t}_t is (weakly) stationary
e) gamma_Z(h)=0 for all positive h

104
Q

What can you say about ARCH(1) and distributions?

A

a) {Z_t}_t ~ ARCH(1) therefore implies {Z_t}_t ~ WN(0,alpha_0/(1-alpha_1))
b) {Z_t}_t is NOT independent WN (i.e. only uncorrelated):
E[Z_t^2|Z_{t-1}]=E[(alpha_0+alpha_1*Z_{t-1}^2)*e_t^2|Z_{t-1}]=alpha_0+alpha_1*Z_{t-1}^2, which depends on Z_{t-1}
c) {Z_t}_t is not Gaussian but symmetric around 0

105
Q

What can you say about:

a) Z_n|Z_{n-1},…,Z_1 ?
b) Z_2,…,Z_n|Z_1 ?
c) Likelihood function?
d) How do we do MLE?

A

a) (Z_n|Z_{n-1},…,Z_1)=^d (Z_n|Z_{n-1}) ~ N(0,alpha_0+alpha_1*Z_{n-1}^2)
b) The density is:
f_{Z_2,…,Z_n|Z_1}(z_2,…,z_n) = f_{Z_2|Z_1=z_1}(z_2) * … * f_{Z_n|Z_{n-1}=z_{n-1}}(z_n) = 1/(2pi)^{(n-1)/2}*Prod[ 1/sqrt(alpha_0+alpha_1*z_{t-1}^2)*exp( -z_t^2/(2*(alpha_0+alpha_1*z_{t-1}^2)) ) ; t=2,…,n ]
c) The above =: L(alpha_0,alpha_1) is the (conditional) likelihood function.
d) We then maximize L over (0,inf)x[0,inf) (i.e. (0,inf)x[0,1) due to causality).
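
A sketch of (b)-(d): evaluate the conditional log-likelihood and maximize it crudely over a grid (simulated data; a real implementation would use a proper optimizer):

```python
import numpy as np

rng = np.random.default_rng(8)
a0_true, a1_true = 0.5, 0.4               # illustrative true parameters
e = rng.normal(size=3000)
Z = np.zeros_like(e)
for t in range(1, len(e)):
    Z[t] = np.sqrt(a0_true + a1_true * Z[t - 1] ** 2) * e[t]

def cond_loglik(a0, a1, Z):
    h = a0 + a1 * Z[:-1] ** 2             # conditional variances of Z_t given Z_{t-1}
    return -0.5 * np.sum(np.log(2 * np.pi * h) + Z[1:] ** 2 / h)

grid = [(a0, a1) for a0 in np.linspace(0.1, 1, 19) for a1 in np.linspace(0, 0.9, 19)]
print(max(grid, key=lambda p: cond_loglik(*p, Z)))   # near (0.5, 0.4)
```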

106
Q
A function K defined on Z is the ACVF of some stationary process iff:
• K is even
• K is even and non-positive definite
• K is non-negative definite
• K is even and non-negative definite
• K is non-negative
A

K is even and non-negative definite

107
Q
Consider the stationary process defined by the equations X_t=Z_t+0.5*Z_{t-1} with {Z_t}_t~WN(0,1).
The ACVF of {X_t}_t is given by:
• gamma_X(h)=5/4*1_{h=0} + 1/2*1_{|h|=1}
• gamma_X(h)=5/4*1_{h=0} - 1/2*1_{|h|=1}
• gamma_X(h)=1/2*1_{h=0} + 5/4*1_{|h|=1}
• gamma_X(h)=5/4*1_{h=0}
• gamma_X(h)=5/4*1_{h=0} + 1/2*1_{|h|=2}
A
(not more than 3 minutes: spot the trap)
gamma_X(h)=5/4*1_{h=0} + 1/2*1_{|h|=1}
108
Q
Consider K(h)=cos(pi/4*h), h in Z. Then, K is the ACVF of:
• X_t=A*cos(pi/4*t) w. E[A]=0 and Var(A)=1
• X_t=A*sin(pi/4*t) w. E[A]=0 and Var(A)=1
• X_t=A*cos(pi/4*t)+B*sin(pi/4*t) w. E[A]=E[B]=0, Var(A)=Var(B)=1 and Cov(A,B)=0
• X_t=A*cos(pi/4*t)
• X_t=A*sin(pi/4*t)
A

(either compute or rely on your memory, you see one then you compute and see it isn’t going anywhere etc.)
X_t=A*cos(pi/4*t)+B*sin(pi/4*t) w. E[A]=E[B]=0, Var(A)=Var(B)=1 and Cov(A,B)=0

109
Q
Let gamma_X(h) be the ACVF of some stationary process {X_t}_t s.t. Sum[ |gamma_X(h)| : h=-inf,..,inf ] is finite.
Then, the spectral density of {X_t}_t is:
• f(lambda)=1/(2*pi)*Sum[ |gamma_X(h)|*exp(-i*h*lambda); h=-inf,..,inf ]
• f(lambda)=1/(2*pi)*Sum[ |gamma_X(h)|*exp(-i*2*h*lambda); h=-inf,..,inf ]
• f(lambda)=1/(2*pi)*(gamma_X(0)+2*Sum[ gamma_X(h)*cos(lambda*h); h=1,..,inf ])
• f(lambda)=1/pi*Sum[ gamma_X(h)*exp(-i*h*lambda); h=-inf,..,inf ]
• f(lambda)=1/pi*Sum[ gamma_X(h)*exp(i*h*lambda); h=-inf,..,inf ]
A
(You need to know the definition and think a bit.)
(The |.| is not needed, the extra 2 in the exponent is a joke, a factor 2 is missing in the last two, and those two are equal, so all four can be excluded.)
f(lambda)=1/(2*pi)*(gamma_X(0)+2*Sum[ gamma_X(h)*cos(lambda*h); h=1,..,inf ])
110
Q

What is a sufficient condition for an (X_t)_t to be MA(q) for some q?

A

If X is stationary, q-correlated and has 0-mean, then it can be represented as an MA(q) process.

111
Q

Spectral density of AR(1)?

Try spectral density of MA(1)?

A

f(lambda)=sig^2/(2pi)*1/(1+phi^2-2*phi*cos(lambda))

For MA(1), from the ARMA spectral density with theta(z)=1+theta*z and phi(z)=1:
f(lambda)=sig^2/(2pi)*(1+2*theta*cos(lambda)+theta^2)