Time Series Analysis Flashcards
Define ACVF.
The autocovariance function (ACVF) of a stationary time series {X_t}_t at lag h is defined as:
gamma_X(h)=Cov(X_t,X_{t+h}).
Define ACF.
The autocorrelation function (ACF) of {X_t}_t at lag h is defined as:
rho_X(h)=gamma_X(h)/gamma_X(0), where gamma_X(h)=Cov(X_t,X_{t+h}) is the ACVF of {X_t}_t.
Is X_t=a+bZ_t+Z_{t-2} stationary?
Calculate the mean and the ACVF of X_t=a+bZ_t+Z_{t-2}, where {Z_t}_t is an iid N(0,sig^2) sequence and a, b in R.
Is {X_t}_t strictly stationary?
S1E1a
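A sketch of the computation (my own, not the official solution): E[X_t]=a and
gamma_X(t,t+h)=Cov(bZ_t+Z_{t-2}, bZ_{t+h}+Z_{t+h-2})=(1+b^2)*sig^2*1{h=0}+b*sig^2*1{|h|=2},
which does not depend on t, so {X_t}_t is (weakly) stationary. Since the Z's are iid Gaussian, every vector (X_{1+h},…,X_{n+h}) is Gaussian with the same mean and covariance matrix for every shift h, so {X_t}_t is also strictly stationary.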
Define stationary.
A TS {X_t}_t is (weakly) stationary if:
i) E[X_t^2] is finite for all t,
ii) mu_X(t)=E[X_t] does not depend on t,
iii) gamma_X(t,t+h)=Cov(X_t,X_{t+h}) does not depend on t for each h (we then simply write gamma_X(h)).
Define strictly stationary.
A TS {X_t}_t is strictly stationary if for all n in N and h in Z:
(X_1,…,X_n) =^d (X_{1+h},…,X_{n+h}).
Is X_t=Z_tZ_{t-1} stationary?
Calculate the mean and the ACVF of X_t=Z_tZ_{t-1}, where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?
S1E1d
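A sketch (my own): E[X_t]=E[Z_t]E[Z_{t-1}]=0 by independence, gamma_X(0)=E[Z_t^2*Z_{t-1}^2]=sig^4, and for h!=0 at least one Z appears exactly once in X_tX_{t+h}, so gamma_X(h)=0 (e.g. E[X_tX_{t+1}]=E[Z_t^2]E[Z_{t-1}]E[Z_{t+1}]=0). Hence {X_t}_t~WN(0,sig^4) and is stationary. It is also strictly stationary: X_t=g(Z_{t-1},Z_t) for the fixed function g(u,v)=uv and the Z's are iid, so (X_{1+h},…,X_{n+h}) =^d (X_1,…,X_n).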
Is X_t=Z_t if t leq t0 and X_t=Z_t+Z_{t-1} otherwise stationary?
Calculate the mean and the ACVF of X_t=Z_t if t leq t0 and X_t=Z_t+Z_{t-1} otherwise, where {Z_t}_t is an iid N(0,sig^2) sequence and t0 is some integer.
Is {X_t}_t strictly stationary?
S1E1e
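A sketch (my own): E[X_t]=0 for all t, but Var(X_t)=sig^2 for t leq t0 and Var(X_t)=2sig^2 for t larger than t0, so gamma_X(t,t) depends on t and {X_t}_t is not stationary. It cannot be strictly stationary either, since X_{t0} and X_{t0+1} already have different marginal distributions.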
Define an MA(q) process.
{X_t}_t ~ MA(q) if:
X_t=Z_t+theta_1*Z_{t-1}+…+theta_q*Z_{t-q} for {Z_t}_t~WN(0,sig^2) and theta_1,…,theta_q in R.
BTW: MA(q) = moving average process of order q>=1.
We write X_t=theta(B)Z_t, where theta(B)=1+theta_1*B+…+theta_q*B^q and B is the backshift operator.
Define WN.
Define i.i.d. noise.
{X_t}_t is white noise, written {X_t}_t~WN(0,sig^2), if it is a sequence of uncorrelated and centered r.v.s with finite variance, i.e. E[X_t]=0, Var(X_t)=sig^2 and Cov(X_r,X_s)=0 for r!=s.
{X_t}_t is i.i.d. noise if it is a sequence of independent and identically distributed r.v.s s.t. E[X_t]=0 for all t.
Let {X_t}_t be the MA(2) process: X_t=Z_t+theta*Z_{t-2}, where {Z_t}_t is WN(0,sig^2).
Compute the mean, the ACVF and ACF for this time series.
S1E2a
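A sketch (my own): E[X_t]=0,
gamma_X(h)=(1+theta^2)*sig^2*1{h=0}+theta*sig^2*1{|h|=2},
rho_X(h)=1{h=0}+theta/(1+theta^2)*1{|h|=2}.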
Let {X_t}_t be the MA(2) process: X_t=Z_t+theta*Z_{t-2}, where {Z_t}_t is WN(0,sig^2).
Compute the variance of the sample mean (X_1+X_2+X_3+X_4)/4 for theta=0.8.
Do the same for theta=-0.8. Note the difference.
S1E2b
S1E2c
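A sketch (my own): Var( (X_1+…+X_4)/4 ) = (1/16)*Sum[ gamma_X(i-j) ; i,j=1,…,4 ] = (1/16)*( 4*gamma_X(0)+4*gamma_X(2) ) = (1+theta+theta^2)*sig^2/4, since only lags 0 and +/-2 contribute. For theta=0.8 this is 0.61*sig^2, for theta=-0.8 it is 0.21*sig^2: negative correlation at lag 2 makes the sample mean more precise.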
What process has k(h):=1{h=0} for h in Z as its ACF?
{Z_t}_t~WN(0,sig^2)
What is the ACVF of an MA(1) process?
gamma_X(h)=sig^2*(1+theta^2)*1{h=0} + sig^2*theta*1{|h|=1}
What is the ACF of an MA(1) process?
rho(h)=1{h=0}+theta/(1+theta^2)*1{|h|=1}
Define linear process.
A TS {X_t}_t is said to be a linear process if it has the following representation:
X_t=Sum[ psi_j*Z_{t-j} ; j=-inf,…,inf ], where {Z_t}_t~WN(0,sig^2) and {psi_j}_j is an absolutely summable real sequence, i.e. Sum[ |psi_j| ; j=-inf,…,inf ] is finite.
BTW: the last condition ensures that X_t is finite a.s.
Define an AR(p) process.
{X_t}_t~AR(p) iff {X_t}_t is stationary and X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t with {Z_t}_t~WN(0,sig^2) and phi_1,…,phi_p in R.
Define an ARMA(p,q) process.
A time series {X_t}_t is an ARMA(p,q) process if it is stationary and satisfies the equations:
phi(B)X_t=X_t-phi_1*X_{t-1}-…-phi_p*X_{t-p}=Z_t+theta_1*Z_{t-1}+…+theta_q*Z_{t-q}=theta(B)Z_t, where {Z_t}_t~WN(0,sig^2), phi_p!=0!=theta_q, and the polynomials (w. real coeffs) phi and theta have no common factors/roots.
Define causality for linear processes and ARMA(p,q) models
A linear process X_t=Sum[ psi_j*Z_{t-j} ; j=-inf,…,inf ] is called causal if psi_j=0 for all j<0, i.e. X_t is a function of Z_s for s leq t only (so in particular Cov(X_t,Z_s)=0 for all s greater than t).
(This matches the definition below for ARMA(p,q) models.)
{X_t}_t~ARMA(p,q) is causal if there is an absolutely summable sequence {psi_j}_j s.t.
X_t=Sum[ psi_j*Z_{t-j} ; j=0,…,inf ] for all t.
BTW: Was defined after AR(1), MA(q) and linear processes but before ARMA(p,q) processes.
Define invertibility of ARMA(p,q) models.
An ARMA(p,q) process {X_t}_t is invertible if there exists an absolutely summable sequence {pi_j}_j s.t. Z_t=Sum[ pi_j*X_{t-j} ; j=0,…,inf ] for all t in Z.
An ARMA(1,1) model, for example, is said to be invertible if Z_t can be expressed using current and past values of {X_t}_t.
Characterize causality for {X_t}_t~ARMA(p,q).
Causality is equivalent to the condition:
phi(z)=1-phi_1*z-…-phi_p*z^p != 0 for all |z| leq 1.
Characterize invertibility for {X_t}_t~ARMA(p,q).
Invertibility is equivalent to the condition that:
theta(z)=1+theta_1z+…+theta_qz^q != 0 for all |z| leq 1.
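A minimal Python sketch of this root check (my addition; numpy assumed, the function name is my own): causality corresponds to all roots of phi lying strictly outside the unit circle, invertibility to the same for theta.

import numpy as np

def roots_outside_unit_circle(c):
    # c = [c_1, ..., c_k]: coefficients of the polynomial 1 + c_1*z + ... + c_k*z^k
    poly = list(c)[::-1] + [1.0]          # np.roots expects the highest degree first
    return bool(np.all(np.abs(np.roots(poly)) > 1.0))

# AR(1) with phi = 0.6: phi(z) = 1 - 0.6z has root 1/0.6 > 1, so the model is causal
print(roots_outside_unit_circle([-0.6]))   # True
# MA(1) with theta = 1.25: theta(z) = 1 + 1.25z has root -0.8, so it is not invertible
print(roots_outside_unit_circle([1.25]))   # False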
Is X_t=Z_1cos(ct)+Z_2sin(ct) stationary?
Calculate the mean and the ACVF of X_t=Z_1cos(ct)+Z_2sin(ct), where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?
S1E1b
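A sketch (my own): E[X_t]=0 and, since Z_1 and Z_2 are independent with variance sig^2,
Cov(X_t,X_{t+h}) = sig^2*( cos(ct)cos(c(t+h)) + sin(ct)sin(c(t+h)) ) = sig^2*cos(ch)
by the cos(x-y) identity (next card), which does not depend on t, so {X_t}_t is stationary. Each vector (X_{1+h},…,X_{n+h}) is Gaussian (a linear map of (Z_1,Z_2)) with shift-invariant mean and covariances, so the series is also strictly stationary.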
State the trig identities for cos(a+/-b) and sin(a+/-b).
cos(a+/-b)=cos(a)cos(b)-/+sin(a)sin(b)
sin(a+/-b)=sin(a)cos(b)+/-cos(a)sin(b)
Is X_t=Z_tcos(ct)+Z_{t-1}sin(ct) stationary?
Calculate the mean and the ACVF of X_t=Z_tcos(ct)+Z_{t-1}sin(ct), where {Z_t}_t is an iid N(0,sig^2) sequence.
Is {X_t}_t strictly stationary?
S1E1c
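A sketch (my own): E[X_t]=0 and Var(X_t)=sig^2*(cos^2(ct)+sin^2(ct))=sig^2, but
Cov(X_t,X_{t+1}) = sig^2*cos(ct)*sin(c(t+1)),
which in general depends on t (e.g. for c=pi/2 it equals sig^2*cos^2(pi*t/2)), so {X_t}_t is not stationary for general c; having finite second moments, it is then not strictly stationary either. (For special values such as c=0 the series reduces to Z_t and is stationary.)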
Show that k(h):=(-1)^{|h|} for h in Z is the ACF of a stationary time series.
S2E1a
Take X_t=(-1)^t*X, where X is a r.v. with mean 0 and variance 1; then gamma_X(h)=rho_X(h)=(-1)^{|h|}.
Show that k(h):=1+cos(pi*h/2)+cos(pi*h/4) for h in Z is the ACVF of a stationary time series. (Note k(0)=3, so k can only be an ACVF, not an ACF.)
S2E1b
Show that k(h):=1{h=0}+0.4*1{|h|=1} for h in Z is the ACF of a stationary time series.
(State a necessary condition if there is one.)
S2E1c
What is the ACVF of an AR(1) process?
What is the ACF of an AR(1) process?
If |phi| < 1 (the causal case; for |phi| > 1 there is a non-causal stationary solution with a different ACVF), then:
gamma_X(h)=sig^2*phi^{|h|}/(1-phi^2) for h in Z
rho_X(h)=phi^{|h|} for h in Z
Define q-correlated.
A time series {X_t}_t is said to be q-correlated if it is stationary and gamma_X(h)=0 for all h s.t. |h| larger than q.
Let {X_t}_t be a causal AR(1) process satisfying X_t=phi*X_{t-1}+Z_t for t in Z with phi in (-1,1) and {Z_t}_t~WN(0,sig_z^2). Consider {Y_t}_t defined by the equations Y_t=X_t+W_t, where {W_t}_t~WN(0,sig_w^2) s.t. Cov(Z_t,W_s)=E[Z_tW_s]=0 for all t, s in Z.
Show that the time series {Y_t}_t is stationary and find its ACVF.
S2E2a
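A sketch (my own): {X_t}_t and {W_t}_t are uncorrelated (X_t is a linear combination of Z_s, s leq t, and Cov(Z_t,W_s)=0), so E[Y_t]=0 and
gamma_Y(h)=gamma_X(h)+gamma_W(h)=(sig_Z)^2*phi^{|h|}/(1-phi^2)+(sig_W)^2*1{h=0},
which does not depend on t; hence {Y_t}_t is stationary.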
Define sample ACVF.
Let X_1,…,X_n be n observations of a time series.
The sample ACVF is:
gamma^(h)=1/n*Sum[ (X_{t+|h|}-bar(X))(X_t-bar(X)) ; t=1,…,n-|h| ] for -n < h < n, where bar(X) is the sample mean.
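A minimal Python sketch of this estimator (my addition; numpy assumed, the function name is my own):

import numpy as np

def sample_acvf(x, h):
    # Sample ACVF gamma^(h) as defined above; valid for |h| < n
    x = np.asarray(x, dtype=float)
    n, h = len(x), abs(h)
    xbar = x.mean()
    return np.sum((x[h:] - xbar) * (x[:n - h] - xbar)) / n

# e.g. sample_acvf(x, 0) is the (biased) sample variance of x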
Let {X_t}_t be a causal AR(1) process satisfying X_t=phi*X_{t-1}+Z_t for t in Z with phi in (-1,1) and {Z_t}_t~WN(0,sig_z^2). Consider {Y_t}_t defined by the equations Y_t=X_t+W_t, where {W_t}_t~WN(0,sig_w^2) s.t. Cov(Z_t,W_s)=E[Z_tW_s]=0 for all t, s in Z.
Show that {U_t}_t defined as U_t=Y_t-phi*Y_{t-1} is 1-correlated and deduce that it is an MA(1).
Hint: gamma_Y(h)=(sig_W)^2*1{h=0}+(sig_Z)^2*phi^{|h|}/(1-phi^2)
S2E2b
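A sketch (my own): U_t=Y_t-phi*Y_{t-1}=(X_t-phi*X_{t-1})+W_t-phi*W_{t-1}=Z_t+W_t-phi*W_{t-1}, so E[U_t]=0 and
gamma_U(h)=( (sig_Z)^2+(1+phi^2)(sig_W)^2 )*1{h=0} - phi*(sig_W)^2*1{|h|=1},
i.e. {U_t}_t is stationary and 1-correlated. By the standard result that a mean-zero stationary q-correlated series is an MA(q), {U_t}_t is an MA(1).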
Define sample ACF.
Let X_1,…,X_n be n observations of a time series.
The sample ACF is:
rho^(h)=gamma^(h)/gamma^(0), where gamma^ is the sample ACVF.
State the density of multivariate Gaussian distributions.
f(x)=(2pi)^{-k/2}*det(Sig)^{-1/2}*exp( -1/2*(x-mu)^T*Sig^{-1}*(x-mu) ) for x in R^k, where mu is the mean vector and Sig the (positive definite) covariance matrix.
Define a causal AR(1) with mean mu.
A TS {X_t}_t is a causal AR(1) with mean mu if X_t satisfies:
X_t - mu = phi*(X_{t-1} - mu) + Z_t for t in Z, phi in (-1,1) and {Z_t}_t~WN(0,sig^2).
State the asymptotic result for the (centered) sample mean of an ARMA time series.
If {X_t}_t is an ARMA time series with mean mu, it can be shown that sqrt(n)*(bar(X)_n - mu) converges in distribution to N(0, v), where v=Sum[ gamma_X(h) ; h=-inf,…,inf ].
Suppose X_n converges in distribution to N(mu,sig^2); how does one get a confidence interval for mu?
It follows that:
(X_n-mu)/sig converges in distribution to N(0,1),
and hence, for any t0 leq t1,
lim_n P( t0 leq (X_n-mu)/sig leq t1 ) = Phi(t1)-Phi(t0).
Taking t1 = -t0 = q := Phi^{-1}(1-alpha/2) gives
lim_n P( -q leq (X_n-mu)/sig leq q ) = (1-alpha/2) - alpha/2 = 1-alpha,
or equivalently
lim_n P( X_n - sig*q leq mu leq X_n + sig*q ) = 1-alpha.
From which it follows that:
The probability of mu being in
[X_n - sig*Phi^{-1}(1-alpha/2), X_n + sig*Phi^{-1}(1-alpha/2)]
is approximately 1-alpha for large n.
If necessary, check against S2E3 once the next exercise card is complete.
Let {X_t}_t be a causal AR(1) process with mean mu. Based on n=100 observations from such a model with phi=0.6, sig^2=2 and unknown mu, we compute bar(X)_n=0.271. Construct an asymptotic 95% confidence interval for mu and indicate the result you are using.
Do you think that the observed data is compatible with the hypothesis that mu=0?
S2E3
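A sketch of the computation (my addition, values rounded): by the asymptotic result above, for a causal AR(1), Sum[ gamma_X(h) ; h=-inf,…,inf ] = sig^2/(1-phi)^2, so an approximate 95% interval is bar(X)_n +/- 1.96*sqrt( sig^2/(n*(1-phi)^2) ). In Python:

import math

n, phi, sig2, xbar = 100, 0.6, 2.0, 0.271
v = sig2 / (1 - phi) ** 2            # Sum of gamma_X(h) over all h: 2/0.16 = 12.5
half = 1.96 * math.sqrt(v / n)       # approx 0.693
print(xbar - half, xbar + half)      # approx (-0.42, 0.96)

Since 0 lies inside the interval, the data look compatible with the hypothesis mu=0.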
State Bartlett’s formula and the context of its use (state the asymptotic).
S3E1a
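One standard form of the statement (my addition; cross-check with S3E1a): if {X_t}_t is a stationary linear process with ACF rho, driven e.g. by iid noise with finite fourth moment, then for each fixed k,
sqrt(n)*( (rho^(1),…,rho^(k)) - (rho(1),…,rho(k)) ) converges in distribution to N(0,W),
where W_{ij}=Sum[ ( rho(m+i)+rho(m-i)-2*rho(i)*rho(m) )*( rho(m+j)+rho(m-j)-2*rho(j)*rho(m) ) ; m=1,…,inf ].
In particular, each rho^(h) is approximately N( rho(h), W_{hh}/n ) for large n.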
Apply Bartlett’s formula to WN(0,sig^2).
Also use it to calculate W_{ii} of an MA(1) process {X_t}_t with parameter rho=rho_X(1).
S3E1b
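A sketch (my own; cross-check with S3E1b): for WN(0,sig^2) we have rho(h)=1{h=0}, so the m-th summand of W_{ij} is 1{m=i}*1{m=j} and W_{ij}=1{i=j}; the sample autocorrelations at different lags are thus approximately independent N(0,1/n). For an MA(1) with rho=rho_X(1):
W_{11}=(1-2*rho^2)^2+rho^2=1-3*rho^2+4*rho^4, and W_{ii}=1+2*rho^2 for i geq 2.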
Define PACF of an ARMA process.
Sample PACF?
Let {X_t}_t be an ARMA process. Its PACF is the function:
alpha: h in {0,1,2,…} to alpha(h), where
alpha(0)=1 and alpha(h)=phi_{hh} for h geq 1,
where phi_{hh} is the last component of the vector phi_h=Gamma_h^{-1}*gamma_h with Gamma_h=[gamma_X(i-j); i,j=1,…,h] and gamma_h=(gamma_X(1),…,gamma_X(h))^T.
NOTE: Gamma_h is assumed to be invertible
For the sample PACF, replace the underlying gamma_X by the sample ACVF gamma^.
Source: S3E2a (or L13S10)
Let {X_t}_t~MA(1).
Show that alpha(2)=-theta^2/(theta^4+theta^2+1), where alpha=PACF.
S3E2b
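A sketch (my own): with gamma_X(0)=(1+theta^2)*sig^2, gamma_X(1)=theta*sig^2 and gamma_X(2)=0,
Gamma_2=sig^2*[ 1+theta^2, theta ; theta, 1+theta^2 ] and gamma_2=sig^2*(theta,0)^T,
so the second component of phi_2=Gamma_2^{-1}*gamma_2 is
phi_{22}=-theta^2/( (1+theta^2)^2-theta^2 )=-theta^2/(theta^4+theta^2+1)=alpha(2).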
Let {X_t}_t~MA(1), i.e. X_t=Z_t+theta*Z_{t-1}.
For which regions of theta does the innovations algorithm converge?
|theta| smaller than 1
|theta| = 0
|theta| larger than 1
|theta| smaller than 1: ??? (check the lecture notes and update)
|theta| = 0: ??? (think about it or check the lecture notes)
|theta| larger than 1: No, according to S3E3c.
Define spectral density.
When is some function a spectral density of a stationary time series?
Let {X_t}_t be some 0-mean stationary time series with ACVF gamma=gamma_X satisfying:
Sum[ |gamma(h)| ; h=-inf,…,inf ] is finite.
The spectral density of {X_t}_t is the function:
f(lambda)=1/(2pi)*Sum[ exp(-i*h*lambda)*gamma(h) ; h=-inf,…,inf ].
NOTE:
- We can restrict the domain to [-pi,pi] since f is 2pi-periodic
- Is well defined since Sum[ |gamma(h)| ; h in Z] is finite
A function f is THE spectral density of some stationary time series {X_t}_t with ACVF gamma_X=gamma if:
i) f(lambda) geq 0 for all lambda in [-pi,pi]
ii) gamma(h) = Int[ exp(i*h*lambda)*f(lambda) dlambda ; lambda in [-pi,pi] ] for all h in Z
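Two standard examples (my addition): for {X_t}_t~WN(0,sig^2), gamma(h)=sig^2*1{h=0} and f(lambda)=sig^2/(2pi) is constant; for an MA(1) X_t=Z_t+theta*Z_{t-1},
f(lambda)=(1/(2pi))*( gamma(0)+2*gamma(1)*cos(lambda) )=sig^2*(1+theta^2+2*theta*cos(lambda))/(2pi).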