ERM Chapter 17 Flashcards
Define strict stationarity.
- the joint distributions of Xr, Xr+1, …, Xs and Xr+k, Xr+1+k, …, Xs+k are identical for all integers r, s and k
Define weak and covariance stationarity.
- a process is said to be weakly (or covariance) stationary if the mean of the process m(t) = E(Xt) is constant over time and the covariance cov(Xt, Xt+k) depends only on the lag k, not on t
Why is stationarity important in the context of modelling financial time series, and does strict stationarity imply weak stationarity?
- stationarity means that the statistical properties of the process stay the same over time. This determines the extent to which we can use past data to try to model the future
- for finite variance processes, strict stationarity implies weak stationarity
- for infinite variance processes, strict stationarity does not imply weak stationarity
Define white noise.
- a process that is covariance stationary and for all integers t:
> corr(Et, Et+k) = 1 for k = 0, and = 0 for k ne 0
> m(t) = E(Et) = 0
State the meaning of white noise, and why it is important in the context of modelling financial time series.
- under a white noise process, every element is uncorrelated with any previous observation, and the process oscillates randomly around zero
- the concept is important because many financial processes have a random element which cannot be known in advance (white noise)
Define strict white noise.
- a white noise process that is a set of iid random variables with finite variance
- we often assume strict white noise is normally distributed N(0, o^2)
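These properties can be checked numerically. A minimal sketch, using NumPy's Gaussian generator as the strict white noise source (the seed and sample size are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0
e = rng.normal(0.0, sigma, size=100_000)   # strict white noise ~ N(0, sigma^2)

mean_e = e.mean()                              # should be near 0
lag1_corr = np.corrcoef(e[:-1], e[1:])[0, 1]   # corr at lag k = 1, should be near 0
```

Both statistics converge to their theoretical values (0 and 0) as the sample grows.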
Define trend stationarity.
- a process where the observations randomly oscillate around a steadily changing value that is a deterministic function of time only (e.g. a linear trend)
Define difference stationarity and integrated processes.
- a process Xt whose first difference Xt - Xt-1 is covariance stationary, even though the underlying process itself is not; such a process is said to be integrated of order 1 (I(1))
- an integrated process of order 2 (I(2)) is one where the second difference is covariance stationary, but the first difference is not
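A simple numerical illustration of an I(1) process (a sketch only): a random walk is not covariance stationary, but differencing it recovers the underlying white noise.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=10_000)   # strict white noise

x = np.cumsum(noise)    # random walk: I(1), not covariance stationary itself
dx = np.diff(x)         # first difference is covariance stationary (it equals the noise)
```

Here `dx` reproduces the generating noise exactly, with variance close to 1 in any window.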
How might trend and difference stationary processes be distinguished?
- the Dickey-Fuller test can help distinguish between trend and difference stationary processes by regressing the first difference of the observations on a constant, a time trend and the preceding observation: Xt - Xt-1 = a0 + a1 x t + a2 x Xt-1 + Et
- the test statistic for a2 is compared to critical values determined by Dickey and Fuller (the usual t-distribution does not apply under the null hypothesis of a unit root)
- if a2 is statistically significantly different from zero (i.e. significantly negative), the series is trend-stationary; if not, it is an integrated (difference-stationary) process
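The underlying regression can be sketched with ordinary least squares (illustration only: a real test must compare the t-statistic for a2 against the tabulated Dickey-Fuller critical values, e.g. via a statistics library, rather than standard t-tables):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Trend-stationary example: linear trend plus stationary AR(1) noise
u = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    u[t] = 0.7 * u[t - 1] + eps[t]
x = 0.05 * np.arange(n) + u

# Dickey-Fuller regression: Xt - Xt-1 = a0 + a1*t + a2*Xt-1 + error
dx = np.diff(x)
design = np.column_stack([np.ones(n - 1), np.arange(1, n), x[:-1]])
(a0_hat, a1_hat, a2_hat), *_ = np.linalg.lstsq(design, dx, rcond=None)
```

For this trend-stationary series a2_hat comes out clearly negative (near 0.7 - 1 = -0.3); for a random walk with drift it would be close to zero.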
Define autoregressive processes.
- a process whereby each value depends on the p previous values (plus a random error)
e.g. AR(1): Xt = a0 + a1 x Xt-1 + Et
Describe the influence of a1 on the properties of an AR(1) process.
- if abs(a1) < 1, then the process is mean reverting (and therefore covariance stationary) with mean a0/(1-a1) and variance o^2/(1-a1^2), where o^2 is the variance of the white noise term
- if abs(a1) > 1 then the process becomes unstable
- if a1 = 1 then the AR(1) process is a random walk; if also a0 ne 0 then the process is a random walk with drift. Note that a random walk (with or without drift) is difference stationary
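The mean-reverting case can be verified by simulation. A sketch with illustrative parameter choices (a0 = 1, a1 = 0.5, o = 2, so the theoretical mean is 2 and the theoretical variance is 16/3):

```python
import numpy as np

rng = np.random.default_rng(7)
a0, a1, sigma = 1.0, 0.5, 2.0
n = 200_000

eps = rng.normal(0.0, sigma, size=n)     # white noise with variance sigma^2
x = np.empty(n)
x[0] = a0 / (1 - a1)                     # start at the long-run mean
for t in range(1, n):
    x[t] = a0 + a1 * x[t - 1] + eps[t]

long_run_mean = a0 / (1 - a1)            # a0/(1-a1) = 2.0
long_run_var = sigma**2 / (1 - a1**2)    # o^2/(1-a1^2) = 16/3
```

The sample mean and variance of the simulated path converge to these two formulas.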
Describe the condition for an AR(p) process to be (at least) weakly stationary.
- every root z of the following polynomial (there are p roots, possibly complex) must lie outside the unit circle, i.e. abs(z) > 1 for each root:
f(z) = 1 - a1 x z - a2 x z^2 - … - ap x z^p = 0
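This condition can be checked numerically by finding the roots of the polynomial. A sketch (`is_weakly_stationary` is an illustrative helper, not a library function):

```python
import numpy as np

def is_weakly_stationary(ar_coeffs):
    """Check abs(z) > 1 for every root z of 1 - a1*z - ... - ap*z^p = 0.

    ar_coeffs = [a1, a2, ..., ap] (illustrative helper, not a library API).
    """
    # np.roots expects coefficients from the highest power down:
    # the polynomial written out is -ap*z^p - ... - a1*z + 1
    poly = np.concatenate([-np.asarray(ar_coeffs, dtype=float)[::-1], [1.0]])
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) > 1.0))
```

For example, an AR(1) with a1 = 0.5 has the single root z = 2 and is stationary, while a1 = 1 (a random walk) puts the root on the unit circle and fails the condition.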
Define a moving average process.
- a q-period moving average process is an MA(q) process given by:
Xt = Et + B1 x Et-1 + … + Bq x Et-q
How is serial correlation in a process tested for?
- the Durbin-Watson statistic (d) is used to test for serial correlation (the correlation between the values of the process at adjacent times)
- under the null hypothesis of no serial correlation, d is close to 2 (d ranges from 0 to 4; values well below 2 indicate positive serial correlation)
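The statistic itself is simple to compute from first principles. A sketch, applied directly to a simulated series for illustration (in practice Durbin-Watson is usually applied to regression residuals):

```python
import numpy as np

rng = np.random.default_rng(3)
eps = rng.normal(0.0, 1.0, size=50_000)

beta1 = 0.6
x = eps[1:] + beta1 * eps[:-1]    # MA(1): Xt = Et + B1 x Et-1

def durbin_watson(series):
    # d = sum((e_t - e_{t-1})^2) / sum(e_t^2), roughly 2*(1 - lag-1 correlation)
    diffs = np.diff(series)
    return float(np.sum(diffs**2) / np.sum(series**2))

d_wn = durbin_watson(eps)   # white noise: d close to 2
d_ma = durbin_watson(x)     # MA(1) is serially correlated at lag 1, so d << 2
```

The MA(1) series has lag-1 autocorrelation B1/(1 + B1^2) ≈ 0.44 here, pulling d down to roughly 1.1, while the white noise series gives d near 2.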
Define an integrated auto-regressive moving average process.
- an ARMA(p, q) process results from combining an AR(p) process and an MA(q) process:
ARMA(p, q): Xt = a0 + a1 x Xt-1 + … + ap x Xt-p + Et + B1 x Et-1 + … + Bq x Et-q
- an integrated process ARIMA(p, d, q) is one whose d-th difference is a stationary ARMA(p, q) process
Discuss the fitting of ARIMA models.
- ARIMA processes can be differenced to obtain a stationary ARMA process
- for a stationary process Xt:
> autocovariance function: gamma(h) = cov(Xt, Xt+h), so that gamma(0) = var(Xt)
> autocorrelation function: rho(h) = corr(Xt, Xt+h) = gamma(h)/gamma(0)
> partial autocorrelation function = conditional correlation of Xt+h with Xt given Xt+1, …, Xt+h-1. PACFs are defined for positive lags only
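The sample autocorrelation function used in fitting can be sketched directly (`sample_acf` is an illustrative helper; statistical libraries provide equivalents, often alongside PACF plots):

```python
import numpy as np

def sample_acf(x, max_lag):
    # Sample autocorrelation rho(h) = gamma(h)/gamma(0) for h = 0..max_lag
    x = np.asarray(x, dtype=float) - np.mean(x)
    gamma0 = np.sum(x * x)
    return np.array([np.sum(x[: len(x) - h] * x[h:]) / gamma0
                     for h in range(max_lag + 1)])

rng = np.random.default_rng(5)
eps = rng.normal(size=100_000)
ar1 = np.zeros_like(eps)
for t in range(1, len(eps)):
    ar1[t] = 0.8 * ar1[t - 1] + eps[t]

acf = sample_acf(ar1, 3)   # theory for AR(1): rho(h) = a1^h -> 1, 0.8, 0.64, 0.512
```

A geometrically decaying ACF like this suggests an AR component, while an ACF that cuts off sharply after lag q suggests an MA(q) component.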