ERM Chapter 17 Flashcards

1
Q

Define strict stationarity.

A
  • the joint distributions of Xr, Xr+1, …, Xs and Xr+k, Xr+1+k, …, Xs+k are identical for all integers r, s and k
2
Q

Define weak and covariance stationarity.

A
  • a process is said to be covariance stationary if the mean of the process m(t) = E(Xt) is constant and the covariance of the process cov(Xr, Xr+k) depends only on the time difference k.
3
Q

Why is stationarity important in the context of modelling financial time series, and does strict stationarity imply weak stationarity?

A
  • stationarity means that the statistical properties of the process stay the same over time. This determines the extent to which we can use past data to try to model the future
  • for finite variance processes, strict stationarity implies weak stationarity
  • for infinite variance processes, strict stationarity does not imply weak stationarity
4
Q

Define white noise.

A
  • a process that is covariance stationary and, for all integers t and k:
    > corr(Et, Et+k) = 1 for k = 0, and = 0 for k ≠ 0
    > m(t) = E(Et) = 0
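As a quick numerical illustration (a minimal numpy sketch; the seed and sample size are arbitrary), simulated white noise should have sample mean near zero and negligible autocorrelation at non-zero lags:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
eps = rng.standard_normal(n)  # iid N(0, 1) draws: strict white noise

def sample_autocorr(x, k):
    """Sample correlation between x_t and x_{t+k} (k >= 1)."""
    x = x - x.mean()
    return np.dot(x[:-k], x[k:]) / np.dot(x, x)

# at non-zero lags the sample autocorrelations should be near zero
acf = [sample_autocorr(eps, k) for k in (1, 2, 5)]
```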
5
Q

State the meaning of white noise, and why it is important in the context of modelling financial time series.

A
  • under a white noise process, every element is uncorrelated with any previous observation, and the process oscillates randomly around zero
  • the concept is important because many financial processes have a random element which cannot be known in advance (white noise)
6
Q

Define strict white noise.

A
  • a white noise process that is a set of iid random variables with finite variance
  • we often assume strict white noise is normally distributed, N(0, σ²)
7
Q

Define trend stationarity.

A
  • a process where the observations randomly oscillate around a steadily changing value that is a deterministic function of time only
8
Q

Define difference stationarity and integrated processes.

A
  • a process Xt whose first difference Xt - Xt-1 is covariance stationary, even though the underlying process itself is not; such a process is said to be integrated of order 1, written I(1)
  • an integrated process of order 2 (I(2)) is one whose second difference is covariance stationary, but whose first difference is not
9
Q

How might trend and difference stationary processes be distinguished?

A
  • the Dickey-Fuller test can help distinguish between trend- and difference-stationary processes by regressing the first difference of the observations on the preceding observation and a time trend:
    Xt - Xt-1 = a0 + a1 × t + a2 × Xt-1 + Et
  • the test statistic (the t-ratio of a2) is compared to critical values determined by Dickey and Fuller
  • if a2 is statistically different from zero (significantly negative) then the series is trend-stationary; if not, it is an integrated (difference-stationary) process
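The regression behind the test can be sketched in numpy (an illustrative simulation only: the trend slope, AR coefficient and seed are assumptions, and the Dickey-Fuller critical values themselves are tabulated, not computed here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# simulate a trend-stationary series: X_t = 0.05*t + AR(1) deviations from trend
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = 0.05 * t + 0.5 * (x[t - 1] - 0.05 * (t - 1)) + rng.standard_normal()

# Dickey-Fuller regression: X_t - X_{t-1} = a0 + a1*t + a2*X_{t-1} + E_t
dx = np.diff(x)
X = np.column_stack([np.ones(n - 1), np.arange(1, n), x[:-1]])
coef, *_ = np.linalg.lstsq(X, dx, rcond=None)

# t-ratio of a2, to be compared with Dickey-Fuller critical values
resid = dx - X @ coef
sigma2 = resid @ resid / (len(dx) - X.shape[1])
se_a2 = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[2, 2])
t_a2 = coef[2] / se_a2
```

For this trend-stationary simulation a2 comes out significantly negative, so the unit-root hypothesis would be rejected.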
10
Q

Define autoregressive processes.

A
  • a process whereby each value depends on the p previous values, plus a random error:
    AR(p): Xt = a0 + a1 × Xt-1 + … + ap × Xt-p + Et
    e.g. AR(1): Xt = a0 + a1 × Xt-1 + Et
11
Q

Describe the influence of a1 on the properties of an AR(1) process.

A
  • if |a1| < 1, the process is mean-reverting (and therefore covariance stationary) with mean a0/(1 - a1) and variance σ²/(1 - a1²), where σ² is the variance of the white noise term
  • if |a1| > 1, the process becomes unstable
  • if a1 = 1, the AR(1) process is a random walk; if also a0 ≠ 0, it is a random walk with drift. Note that a random walk (with or without drift) is difference stationary
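A minimal simulation (parameter values and seed are illustrative) confirming the long-run mean and variance of a stationary AR(1):

```python
import numpy as np

rng = np.random.default_rng(2)
a0, a1, sigma = 1.0, 0.8, 1.0   # |a1| < 1, so the process is mean-reverting
n = 100_000

x = np.empty(n)
x[0] = a0 / (1 - a1)            # start at the long-run mean
for t in range(1, n):
    x[t] = a0 + a1 * x[t - 1] + sigma * rng.standard_normal()

long_run_mean = a0 / (1 - a1)           # 5.0
long_run_var = sigma**2 / (1 - a1**2)   # about 2.78
```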
12
Q

Describe the condition for an AR(p) process to be (at least) weakly stationary.

A
  • every root z of the following characteristic polynomial must have modulus (absolute value) greater than 1, i.e. all roots must lie outside the unit circle:
    f(z) = 1 - a1 × z - a2 × z² - … - ap × z^p = 0
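The root condition can be checked numerically with numpy's polynomial root finder (the coefficient values below are illustrative):

```python
import numpy as np

def is_weakly_stationary(ar_coefs):
    """True if every root of 1 - a1*z - ... - ap*z^p lies outside the unit circle."""
    # np.roots wants coefficients ordered from the highest power of z down to z^0
    poly = [-c for c in reversed(ar_coefs)] + [1.0]
    roots = np.roots(poly)
    return bool(np.all(np.abs(roots) > 1.0))

stationary = is_weakly_stationary([0.5, 0.3])  # AR(2): both roots outside the unit circle
random_walk = is_weakly_stationary([1.0])      # a1 = 1: root at z = 1, not stationary
```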
13
Q

Define a moving average process.

A
  • a q-period moving average process is an MA(q) process given by:
    Xt = Et + β1 × Et-1 + … + βq × Et-q
  • the Durbin-Watson statistic (d) is used to test for serial correlation (the correlation between the values of the process at adjacent times) in an MA process
  • under the null hypothesis of no serial correlation, d is close to 2; values well below 2 suggest positive serial correlation, values well above 2 suggest negative serial correlation
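The statistic itself is simple to compute from a residual series (a sketch using simulated residuals; the seed and series are assumptions):

```python
import numpy as np

def durbin_watson(resid):
    """d = sum((e_t - e_{t-1})^2) / sum(e_t^2); d near 2 means no serial correlation."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

rng = np.random.default_rng(3)
e = rng.standard_normal(10_000)
d_uncorrelated = durbin_watson(e)         # near 2: no serial correlation
d_positive = durbin_watson(np.cumsum(e))  # strong positive correlation pushes d towards 0
```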
14
Q

Define an integrated auto-regressive moving average process.

A
  • an ARMA(p, q) process combines an AR(p) process and an MA(q) process:
    ARMA(p, q): Xt = (a0 + a1 × Xt-1 + … + ap × Xt-p) + (Et + β1 × Et-1 + … + βq × Et-q)
  • an integrated ARMA process, ARIMA(p, d, q), is one whose d-th difference is a stationary ARMA(p, q) process
15
Q

Discuss the fitting of ARIMA models.

A
  • ARIMA processes can be differenced to obtain a stationary ARMA process
  • for a stationary process Xt:
    > autocovariance function: γ(h) = cov(Xt, Xt+h), so that γ(0) = var(Xt)
    > autocorrelation function: ρ(h) = corr(Xt, Xt+h)
    > partial autocorrelation function: the conditional correlation of Xt+h with Xt given Xt+1, …, Xt+h-1. PACFs are defined for positive lags only
16
Q

What is a correlogram?

A
  • a plot of the sample autocorrelation function
  • separate correlograms might be constructed for a variety of degrees of differencing, to check visually for serial correlation, the degree of integration, and the form of model to fit
  • patterns in the values of the sample ACF and PACF can assist in choosing an appropriate model to fit, e.g. a sample ACF that falls below the significance level after lag 3, and can be regarded as insignificant thereafter, suggests an MA(3) process
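A sample ACF can be computed directly (a sketch; the MA(1) coefficient 0.8 and the seed are illustrative). For an MA(1) process the ACF should be significant at lag 1 and negligible after that:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation function: the values plotted in a correlogram."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.dot(x, x)
    return np.array([1.0] + [np.dot(x[:-k], x[k:]) / denom
                             for k in range(1, max_lag + 1)])

rng = np.random.default_rng(4)
eps = rng.standard_normal(20_000)
x = eps[1:] + 0.8 * eps[:-1]  # MA(1): the ACF should cut off after lag 1
acf = sample_acf(x, 5)
# acf[1] is near 0.8/(1 + 0.8^2), roughly 0.49; acf[2:] are near zero
```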
17
Q

How may error terms be used to test the fit of a particular model?

A
  • error terms can be tested to see whether there is any residual structure i.e. whether they are white noise
18
Q

How is seasonality modelled?

A
  • seasonality is modelled using dummy (indicator) variables, e.g. for quarterly data:

    Xt = a0 + a1 × d1 + a2 × d2 + a3 × d3 + a4 × t + Et

    where di = 1 if the observation falls in the corresponding quarter and 0 otherwise
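A sketch of fitting such a model by least squares on simulated quarterly data (the seasonal effects, trend slope and seed are all illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 400                  # quarterly observations
t = np.arange(n)
q = t % 4                # quarter of each observation (0..3)

# simulate: base level 1.0, effects for quarters 2-4, trend 0.02 per period, noise
true_effects = np.array([0.0, 0.5, -0.3, 0.8])
x = 1.0 + true_effects[q] + 0.02 * t + 0.1 * rng.standard_normal(n)

# design matrix: intercept, dummies d1-d3 for quarters 2-4, and a time trend
D = np.column_stack([np.ones(n), q == 1, q == 2, q == 3, t]).astype(float)
coef, *_ = np.linalg.lstsq(D, x, rcond=None)
# coef recovers approximately [1.0, 0.5, -0.3, 0.8, 0.02]
```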

19
Q

How are step changes modelled?

A
  • a jump in the value of a process can be modelled using a Poisson variable
20
Q

How is an altered rate of change modelled?

A
  • one or both of the drift or degree of mean reversion in an AR model might be made dependent on time
  • alternatively, the rate of trend in a trend-stationary model might instead depend on time
  • such time dependencies (structural breaks) can be identified via the Chow test: the data are split in two either side of a suspected break point, and the fit of separate models on each sub-period is compared with the fit of a single model over the whole period
21
Q

What is heteroskedasticity?

A

A series whose variance changes over time. Such series can be modelled using ARCH and GARCH models.

22
Q

Outline ARCH models.

A

Autoregressive conditional heteroskedasticity (ARCH) models are based on a strictly stationary white noise process Zt with zero mean and unit standard deviation. The ARCH process is constructed so that the standard deviation varies over time:
Xt = σt × Zt, where Zt is strict white noise and σt² depends on previous squared values of the process, e.g. ARCH(1): σt² = a0 + a1 × Xt-1²

  • volatility clustering occurs because the conditional variance is a function of the previous squared values of the process. Large changes are often followed by a period of high volatility
23
Q

Outline GARCH models.

A

Generalised ARCH models are similar to ARCH models, but the volatility is now allowed to depend on previous values of the volatility as well as previous values of the process:
Xt = σt × Zt, where Zt is strict white noise, e.g. GARCH(1,1): σt² = a0 + a1 × Xt-1² + b1 × σt-1²

  • with a GARCH model, periods of high volatility tend to last a long time
  • the most common technique for fitting GARCH models is the method of maximum likelihood
  • other types of models include the ARMA model with GARCH errors
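A GARCH(1,1) simulation sketch (the parameter values are illustrative, chosen so that a1 + b1 < 1, the covariance-stationarity condition, giving unconditional variance a0/(1 - a1 - b1)):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
a0, a1, b1 = 0.1, 0.1, 0.85   # a1 + b1 < 1: covariance-stationary GARCH(1,1)

x = np.empty(n)
sig2 = np.empty(n)
sig2[0] = a0 / (1 - a1 - b1)  # start at the unconditional variance
x[0] = np.sqrt(sig2[0]) * rng.standard_normal()
for t in range(1, n):
    # variance depends on the previous squared value AND the previous variance
    sig2[t] = a0 + a1 * x[t - 1] ** 2 + b1 * sig2[t - 1]
    x[t] = np.sqrt(sig2[t]) * rng.standard_normal()

uncond_var = a0 / (1 - a1 - b1)  # 2.0
```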
24
Q

How do we allow for insufficient data?

A
  • Provided the data are iid, means and covariances can be scaled by the number of time periods in the horizon required, e.g. ×12 when converting monthly figures to annual
  • Volatility scales with the square root of the number of periods, e.g. ×√12 in the above example
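For example (hypothetical monthly figures):

```python
import math

# hypothetical monthly estimates
monthly_mean = 0.005  # 0.5% mean monthly return
monthly_vol = 0.04    # 4% monthly standard deviation

# assuming iid monthly returns:
annual_mean = 12 * monthly_mean           # means scale with the number of periods
annual_vol = math.sqrt(12) * monthly_vol  # volatility scales with the square root
```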
25
Q

How do we check model fits of a GARCH model?

A
  • Check the goodness of fit by examining the residuals:
    > unstandardised residuals should look like a realisation of a pure GARCH process
    > standardised residuals should look like strict white noise
  • if there is insufficient evidence to reject the null hypothesis of strict white noise, the validity of the distribution used to construct the likelihood function can be investigated using QQ plots and goodness-of-fit tests against the normal or scaled t-distributions