17. Time series analysis Flashcards
Define stationarity
- A process is stationary if its statistical properties do not change over time; stationarity determines the extent to which past data can be used to model the future
List the 4 types of stationarity
- Strict stationarity
- Weak stationarity
- Covariance stationarity
- Trend stationarity
Outline the 4 types of stationarity
- Strict stationarity - Characteristics do not change over time
- Weak stationarity - A process is weakly stationary of order n if the joint moments of any subset of the process are finite and invariant under time shifts up to the nth moment
- Covariance stationarity - Mean and variance constant and covariance depends only on lag (weak stationarity of order 2)
- Trend stationarity - Observations oscillate randomly around a trend line a + bt that is a function of time only
What is the Dickey-Fuller test
- Used to distinguish between trend and difference stationarity
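The idea can be sketched in numpy. This is a deliberately simplified version (no constant or trend term, illustrative parameters, and it ignores the non-standard Dickey-Fuller critical values), not the full test as implemented in statistical packages:

```python
import numpy as np

def df_stat(y):
    """Simplified Dickey-Fuller statistic: the t-ratio of gamma in the
    regression  dy_t = gamma * y_{t-1} + e_t  (no constant, no trend).
    A value well below roughly -1.95 (the ~5% critical value for this
    form) suggests rejecting the unit-root hypothesis."""
    dy, ylag = np.diff(y), y[:-1]
    gamma = np.dot(ylag, dy) / np.dot(ylag, ylag)  # OLS slope estimate
    resid = dy - gamma * ylag
    se = np.sqrt(resid.var(ddof=1) / np.dot(ylag, ylag))
    return gamma / se

rng = np.random.default_rng(0)
e = rng.standard_normal(500)
random_walk = np.cumsum(e)            # unit root: difference stationary
ar1 = np.zeros(500)
for t in range(1, 500):
    ar1[t] = 0.5 * ar1[t - 1] + e[t]  # stationary AR(1)

print(df_stat(random_walk), df_stat(ar1))
```

The stationary series produces a large negative statistic, while the random walk's statistic stays near zero, which is how the test separates the two cases.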
Describe an AR(p) process
- AR(p) is a process where each observation is a linear combination of the p previous values plus a random error term
- In the ACF plot, you would see a gradual decrease in autocorrelation as the lag increases, indicating a long memory in the time series. (tails off)
- In the PACF plot, you would see significant values only at lags up to the order of the autoregressive model (p), beyond which it drops to zero or becomes insignificant. (cuts off)
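The "tails off" behaviour of the ACF can be checked with a small numpy simulation (illustrative AR(1) with coefficient 0.8; for AR(1) the theoretical ACF at lag k is 0.8**k, a geometric decay):

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(1)
n, phi = 2000, 0.8
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()  # AR(1) recursion

# ACF decays gradually ("tails off") rather than cutting to zero
print([round(acf(y, k), 2) for k in (1, 2, 5)])
```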
Describe an MA(q) process
* MA(q) is a process where each observation is a linear combination of the q previous error terms plus a current random error term
* In the ACF plot, you would see significant values only at lags up to the order of the moving average model (q), beyond which it drops to zero or becomes insignificant. (cuts off)
* In the PACF plot, you would see a gradual decrease in partial autocorrelation as the lag increases, indicating a long memory in the residuals. (tails off)
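The "cuts off" behaviour of the ACF can be seen in a quick simulation (illustrative MA(1) with coefficient 0.8, where the theoretical ACF is theta/(1+theta^2) ≈ 0.49 at lag 1 and zero beyond):

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(2)
n, theta = 2000, 0.8
e = rng.standard_normal(n + 1)
y = e[1:] + theta * e[:-1]  # MA(1): current shock plus one lagged shock

# Clearly non-zero at lag 1, negligible at higher lags ("cuts off")
print(round(acf(y, 1), 2), round(acf(y, 5), 2))
```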
Describe an ARMA(p,q) process
- ARMA(p,q) is a process where each observation is the sum of an AR(p) and an MA(q) component
- In both the ACF and PACF plots, you would see a gradual decrease in autocorrelation and partial autocorrelation as the lag increases, indicating a long memory in the time series and residuals. (tails off)
- There may be kinks or changes in behaviour at lags corresponding to the order of the autoregressive (p) and moving average (q) components of the model.
Describe an ARIMA(p,d,q) process
- ARIMA(p,d,q) is a process whose dth difference is an ARMA(p,q) process
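The simplest illustration of the definition: a random walk is ARIMA(0,1,0), and taking one difference (d = 1) recovers a stationary white-noise series. A minimal numpy sketch:

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(3)
e = rng.standard_normal(2000)
y = np.cumsum(e)   # random walk = ARIMA(0,1,0): not stationary
dy = np.diff(y)    # first difference recovers the white noise shocks

# The level series is highly autocorrelated; its differences are not
print(round(acf(y, 1), 2), round(acf(dy, 1), 2))
```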
Explain how ARIMA models can be fitted
Correlogram – plot of ACF
Can plot PACF
Can observe behaviours (cutting off or decaying) of the plots for various degrees of integration, d = 0, 1, 2, …, giving an indication of the type of ARIMA(p,d,q) model to fit
Can compare fit:
AIC
BIC
Likelihood ratio tests
Can test white noise features:
Use turning point / portmanteau tests on the calculated residuals ε̂_t to check whether they exhibit features of white noise
Serial correlation of the residuals can be tested using the Durbin-Watson statistic
Can use model to predict after fitting
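The AIC comparison step can be sketched with numpy alone, restricted to pure AR(p) candidates fitted by OLS (an illustrative simplification: a zero-mean series is assumed, so no intercept is fitted, and full ARIMA estimation would use maximum likelihood):

```python
import numpy as np

def ar_aic(y, p):
    """Fit AR(p) by OLS and return AIC = n*log(RSS/n) + 2*(p+1)."""
    Y = y[p:]
    # Lag matrix: column k holds y shifted back by k periods
    X = np.column_stack([y[p - k: len(y) - k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    n = len(Y)
    return n * np.log(resid @ resid / n) + 2 * (p + 1)

rng = np.random.default_rng(4)
n = 1000
y = np.zeros(n)
for t in range(2, n):  # simulate an AR(2) with known coefficients
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.standard_normal()

aics = {p: ar_aic(y, p) for p in (1, 2, 3)}
best_p = min(aics, key=aics.get)  # lowest AIC wins
print(aics, best_p)
```

The AR(1) fit is clearly penalised relative to AR(2), which is how the criterion points towards the generating order (AIC can mildly overfit, so it occasionally prefers p = 3).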
Describe an ARCH process
(Autoregressive conditional heteroscedastic):
* Constructed so variance changes over time
* Volatility clustering: a large change in previous values of the process is often followed by a period of high volatility
Describe a GARCH process
(Generalised autoregressive conditional heteroscedastic):
* Constructed so that volatility depends on previous volatility and previous values of the process.
* Periods of high volatility usually last a long time
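Volatility clustering can be demonstrated with a small numpy simulation of a GARCH(1,1) process (the parameter values below are purely illustrative): the returns themselves are nearly uncorrelated, but their squares are not, because high-variance periods persist.

```python
import numpy as np

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(5)
n = 5000
omega, alpha, beta = 0.05, 0.15, 0.80  # illustrative GARCH(1,1) parameters
r = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - beta))  # start at unconditional var
for t in range(1, n):
    # Today's variance depends on yesterday's shock and yesterday's variance
    sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    r[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Returns ~ uncorrelated; squared returns clearly autocorrelated
print(round(acf(r, 1), 2), round(acf(r ** 2, 1), 2))
```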