Stationary Flashcards
What is a stochastic process?
A random process that evolves over time, e.g. x(t) = θx(t-1) + ε(t) (this is an AR(1) process)
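A minimal simulation sketch of such an AR(1) process (the coefficient θ = 0.7, the sample size, and the seed are assumed values for illustration):

```python
# Sketch: simulate the AR(1) process x(t) = theta*x(t-1) + eps(t)
import numpy as np

rng = np.random.default_rng(0)
theta = 0.7                  # assumed AR coefficient, |theta| < 1 so the process is stationary
n = 500
eps = rng.normal(0, 1, n)    # Gaussian white noise

x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]   # the AR(1) recursion
```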
Is the probability distribution of ε(t) dependent or independent of time?
It is independent of time
How do we write an AR process in MA form?
By backward substitution: substitute x(t-1) = θx(t-2) + ε(t-1), then x(t-2) = θx(t-3) + ε(t-2), and so on, until we arrive at x(t) = ε(t) + θε(t-1) + θ²ε(t-2) + … = ∑θⁱε(t-i)
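A sketch checking the MA(∞) form numerically: the truncated sum ∑θⁱε(t-i) reproduces the AR(1) recursion started from zero (θ and the sample size are assumed values):

```python
# Sketch: the truncated MA(infinity) representation matches the AR(1) recursion
import numpy as np

rng = np.random.default_rng(1)
theta, n = 0.7, 200
eps = rng.normal(0, 1, n)

# AR(1) recursion started from x(0) = 0
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]

# MA form: x(t) = sum_{i=0}^{t-1} theta^i * eps(t-i) when the recursion starts at zero
t = n - 1
ma_form = sum(theta**i * eps[t - i] for i in range(t))
print(x[t], ma_form)   # the two values agree up to floating-point rounding
```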
What assumptions do we make about this white noise?
- It is Gaussian (normally distributed)
- Its expected value is 0 and its variance is constant
- The error terms are uncorrelated, i.e. E[ε(t)ε(t-k)] = 0 for k ≠ 0
What is the necessary condition for an AR(1) process to be stationary?
The autocorrelations must approach zero exponentially, which requires |θ| < 1. Equivalently, the root of the lag polynomial 1 - θL = 0 must satisfy |L| > 1 (the root lies outside the unit circle).
How do we calculate the value of Φ(1) and Φ(2)?
By factorising the lag polynomial (see the sketch below):
- Write down the equation in the lag operator (e.g. 1 - 0.1L + 1.2L² = 0)
- Find its roots L₁ and L₂
- Recover Φ₁ and Φ₂ from the factorisation (1-Φ₁L)(1-Φ₂L) = 0, i.e. Φᵢ = 1/Lᵢ
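A sketch of the factorisation step using the example numbers from the card (1 - 0.1L + 1.2L² = 0); np.roots and the inversion Φᵢ = 1/Lᵢ do the work:

```python
# Sketch: find Phi(1) and Phi(2) from the card's example lag polynomial
import numpy as np

# np.roots takes coefficients from the highest power of L downwards: 1.2L^2 - 0.1L + 1
L_roots = np.roots([1.2, -0.1, 1.0])

# Since (1 - Phi1*L)(1 - Phi2*L) = 0 has roots L_i = 1/Phi_i, invert the roots
Phi = 1.0 / L_roots
print("roots L:", L_roots)
print("Phi(1), Phi(2):", Phi)
# Stationarity would require every |L| > 1; with these example numbers the
# roots are complex and lie inside the unit circle, so the check prints False.
print("all |L| > 1?", np.all(np.abs(L_roots) > 1))
```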
Is it possible to estimate the parameters of the process given that we only observe one realisation?
Yes, assuming the process is ergodic i.e. if the sample moments of a particular realisation approach the population moments of the process as the length of the realisation becomes infinite
Is it possible to test for ergodicity?
No, so we rely on the weaker property of stationarity
What is a strictly stationary stochastic process?
A stochastic process whose properties are unaffected by a change of time origin, i.e. the joint probability distribution of x(t₁), …, x(tₙ) is the same as that of x(t₁+k), …, x(tₙ+k) for any shift k
What is a weak or covariance stationary stochastic process?
One where:
- E[x(t)] is constant across time
- V[x(t)] is constant across time
- Cov[x(t),x(t-k)] and Corr[x(t),x(t-k)] depend only on the lag k, not on time
What’s the difference between Autocorrelation and Partial-Autocorrelation?
The AC at lag k is the simple correlation between x(t) and x(t-k), computed directly from the sample moments. The PAC at lag k is the correlation between x(t) and x(t-k) after controlling for all the intermediate lags, e.g. the coefficient on x(t-k) in an OLS regression of x(t) on x(t-1), …, x(t-k).
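A sketch contrasting the two on a simulated AR(1) series, using the acf and pacf helpers from statsmodels (the coefficient and lag count are assumed values):

```python
# Sketch: sample ACF vs PACF of a simulated AR(1) series
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(2)
theta, n = 0.7, 1000
eps = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]

print(acf(x, nlags=5))    # decays roughly like theta^k
print(pacf(x, nlags=5))   # noticeably non-zero only at lag 1 for an AR(1)
```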
Are MA(1) processes stationary?
Yes, because the mean and variance do not vary over time and the covariances depend only on the lag, not on time
Why are PACs useful?
They capture the correlation between the series and its past values while controlling for the effects of the intermediate lags
What are the Yule-Walker equations?
For an AR(2) process:
ρ(1) = θ₁ + θ₂ρ(1)
ρ(2) = θ₁ρ(1) + θ₂
What is the calculation for the autocorrelations ρ(1) and ρ(2)?
ρ(1) = γ(1)/γ(0), ρ(2) = γ(2)/γ(0)
OR (for the AR(2) above, solving the Yule-Walker equations)
ρ(1) = θ₁/(1-θ₂), ρ(2) = θ₁²/(1-θ₂) + θ₂
where γ(k) is the covariance between x(t) and x(t-k) and γ(0) is the variance of x(t)
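A sketch checking that the closed-form ρ(1) and ρ(2) satisfy the Yule-Walker equations (θ₁ = 0.5 and θ₂ = 0.3 are assumed coefficients):

```python
# Sketch: the solved correlations satisfy the Yule-Walker equations for an AR(2)
theta1, theta2 = 0.5, 0.3   # assumed AR(2) coefficients

# Closed-form solutions from the card
rho1 = theta1 / (1 - theta2)
rho2 = theta1**2 / (1 - theta2) + theta2

# Plug them back into the Yule-Walker equations
print(abs(rho1 - (theta1 + theta2 * rho1)) < 1e-12)   # rho(1) = theta1 + theta2*rho(1)
print(abs(rho2 - (theta1 * rho1 + theta2)) < 1e-12)   # rho(2) = theta1*rho(1) + theta2
```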
What is the MA(1) equation?
x(t) = ε(t) + αε(t-1). It is a function of the current error term and its past value, weighted by α
What is the expected value of x(t) in an MA(1) process?
0
What is the variance (γ(0)) of the process?
σ²(1+α²)
What is the covariance (γ(1)) = E[x(t)x(t-1)] of the process?
ασ²
What is the covariance (γ(k)) = E[x(t)x(t-k)] of the process?
0, where k≥2
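A sketch verifying these MA(1) moments by simulation (α, σ, and the sample size are assumed values):

```python
# Sketch: check the MA(1) moments by simulation
import numpy as np

rng = np.random.default_rng(3)
alpha, sigma, n = 0.6, 1.0, 200_000
eps = rng.normal(0, sigma, n)
x = eps[1:] + alpha * eps[:-1]                      # x(t) = eps(t) + alpha*eps(t-1)

print(x.mean())                                     # expected value ~ 0
print(x.var(), sigma**2 * (1 + alpha**2))           # gamma(0) = sigma^2(1 + alpha^2)
print(np.mean(x[1:] * x[:-1]), alpha * sigma**2)    # gamma(1) = alpha*sigma^2
print(np.mean(x[2:] * x[:-2]))                      # gamma(2) ~ 0
```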
Is the MA process weakly dependent?
Yes, because the autocorrelation dies out to 0 quickly (for an MA(1) it is exactly zero beyond lag 1)
What is the Stationarity condition of an ARMA(p,q) process?
The roots of the AR lag polynomial θ(L) = (1-Φ₁L)(1-Φ₂L)… = 0 must lie outside the unit circle, i.e. |Φᵢ| < 1 for every factor (equivalently |Lᵢ| > 1)
What is the invertibility condition of an ARMA(p,q) process?
The roots of the MA lag polynomial α(L) = (1-h₁L)(1-h₂L)… = 0 must lie outside the unit circle, i.e. |hᵢ| < 1 for every factor (equivalently |Lᵢ| > 1)
What are the steps to the Box-Jenkins method to fit an ARIMA process to a model?
1. Identification (examine the correlogram)
2. Estimation (fit the identified ARMA model, e.g. an AR(1), to the data)
3. Diagnostic checking (examine the correlogram of the residuals again; see the sketch below)
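A minimal sketch of the three steps on a simulated AR(1) series using statsmodels (the data and the chosen order are assumptions for illustration):

```python
# Sketch: identification, estimation, and diagnostic checking on simulated data
import numpy as np
from statsmodels.tsa.stattools import acf
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
n, theta = 500, 0.7
eps = rng.normal(0, 1, n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]

print(acf(x, nlags=10))                 # 1. identification: examine the correlogram
res = ARIMA(x, order=(1, 0, 0)).fit()   # 2. estimation: fit the identified AR(1)
print(acf(res.resid, nlags=10))         # 3. diagnostics: residuals should look like white noise
```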
What is the Box-Jenkins philosophy? (i.e. how to choose which model is the best fit if a few work?)
Select the most parsimonious one - the one with the fewest parameters i.e. an ARMA(1,1) is preferred to an ARMA(3,0)
How do we identify a unit root process?
The AC will decrease linearly and very slowly
Is Bayesian Information Criterion (BIC) stricter than Akaike Information Criteria (AIC)?
Yes, because it penalises extra parameters more heavily, so it leads to lower values of p and q being chosen in the ARMA(p,q) model
How do we choose the best AIC and BIC?
When estimating a range of ARMA(p,q) models we look for the model with the lowest AIC and BIC values
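A sketch of that search over a small grid of orders (the grid bounds and the simulated AR(1) data are assumptions for illustration):

```python
# Sketch: compare AIC/BIC across a small grid of ARMA(p, q) orders and keep the lowest
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
eps = rng.normal(0, 1, 500)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + eps[t]          # simulated AR(1) data

results = {}
for p in range(3):
    for q in range(3):
        res = ARIMA(x, order=(p, 0, q)).fit()
        results[(p, q)] = (res.aic, res.bic)

print("lowest AIC:", min(results, key=lambda k: results[k][0]))
print("lowest BIC:", min(results, key=lambda k: results[k][1]))   # BIC tends to pick the smaller model
```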
How do we assess the performance of a forecasting model? (2 methods)
- Root Mean Squared Error (RMSE)
- Theil's U statistic
What is the formula for RMSE?
√((1/k)∑(x-x^)²), where the square root is over the whole expression; x is the observed value and x^ is the forecast value. This is used to compare different models.
What is the formula for Theil’s U statistic?
RMSE / (√((1/k)∑x²) + √((1/k)∑x^²)). If U = 0, the model predicts the data perfectly; if U = 1, there is a negative/opposite relation between the actual and predicted values.
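A sketch computing both statistics (the actual and forecast values are made-up numbers for illustration):

```python
# Sketch: RMSE and Theil's U for a set of actual and forecast values
import numpy as np

actual   = np.array([1.2, 0.8, 1.5, 1.1, 0.9])
forecast = np.array([1.0, 0.9, 1.4, 1.2, 1.0])

rmse = np.sqrt(np.mean((actual - forecast) ** 2))
theil_u = rmse / (np.sqrt(np.mean(actual ** 2)) + np.sqrt(np.mean(forecast ** 2)))
print(rmse, theil_u)   # U near 0 means the forecasts track the data closely
```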
How do static and dynamic forecasts differ depending on how far ahead we are forecasting?
If we are forecasting only 1 period ahead, static = dynamic. If we are forecasting two or more periods ahead:
- Static → substitute actual values into the forecasting equation → the information set is lagged only one period in the past → x is used to forecast x^
- Dynamic → substitute forecast values into the forecasting equation → the information set is fixed at date T → x is used to forecast x^(1), x^(1) is used for x^(2), …
- Static forecasts have lower forecast error variance, i.e. lower s.e. bands
- s.e. = RMSE remains constant for static but increases for dynamic
See the sketch below.
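A sketch of the two forecast types from a fitted AR(1), using the dynamic argument of statsmodels' predict (the simulated data and the forecast window are assumptions):

```python
# Sketch: static vs dynamic in-sample forecasts from a fitted AR(1)
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
eps = rng.normal(0, 1, 300)
x = np.zeros(300)
for t in range(1, 300):
    x[t] = 0.7 * x[t - 1] + eps[t]

res = ARIMA(x, order=(1, 0, 0)).fit()

static  = res.predict(start=250, end=299, dynamic=False)  # uses the actual lagged values each period
dynamic = res.predict(start=250, end=299, dynamic=True)   # uses its own earlier forecasts from t=250 on
print(static[:5])
print(dynamic[:5])   # dynamic forecasts revert towards the mean as the horizon grows
```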