L2 - Stationary Processes Flashcards
What is a stochastic process?
- Theta helps us decide whether an autoregressive process is stationary or not: for an AR(1) process xt = θxt-1 + εt, the process is stationary when |θ| < 1
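A minimal simulation sketch of this idea (the function name and parameters are illustrative): with |θ| < 1 the series settles around a constant variance, while |θ| ≥ 1 makes it explode.

```python
import numpy as np

def simulate_ar1(theta, n, sigma=1.0, seed=0):
    """Simulate x_t = theta * x_{t-1} + e_t with Gaussian errors."""
    rng = np.random.default_rng(seed)
    e = rng.normal(0.0, sigma, n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = theta * x[t - 1] + e[t]
    return x

# With |theta| < 1 the sample variance stays near the theoretical
# value sigma^2 / (1 - theta^2); for theta = 0.7 that is about 1.96.
x = simulate_ar1(theta=0.7, n=5000)
print(x.var())
```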
What are the properties of stochastic processes and how can we re-write an AR process?
What is the expected value and variance of a stochastic/AR process?
- We assume a distribution for the error term –> this gives the process its Gaussian properties
- Expected value
- As the expected value of every error term is 0, the expected value of xt is 0
- Variance
- Don't include the E(xt)² term, as E(xt) is 0
- In the second term of the variance, the covariance of any two distinct error terms is zero
- As σε² is a constant we can take it outside the summation, leaving Σθ^(2i). Because θ is between -1 and 1 the sum converges, so we can use the sum of an infinite geometric series to get a/(1-θ²), where a, the first term of the series, is 1
- Giving us the answer Var(xt) = σε²/(1-θ²)
- Why do we get θ² on the bottom –>
- The sum of an infinite geometric series is 1/(1-ratio), where the ratio is θ^(2(i+1))/θ^(2i) = θ²
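Written out, the derivation above reconstructs as follows (assuming the AR(1) form xt = θxt-1 + εt with |θ| < 1, E[εt] = 0 and Var(εt) = σε²):

```latex
x_t = \sum_{i=0}^{\infty} \theta^i \varepsilon_{t-i}
\;\Rightarrow\;
E[x_t] = \sum_{i=0}^{\infty} \theta^i \, E[\varepsilon_{t-i}] = 0,
\qquad
\operatorname{Var}(x_t) = \sum_{i=0}^{\infty} \theta^{2i} \sigma_\varepsilon^2
= \frac{\sigma_\varepsilon^2}{1 - \theta^2}.
```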
What is the Covariance of a stochastic/AR process?
- Cov(x,y) = E(XY) - E(X)E(Y)
- As E(xt) = 0 for all values of t, we just look at the expected value E(xt·xt-1)
- We are left with only the squared terms, as the expected value of any cross product of distinct error terms is equal to 0
- Dividing through by θ in the 4th line gives us the equation for the covariance between x and its most recent past value
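The covariance steps above can be reconstructed as (same AR(1) assumptions as before):

```latex
\gamma(1) = E[x_t x_{t-1}]
= E[(\theta x_{t-1} + \varepsilon_t)\, x_{t-1}]
= \theta\, E[x_{t-1}^2]
= \frac{\theta\, \sigma_\varepsilon^2}{1 - \theta^2},
\qquad
\gamma(k) = \theta^k \gamma(0).
```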
What are the special notations we need to remember about stochastic/AR processes?
Example of first-order autoregression processes?
- These are the Autocorrelation (AC) functions
- At each lag i, the autocorrelation is 0.7^i (for an AR(1) with θ = 0.7)
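A quick numeric check of the geometric decay of the ACF for θ = 0.7:

```python
# The ACF of an AR(1) with theta = 0.7 decays geometrically:
# rho(i) = 0.7**i at each lag i.
theta = 0.7
acf = [round(theta ** i, 4) for i in range(5)]
print(acf)  # [1.0, 0.7, 0.49, 0.343, 0.2401]
```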
What is a general definition of Stationarity?
- It is not possible to test for ergodicity directly
- We can't observe the process for an effectively infinite amount of time –> we have to rely on the weaker properties of stationarity
- Imagine it like the number of observations in a sample
- As with the CLT you need a large sample to determine a process's moments, but in the case of time-series data we are limited by the dimension of time
What is a Strictly Stationary Stochastic process?
DIFFERENCE BETWEEN STRICTLY AND WEAKLY STATIONARY –> this bottom part is weakly stationary??
- Think of time like an infinitely long combination lock: at each point in time there is an infinite set of possible outcomes, but only one is realised over time
- m –> moment
- Covariances are a function of the time-shift or lag k only
- So the distance between two points (or lock digits) can affect the covariance of each realisation, but the covariance is NOT affected by starting at a later time while keeping the same distance between the two points
- ACF –> the length and strength of the process's 'memory'
What is Weak or Covariance Stationary Processes?
- If the error term follows a normal distribution we can say that weak stationarity is equivalent to strict stationarity
- None of the moments depend on time!
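A rough sketch of what "moments do not depend on time" means in practice (parameters here are illustrative, not from the notes): for a stationary AR(1), the sample mean and variance computed over different time windows should agree up to sampling error.

```python
import numpy as np

# Simulate a stationary AR(1) with theta = 0.5 and compare the
# first- and second-half sample moments.
rng = np.random.default_rng(42)
theta, n = 0.5, 20000
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + e[t]

first, second = x[: n // 2], x[n // 2:]
print(first.mean(), second.mean())  # both near 0
print(first.var(), second.var())    # both near 1 / (1 - 0.25) ≈ 1.33
```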
What is a weakly dependent time series?
What is the Wold Decomposition?
- μ (mu) = the deterministic component; the ψ (psi) weights define the stochastic moving-average component
How can we represent a first-order autoregressive process AR(1)?
- (1-θL)⁻¹
- We know that 1/(1-R), where R is the common ratio, is the sum of a geometric series
- That is why we can convert it into 1 + θL + θ²L² + …
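The lag-operator inversion in this card, written out:

```latex
(1 - \theta L)\, x_t = \varepsilon_t
\;\Rightarrow\;
x_t = (1 - \theta L)^{-1} \varepsilon_t
= \left(1 + \theta L + \theta^2 L^2 + \dots\right) \varepsilon_t
= \sum_{j=0}^{\infty} \theta^j \varepsilon_{t-j}.
```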
How do you calculate the Sample autocorrelation coefficient?
- It is still the sample autocovariance (at lag k) divided by the sample variance
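A minimal sketch of this computation (the function name is illustrative): the sample autocovariance at lag k over the sample variance, both using the full-sample mean.

```python
import numpy as np

def sample_acf(x, k):
    """Sample autocorrelation at lag k: sample autocovariance at
    lag k divided by the sample variance."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    if k == 0:
        return 1.0
    cov_k = np.sum((x[k:] - xbar) * (x[:n - k] - xbar)) / n
    var = np.sum((x - xbar) ** 2) / n
    return cov_k / var

# Quick check on a simulated AR(1) with theta = 0.7: the lag-1
# sample autocorrelation should be close to 0.7.
rng = np.random.default_rng(1)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + e[t]
print(sample_acf(x, 1))
```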
What are partial autocorrelations?
- We can find the autocorrelation between xt and its k-th lag while removing the effect of all the lags in between
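One way to see this (a sketch, not the only estimator): the partial autocorrelation at lag k is the coefficient on xt-k when xt is regressed on xt-1, …, xt-k, which nets out the intermediate lags.

```python
import numpy as np

def pacf_at_lag(x, k):
    """Partial autocorrelation at lag k via regression of x_t on
    x_{t-1}, ..., x_{t-k}; return the coefficient on x_{t-k}."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    y = x[k:]
    # Columns are the lagged series: lag 1, lag 2, ..., lag k.
    X = np.column_stack([x[k - j: n - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]

# For an AR(1) the PACF is theta at lag 1 and roughly 0 beyond.
rng = np.random.default_rng(2)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + e[t]
print(pacf_at_lag(x, 1), pacf_at_lag(x, 2))
```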
Example of generating Partial Autocorrelations?
- γ (gamma) –> autocovariances
- γ(0) –> the variance
- ρ (rho) –> autocorrelations
- θ (theta) –> partial autocorrelations
What are the Yule-Walker Equations?
- MM –> Method of moments: the Yule-Walker equations estimate the AR coefficients by matching the theoretical autocorrelations to the sample autocorrelations
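A method-of-moments sketch of the Yule-Walker estimator (function name illustrative): solve R·φ = r, where R is the Toeplitz matrix of sample autocorrelations ρ(0..p-1) and r = (ρ(1), …, ρ(p)).

```python
import numpy as np

def yule_walker(x, p):
    """Yule-Walker / method-of-moments estimate of AR(p) coefficients."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Sample autocorrelations rho(0), ..., rho(p).
    rho = np.array([1.0] + [np.sum(x[k:] * x[:n - k]) / np.sum(x * x)
                            for k in range(1, p + 1)])
    # Toeplitz system R * phi = r.
    R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, rho[1:])

# On a simulated AR(1) with theta = 0.7, the estimate should be ~0.7.
rng = np.random.default_rng(3)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.7 * x[t - 1] + e[t]
phi = yule_walker(x, 1)
print(phi)
```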