L2 - Stationary Processes Flashcards
What is a stochastic process?
- A stochastic process is a sequence of random variables indexed by time, e.g. {x_t}
- For an autoregressive process, θ helps us decide whether the process is stationary or not

What are the properties of stochastic processes and how can we re-write an AR process?
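- A minimal sketch of the rewrite this card refers to, assuming the AR(1) form used throughout these notes: substituting recursively turns the AR(1) into an infinite moving average of past errors

```latex
x_t = \theta x_{t-1} + \varepsilon_t
    = \theta(\theta x_{t-2} + \varepsilon_{t-1}) + \varepsilon_t
    = \dots
    = \sum_{i=0}^{\infty} \theta^i \varepsilon_{t-i}, \qquad |\theta| < 1 .
```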

What is the expected value and variance of a stochastic/AR process?
- We assume a distribution for the error term –> this gives us the Gaussian properties
- Expected value
- Since the expected value of every error term is 0, the expected value of x_t is 0
- Variance
- Don't include the E(x)² term, as it is 0
- The second set of terms drops out –> the covariance between two different error terms is zero
- As σ_ε² is a constant we can take it outside the summation; because θ is between −1 and 1 the sum converges, so the sum of an infinite geometric series, a/(1 − r) with a = 1, gives σ_ε²/(1 − θ²)
- Why do we get θ² on the bottom? –> The infinite-series sum is 1/(1 − ratio), where the ratio is θ^{2(i+1)}/θ^{2i} = θ² (see the sketch after this list)
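- Putting the two computations together, assuming the MA(∞) rewrite above and white-noise errors with variance σ_ε²:

```latex
E(x_t) = \sum_{i=0}^{\infty} \theta^i \, E(\varepsilon_{t-i}) = 0, \qquad
V(x_t) = \sum_{i=0}^{\infty} \theta^{2i} \, \sigma_\varepsilon^2
       = \frac{\sigma_\varepsilon^2}{1-\theta^2}.
```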

What is the Covariance of a stochastic/AR process?
- Cov(X, Y) = E(XY) − E(X)E(Y)
- Since E(x_t) = 0 for every t, the covariance reduces to E(x_t x_{t−1})
- We are left with only the squared terms, because the expected value of any cross product of two different errors is 0
- Dividing through by θ in the fourth line gives us the equation for the covariance between x_t and its most recent past value
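- The resulting autocovariances, stated as a sketch for the AR(1) case with |θ| < 1 (using γ_k for the lag-k autocovariance, as later cards do):

```latex
\gamma_k = \operatorname{Cov}(x_t, x_{t-k})
         = \theta^k \, \frac{\sigma_\varepsilon^2}{1-\theta^2}
         = \theta^k \gamma_0, \qquad k = 0, 1, 2, \dots
```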

What are the special notations we need to remember about stochastic/AR processes?
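- Summarising the notation used in the rest of these cards (see the Yule-Walker cards below):

```latex
\gamma_k = \operatorname{Cov}(x_t, x_{t-k}) \;\;\text{(autocovariance)}, \qquad
\gamma_0 = V(x_t), \qquad
\rho_k = \frac{\gamma_k}{\gamma_0} \;\;\text{(autocorrelation)}.
```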

Example of first-order autoregression processes?
- These are the autocorrelation (ACF) values
- At each lag i, the autocorrelation is 0.7^i (for an AR(1) with θ = 0.7)
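- A runnable sketch of this example (hypothetical values: θ = 0.7, n = 10,000 draws), checking that the sample ACF is close to 0.7^k:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.7, 10_000

# Simulate an AR(1): x_t = theta * x_{t-1} + eps_t
eps = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = theta * x[t - 1] + eps[t]

# Sample autocorrelation at lags 0..5 vs. the theoretical 0.7**k
xc = x - x.mean()
gamma0 = xc @ xc / n
for k in range(6):
    gamma_k = xc[k:] @ xc[: n - k] / n
    print(k, round(gamma_k / gamma0, 3), round(theta**k, 3))
```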

What is a general definition of Stationarity?
- It is not possible to test for ergodicity directly
- We can't wait 1,000 (or infinitely many) years to test it –> we have to rely on the weaker properties of stationarity
- Imagine it like the number of observations in a sample
- As with the CLT, you need a large sample to determine a process's moments, but with time-series data we are limited by the dimension of time (a single realisation)

What is a Strictly Stationary Stochastic process?
DIFFERENCE BETWEEN STRICTLY AND WEAKLY STATIONARY –> strict stationarity restricts the entire joint distribution, while weak stationarity (next card) restricts only the first two moments
- Think of time like an infinitely long lock: at each point in time there is an infinitely long combination of possible outcomes, but only one is realised over time
- m –> moment
- Covariances are a function of the time-shift (lag) k only
- So the distance between two points (or lock digits) can affect the covariance of each realisation, but the covariance is NOT affected by starting at a later time with the same distance between the two points
- ACF –> the length and strength of the process's ‘memory’
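- A sketch of the formal condition, assuming the standard definition: for any dates t_1, …, t_m and any shift k, the joint distribution is unchanged:

```latex
F\!\left(x_{t_1}, x_{t_2}, \dots, x_{t_m}\right)
  = F\!\left(x_{t_1+k}, x_{t_2+k}, \dots, x_{t_m+k}\right)
  \quad \text{for all } m, \; t_1, \dots, t_m, \; k .
```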

What is Weak or Covariance Stationary Processes?
- If the error term follows a normal distribution, weak stationarity is equivalent to strict stationarity (the first two moments fully characterise a Gaussian process)
- None of the first two moments (mean, variance, autocovariances) depend on time!
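- The three conditions, as a sketch in the notation above:

```latex
E(x_t) = \mu \;\;\forall t, \qquad
V(x_t) = \gamma_0 < \infty \;\;\forall t, \qquad
\operatorname{Cov}(x_t, x_{t-k}) = \gamma_k \;\;\forall t
\;\;\text{(a function of } k \text{ only)}.
```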

What is a weakly dependent time series?
- One where the dependence between x_t and x_{t+k} dies out as k grows, i.e. the autocorrelations go to zero as the lag increases

What is the Wold Decomposition?
- μ_t is the deterministic component and the ψ-weighted sum of errors is the stochastic component
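- A sketch of the decomposition, assuming the standard statement: any covariance-stationary process splits into a deterministic part plus an MA(∞) part:

```latex
x_t = \mu_t + \sum_{i=0}^{\infty} \psi_i \, \varepsilon_{t-i},
\qquad \psi_0 = 1, \qquad \sum_{i=0}^{\infty} \psi_i^2 < \infty .
```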

How can we represent a first-order autoregressive process AR(1)?
- (1 − θL)^{−1}
- We know that 1/(1 − R), where R is the common ratio, is the sum of a geometric series
- That is why we can expand it as 1 + θL + θ²L² + … (see the sketch below)
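- The full lag-operator derivation as a sketch (L is the lag operator, L x_t = x_{t−1}; requires |θ| < 1):

```latex
(1-\theta L)\, x_t = \varepsilon_t
\;\Longrightarrow\;
x_t = (1-\theta L)^{-1} \varepsilon_t
    = \left(1 + \theta L + \theta^2 L^2 + \dots\right)\varepsilon_t
    = \sum_{i=0}^{\infty} \theta^i \varepsilon_{t-i}.
```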

How do you calculate the Sample autocorrelation coefficient?
- Still the sample covariance at lag k divided by the sample variance
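- As a formula sketch, for a sample x_1, …, x_T with mean x̄:

```latex
\hat{\rho}_k
  = \frac{\sum_{t=k+1}^{T} (x_t - \bar{x})(x_{t-k} - \bar{x})}
         {\sum_{t=1}^{T} (x_t - \bar{x})^2}.
```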

What are partial autocorrelations?
- We can find the correlation between x_t and some lag k of x_t while removing (controlling for) the effect of all the lags in between
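- Equivalently, as a sketch: the partial autocorrelation of order k is the last coefficient in a regression of x_t on its first k lags:

```latex
x_t = \theta_{k1} x_{t-1} + \theta_{k2} x_{t-2} + \dots + \theta_{kk} x_{t-k} + e_t,
\qquad \text{PAC}(k) = \theta_{kk}.
```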

Example of generating Partial Autocorrelations?
- γ –> autocovariances
- γ(0) –> variance
- ρ –> autocorrelations
- θ_kk –> partial autocorrelation of order k

What are the Yule-Walker Equations?
- MM –> method of moments (the Yule-Walker equations are a method-of-moments estimator)
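- A sketch of the equations for an AR(p) with coefficients θ_1, …, θ_p (multiply the process by x_{t−j}, take expectations, divide by γ_0):

```latex
\rho_j = \theta_1 \rho_{j-1} + \theta_2 \rho_{j-2} + \dots + \theta_p \rho_{j-p},
\qquad j = 1, 2, \dots, p, \qquad \rho_0 = 1 .
```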

Examples of ACF and PACF?
- If the sample autocorrelations fall inside the confidence bands, we cannot reject the null hypothesis that the autocorrelations are equal to 0
- AR(2)
- Exponential decay in the ACF
- PACF: 2 significant partial autocorrelations
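- A runnable sketch of this pattern using statsmodels (hypothetical coefficients 0.5 and 0.3; ArmaProcess takes lag-polynomial coefficients, so the AR signs are flipped):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

# AR(2): x_t = 0.5 x_{t-1} + 0.3 x_{t-2} + eps_t
ar = np.array([1, -0.5, -0.3])  # lag polynomial 1 - 0.5L - 0.3L^2
ma = np.array([1])
x = ArmaProcess(ar, ma).generate_sample(nsample=5000)

print(acf(x, nlags=10))   # decays gradually toward zero
print(pacf(x, nlags=10))  # roughly zero after lag 2
```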

Example of estimating PAC from Yule-Walker equations
- The final numbers differ from the raw autocorrelations because the PAC controls for the effect of the intermediate lags

What is the Q-statistic for Partial Autocorrelations?
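- No notes were attached to this card; as a sketch, the Ljung-Box Q-statistic (usually stated for the autocorrelations ρ̂_k, and applied analogously here) tests whether the first m correlations are jointly zero:

```latex
Q = T(T+2) \sum_{k=1}^{m} \frac{\hat{\rho}_k^2}{T-k} \;\sim\; \chi^2(m)
\quad \text{under } H_0: \rho_1 = \dots = \rho_m = 0 .
```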

What does a MA(1) process look like and what are its respective moments?
- V(X) –> we end up with the sum of the variances, since the covariance between any two white-noise terms is 0
- The MA series is stationary because E(X), V(X), and the covariances don't depend on time
- Weakly dependent? –> The autocorrelation of order 1 is different from zero, but the autocorrelations are exactly 0 after the first lag, so yes it is!
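- The moments as a sketch, for x_t = ε_t + θ ε_{t−1} with white-noise variance σ_ε² (using θ for the MA coefficient, matching the notation of these notes):

```latex
E(x_t) = 0, \qquad
\gamma_0 = (1+\theta^2)\,\sigma_\varepsilon^2, \qquad
\gamma_1 = \theta\,\sigma_\varepsilon^2, \qquad
\gamma_k = 0 \;\;(k \ge 2), \qquad
\rho_1 = \frac{\theta}{1+\theta^2}.
```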

What do the Yule-Walker equations look like for MA(1) processes?
- First-order MA processes have partial autocorrelations that do not cut off as fast as those of AR(1) processes, but their AUTOCORRELATIONS do cut off
- The lag at which the ACF cuts off identifies the order of the MA process

Example of MA Process?
- ACF cuts off after 1 lag
- +ve coefficient on the lagged error –> oscillating PACF
- −ve coefficient on the lagged error –> exponential decline in the PACF

How do you move from a MA(1) to an infinite autoregressive process?
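- A sketch of the inversion, assuming |θ| < 1 (invertibility), using the same geometric-series trick as for the AR(1):

```latex
x_t = (1+\theta L)\,\varepsilon_t
\;\Longrightarrow\;
\varepsilon_t = (1+\theta L)^{-1} x_t = \sum_{i=0}^{\infty} (-\theta)^i x_{t-i}
\;\Longrightarrow\;
x_t = \theta x_{t-1} - \theta^2 x_{t-2} + \theta^3 x_{t-3} - \dots + \varepsilon_t .
```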

What is the general formula and characteristics of an AR(p)?
- θ being between −1 and 1 is no longer the stationarity condition –> we need to look at the roots of the characteristic equation
- Depending on which form of the equation is used: either the inverse roots are all less than 1 in absolute value, or equivalently the solutions to the characteristic equation are all greater than 1 in absolute value
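- As a sketch, writing the AR(p) with its lag polynomial and characteristic equation:

```latex
x_t = \theta_1 x_{t-1} + \dots + \theta_p x_{t-p} + \varepsilon_t,
\qquad
1 - \theta_1 z - \theta_2 z^2 - \dots - \theta_p z^p = 0 .
```

- Stationarity requires every solution z of the characteristic equation to satisfy |z| > 1 (equivalently, all inverse roots lie inside the unit circle)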
