2. Time Series Econometrics Flashcards
What is a time series process?
A set of temporally ordered observations on a variable y taken at equally spaced discrete intervals in time
What has to be true for a stochastic process, y, to be covariance stationary?
Each yt has the same mean and variance, and the covariance between yt and yt-s depends only on the separation s, not on t
What does strict stationarity require?
The joint PDF of yt-s, yt-s+1, …, yt is identical to that of yt-s+k, yt-s+k+1, …, yt+k for any k
What is a disadvantage of the autocovariance function?
It isn’t scale invariant
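A minimal numpy sketch of why this matters: rescaling the series (e.g. changing units) rescales the autocovariance, while the autocorrelation, its normalised version, is unchanged. The function names here are illustrative, not from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=1000)

def autocov(x, s):
    # sample autocovariance at lag s
    xb = x.mean()
    return np.mean((x[s:] - xb) * (x[:len(x) - s] - xb))

def autocorr(x, s):
    # autocorrelation = autocovariance normalised by the variance
    return autocov(x, s) / autocov(x, 0)

# multiplying y by 100 scales the autocovariance by 100^2 = 10000 ...
print(autocov(100 * y, 1) / autocov(y, 1))
# ... but leaves the autocorrelation unchanged (scale invariant)
print(np.isclose(autocorr(100 * y, 1), autocorr(y, 1)))
```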
What is the Autoregressive Moving Average (ARMA) process?
A process that is a linear function of lags of itself and of εt and its lags
What are some properties of εt?
- E(εt) = 0
- V(εt) = σ²
- C(εt, εt-s) = 0 for all s ≠ 0
Hence εt is itself a covariance stationary process, called white noise
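These three properties can be checked on a simulated draw; a sketch assuming Gaussian white noise with σ = 2:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 2.0
eps = rng.normal(0, sigma, size=100_000)  # Gaussian white noise

print(eps.mean())   # close to 0
print(eps.var())    # close to sigma^2 = 4
# sample autocorrelation at lag 1, close to 0
r1 = np.corrcoef(eps[1:], eps[:-1])[0, 1]
print(r1)
```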
What do the mean, variance and autocovariance of the MA(q) process depend on?
They don’t depend on t, so the MA(q) process is stationary for any values of the θ parameters
When is the AR(p) process stationary?
For an AR(1), when |φ| < 1, since the variances and autocovariances then don’t depend on t; more generally, when all roots of the AR polynomial lie outside the unit circle
What happens to the non-zero autocorrelations of the AR(1) process as s increases?
They decay geometrically to zero, since ρs = φ^s and |φ| < 1
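The geometric decay is easy to see numerically; a sketch with φ = 0.8 assumed for illustration:

```python
import numpy as np

phi = 0.8
# theoretical AR(1) autocorrelations: rho_s = phi**s
s = np.arange(10)
rho = phi ** s
print(rho[:4])  # 1, 0.8, 0.64, 0.512 — decaying towards zero
```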
What does the lag operator do?
Lxt = xt-1
It shifts the series back one period
How can we use the lag operator?
As a simple way of moving between MA and AR representations of the ARMA process
Which ARMA(p,q) processes can be written as MA(infinity) or AR(infinity)?
Any stationary and invertible ARMA(p,q)
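The MA(∞) weights can be computed by the standard recursion ψj = θj + Σi φi ψj−i (with θj = 0 for j > q); a sketch, with the ARMA(1,1) parameter values chosen only for illustration:

```python
import numpy as np

def ma_inf_weights(phi, theta, n):
    # psi_j weights of the MA(infinity) representation of a stationary
    # ARMA(p,q), via the recursion psi_j = theta_j + sum_i phi_i * psi_{j-i}
    p, q = len(phi), len(theta)
    psi = np.zeros(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = theta[j - 1] if j <= q else 0.0
        for i in range(1, min(j, p) + 1):
            psi[j] += phi[i - 1] * psi[j - i]
    return psi

# ARMA(1,1) with phi = 0.5, theta = 0.3: psi_1 = 0.8, psi_2 = 0.4, ...
print(ma_inf_weights([0.5], [0.3], 5))
```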
What does white noise refer to?
A process whose autocorrelations are zero at all lags
If an ARMA(2,1) process has identical autocorrelations to the AR(1) process, what can we say about the ARMA(2,1) process?
It is overparameterised
Equation for random walk
yt = yt-1 + εt
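A quick numpy sketch of simulating this process (taking y0 = 0, an assumption for the example): cumulatively summing the white noise shocks gives the random walk.

```python
import numpy as np

rng = np.random.default_rng(2)
eps = rng.normal(size=500)   # white noise shocks
y = np.cumsum(eps)           # y_t = y_{t-1} + eps_t, with y_0 = 0
# var(y_t) = t * sigma^2 grows with t, so the random walk is non-stationary
```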
What is f(yt | Yt-1)?
The density of yt given that we know everything up to time t-1
What is lnf(Yt) when y1 isn’t fixed?
The prediction error decomposition form of the log likelihood function
What is lnf(Yt) when we fix y1?
The conditional log likelihood function
What is the CSS?
The conditional sum of squares. It squares then sums the deviations of each yt from its conditional mean, which is φyt-1 in the AR(1) case
How can we make finding the MLE easier?
We can treat y1 as fixed and maximise the simpler conditional log likelihood function instead
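For an AR(1), maximising the conditional log likelihood over φ amounts to minimising the CSS, which is just OLS of yt on yt-1. A sketch, with the true φ = 0.7 and sample size chosen only for illustration:

```python
import numpy as np

def css(phi, y):
    # conditional sum of squares for an AR(1), treating y_1 as fixed
    resid = y[1:] - phi * y[:-1]
    return np.sum(resid ** 2)

def conditional_mle_phi(y):
    # minimising the CSS over phi is OLS of y_t on y_{t-1}
    return np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

# simulate an AR(1) with phi = 0.7 and recover phi from the data
rng = np.random.default_rng(3)
phi_true = 0.7
y = np.zeros(5000)
for t in range(1, len(y)):
    y[t] = phi_true * y[t - 1] + rng.normal()

print(conditional_mle_phi(y))  # close to 0.7
```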