L2 - Time series Flashcards
What is a stochastic process and its associated notation?
A stochastic process is a sequence of random variables
indexed by time.
We will use the notation {yₜ : t = 1, 2, . . .}, or {yₜ}, to denote this sequence.
How do we represent the mean (µ)?
E[yₜ]
How do we represent the variance (γ₀)?
E[(yₜ - µ)²]
How do we represent the autocovariance (γⱼ)?
E[(yₜ - µ)(yₜ₋ⱼ - µ)]
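The three population moments above have direct sample analogues. A minimal sketch in NumPy, using a hypothetical i.i.d. normal series just to illustrate the estimators:

```python
import numpy as np

# A toy weakly stationary series: i.i.d. standard normal draws
# (hypothetical data, just to illustrate the sample analogues).
rng = np.random.default_rng(0)
y = rng.normal(size=10_000)

mu_hat = y.mean()                        # sample analogue of mu = E[y_t]
gamma0_hat = ((y - mu_hat) ** 2).mean()  # sample analogue of gamma_0 = Var(y_t)

def autocov(y, j):
    """Sample autocovariance gamma_j = E[(y_t - mu)(y_{t-j} - mu)]."""
    mu = y.mean()
    if j == 0:
        return ((y - mu) ** 2).mean()
    return ((y[j:] - mu) * (y[:-j] - mu)).mean()
```

For this i.i.d. series, mu_hat and autocov(y, 1) should be near 0 and gamma0_hat near 1.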
In order to ‘operationalise’ the assumption that the future behaves like the past, what do we need to do?
We impose restrictions on autocovariances (Stationarity).
What is Stationarity?
A stationary time series process is a process whose probability distributions are stable over time
What does it mean to be weakly stationary?
mean, variance and covariances are stable:
E[yₜ] = µ < ∞
Var(yₜ) = γ₀ < ∞
Cov(yₜ, yₜ₋ⱼ) = γⱼ
Why do we need to impose stability in a stationary process?
If the relationship between y and x keeps changing randomly over time, we won’t be able to understand how a change in x affects y
What is the equation of standardised autocovariances (ACF/correlogram)
ρⱼ = Cov(yₜ, yₜ₋ⱼ) / Var(yₜ) = γⱼ / γ₀
−1 ≤ ρⱼ ≤ 1
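The sample ACF is just the sample autocovariance scaled by the sample variance. A short sketch (the white-noise data is hypothetical, used only to show the bounds):

```python
import numpy as np

def acf(y, j):
    """Sample autocorrelation rho_j = gamma_j / gamma_0."""
    mu = y.mean()
    gamma0 = ((y - mu) ** 2).mean()
    if j == 0:
        return 1.0
    gammaj = ((y[j:] - mu) * (y[:-j] - mu)).mean()
    return gammaj / gamma0

rng = np.random.default_rng(1)
white = rng.normal(size=5_000)  # serially uncorrelated series
rho1 = acf(white, 1)            # should be near 0 for white noise
```

By construction acf(y, 0) = 1 and every ρⱼ lies in [−1, 1].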
What is a ‘White Noise’ process?
A type of time series that is not predictable. It must satisfy:
E[εₜ] = 0 for all t
E[εₜεₛ] = σ² for t = s
E[εₜεₛ] = 0 for t ≠ s
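These three conditions can be checked on a simulated draw. A sketch with an assumed σ = 2 (any positive value works):

```python
import numpy as np

# Simulated white noise with sigma = 2 (an assumed value, for illustration).
rng = np.random.default_rng(1)
sigma = 2.0
eps = rng.normal(0.0, sigma, size=50_000)

mean_hat = eps.mean()                    # E[eps_t]            -> approx 0
var_hat = (eps ** 2).mean()              # E[eps_t^2]          -> approx sigma^2 = 4
cross_hat = (eps[1:] * eps[:-1]).mean()  # E[eps_t eps_{t-1}]  -> approx 0
```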
What does it mean to be ‘serially uncorrelated’?
A stochastic process with zero correlation across time periods
One example of a stationary process is the ‘AR’ (autoregressive) model.
Model an AR(1) process.
yₜ = θyₜ₋₁ + εₜ
where εₜ is white noise
Model an AR(3) process
yₜ = θ₁yₜ₋₁ + θ₂yₜ₋₂ + θ₃yₜ₋₃ + εₜ
where εₜ is white noise
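An AR process is easy to simulate recursively. A sketch for an AR(1) with an assumed θ = 0.5 (stationarity requires |θ| < 1):

```python
import numpy as np

# AR(1) with theta = 0.5, an assumed value satisfying |theta| < 1.
rng = np.random.default_rng(2)
T, theta = 20_000, 0.5
eps = rng.normal(size=T)  # white noise innovations

y = np.zeros(T)
for t in range(1, T):
    y[t] = theta * y[t - 1] + eps[t]  # y_t = theta * y_{t-1} + eps_t

# Lag-1 sample autocorrelation should be close to theta
rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]
```

An AR(3) works the same way, with the recursion reaching back three lags.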
One example of a stationary process is the ‘MA’ (moving average) model.
Model an MA(1) process.
yₜ = εₜ + αεₜ₋₁
where εₜ is white noise
Model an MA(3) process.
yₜ = εₜ + α₁εₜ₋₁ + α₂εₜ₋₂ + α₃εₜ₋₃
where εₜ is white noise
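An MA process needs no recursion: it is a direct weighted sum of innovations. A sketch for an MA(1) with an assumed α = 0.8; for an MA(1), ρ₁ = α/(1 + α²) and ρⱼ = 0 for j > 1:

```python
import numpy as np

# MA(1) with alpha = 0.8 (an assumed coefficient, for illustration).
rng = np.random.default_rng(3)
T, alpha = 20_000, 0.8
eps = rng.normal(size=T + 1)  # white noise innovations

y = eps[1:] + alpha * eps[:-1]  # y_t = eps_t + alpha * eps_{t-1}

# rho_1 should be near alpha / (1 + alpha^2); rho_2 should be near 0
rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]
rho2 = np.corrcoef(y[2:], y[:-2])[0, 1]
```

The sharp cutoff of the ACF after lag q is the signature of an MA(q) process.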
We can combine the two stationary processes.
Model an ARMA(1,1)
yₜ = θ₁yₜ₋₁ + εₜ + αεₜ₋₁
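Combining the two simulations gives an ARMA(1,1). A sketch with assumed θ = 0.5 and α = 0.3; its lag-1 autocorrelation is ρ₁ = (1 + θα)(θ + α) / (1 + α² + 2θα):

```python
import numpy as np

# ARMA(1,1) with assumed theta = 0.5 and alpha = 0.3.
rng = np.random.default_rng(4)
T, theta, alpha = 20_000, 0.5, 0.3
eps = rng.normal(size=T)  # white noise innovations

y = np.zeros(T)
for t in range(1, T):
    # y_t = theta * y_{t-1} + eps_t + alpha * eps_{t-1}
    y[t] = theta * y[t - 1] + eps[t] + alpha * eps[t - 1]

rho1 = np.corrcoef(y[1:], y[:-1])[0, 1]
# theoretical rho_1 = (1 + 0.15) * 0.8 / (1 + 0.09 + 0.3) ~ 0.66
```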
What is the correlogram formula for an AR(1) process?
γⱼ = Cov(yₜ, yₜ₋ⱼ) = θʲ σ² / (1 − θ²)
so ρⱼ = γⱼ / γ₀ = θʲ
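For an AR(1), the autocorrelations decay geometrically: ρⱼ = θʲ. A sketch checking this on a simulated series with an assumed θ = 0.6:

```python
import numpy as np

# AR(1) with theta = 0.6 (an assumed value); for an AR(1), rho_j = theta^j.
rng = np.random.default_rng(5)
T, theta = 50_000, 0.6
eps = rng.normal(size=T)

y = np.zeros(T)
for t in range(1, T):
    y[t] = theta * y[t - 1] + eps[t]
y = y[1_000:]  # drop a burn-in so the series starts near its stationary distribution

# Sample autocorrelations at lags 1, 2, 3 should be near theta, theta^2, theta^3
rhos = [np.corrcoef(y[j:], y[:-j])[0, 1] for j in (1, 2, 3)]
```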
Under what assumptions is OLS valid in time series?
A1 - The stochastic process (DGP) follows yₜ = β₀ + β₁x₁ₜ + . . . + βₖxₖₜ + uₜ
A2 - No Perfect Collinearity
A3 - Zero Conditional Mean such that E[uₜ| X] = 0
A4 - Homoscedasticity Var(uₜ|X) = σ² , t = 1, . . . , T
Under what 3 assumptions is β̂ an unbiased estimator?
A1 - The stochastic process (DGP) follows yₜ = β₀ + β₁x₁ₜ + . . . + βₖxₖₜ + uₜ
A2 - No Perfect Collinearity
A3 - Zero Conditional Mean such that E[uₜ| X] = 0
What is strong exogeneity?
Every regressor is uncorrelated with the error term in every time period.
Under which additional assumption is β̂ the best linear unbiased estimator (BLUE) of β?
Homoskedasticity
Show that a model with a lagged dependent variable cannot satisfy strong exogeneity
In yₜ = θyₜ₋₁ + uₜ, the regressor for period t + 1 is yₜ, and
E[yₜuₜ] = E[uₜ²] = σ² ≠ 0
So the regressor is correlated with an error term from another period, violating strong exogeneity.
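This failure is visible in simulation: since uₜ enters yₜ directly, the sample moment (1/T)Σ yₜuₜ settles at σ², not 0. A sketch with assumed θ = 0.5 and σ = 1:

```python
import numpy as np

# Why strong exogeneity fails with a lagged dependent variable:
# in y_t = theta * y_{t-1} + u_t, the regressor for period t+1 is y_t,
# and y_t contains u_t, so E[y_t u_t] = Var(u_t) = sigma^2 != 0.
rng = np.random.default_rng(6)
T, theta, sigma = 100_000, 0.5, 1.0
u = rng.normal(0.0, sigma, size=T)

y = np.zeros(T)
for t in range(1, T):
    y[t] = theta * y[t - 1] + u[t]

moment = (y[1:] * u[1:]).mean()  # sample E[y_t u_t], near sigma^2 = 1, not 0
```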
What is weak dependence?
A weakly stationary time series is weakly dependent if
the correlation between yₜ and yₜ₊ⱼ goes to zero ‘sufficiently quickly’ as j goes to infinity
Why do we care about weak dependence?
The role of the weak dependence assumption is to allow us to invoke the Central Limit Theorem and the Law of Large Numbers in the presence of time series data.
It essentially lets us treat our data as if it were i.i.d.
What is the ‘contemporaneous exogeneity assumption’?
The correlation between the independent variables and the error term in the same period is zero.
Asymptotically, what assumptions are required to make the OLS estimator consistent?
Weak dependence, No Perfect Collinearity, contemporaneous exogeneity
Asymptotically, what assumptions are required to make the OLS estimator asymptotically normally distributed?
Weak dependence, No Perfect Collinearity, contemporaneous exogeneity and Homoskedasticity