5. ARMA Time Series Flashcards

1
Q

What are the conditions for weak stationarity in a time series?

A

E(Xt) = µ (constant mean), Var(Xt) = γ(0) (constant, finite variance), Cov(Xt, Xt+h) = γ(h) (depends only on the lag h, not on the time t).

2
Q

What is the formula for autocorrelation in a time series?

A

ρ(h) = γ(h) / γ(0), where γ(h) is the autocovariance at lag h and γ(0) is the variance.
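
As a quick illustration, a minimal numpy sketch of the sample version of this formula (the helper name sample_autocorrelation and the white-noise series are purely illustrative):

```python
import numpy as np

def sample_autocorrelation(x, h):
    """Sample rho(h) = gamma(h) / gamma(0) for a 1-D series x."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    gamma_h = np.sum(xc[:n - h] * xc[h:]) / n   # sample autocovariance at lag h
    gamma_0 = np.sum(xc ** 2) / n               # sample variance, gamma(0)
    return gamma_h / gamma_0

rng = np.random.default_rng(0)
x = rng.standard_normal(500)                    # white noise: rho(h) ~ 0 for h > 0
print(sample_autocorrelation(x, 1))
```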

3
Q

Write the AR(1) model equation and its stationarity condition.

A

AR(1): Yt = ϕ * Yt-1 + εt; Stationarity condition: |ϕ| < 1.
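
A small simulation sketch (numpy assumed; ϕ = 0.7 is an arbitrary value satisfying |ϕ| < 1):

```python
import numpy as np

rng = np.random.default_rng(42)
phi, n = 0.7, 1000
eps = rng.standard_normal(n)            # white noise innovations
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]      # Y_t = phi * Y_{t-1} + eps_t

# With |phi| < 1, Var(Y_t) settles near sigma^2 / (1 - phi^2) ~= 1.96
print(np.var(y))
```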

4
Q

What is the difference operator, and what does it do?

A

The difference operator is ∇Xt = Xt - Xt-1; it can transform a non-stationary series into a stationary one by removing trends (one difference removes a linear trend).
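
For example (numpy assumed, data illustrative), one difference of a series with a linear trend leaves a stationary remainder:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200)
x = 0.5 * t + rng.standard_normal(200)   # linear trend + noise: mean changes with t
dx = np.diff(x)                          # nabla X_t = X_t - X_{t-1}
print(dx.mean(), dx.std())               # fluctuates around the slope 0.5, trend removed
```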

5
Q

State the formula for the variance of an MA(1) model.

A

Var(Yt) = σ²ε * (1 + θ²), where θ is the MA(1) parameter and σ²ε is the variance of the noise term.
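
A quick simulation check of this formula (numpy assumed; θ = 0.5 and unit noise variance are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n = 0.5, 100_000
eps = rng.standard_normal(n + 1)        # sigma_eps^2 = 1
y = eps[1:] - theta * eps[:-1]          # Y_t = eps_t - theta * eps_{t-1}
print(np.var(y), 1 + theta ** 2)        # sample variance ~= 1 * (1 + 0.25) = 1.25
```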

6
Q

How do you interpret the autocorrelation function (ACF) of an AR(1) process?

A

The ACF of an AR(1) process decays exponentially as the lag increases: ρ(h) = ϕ^h for h ≥ 0, where ϕ is the autoregressive coefficient.
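
A sketch comparing the sample ACF of a simulated AR(1) path with ϕ^h (assumes numpy and statsmodels; ϕ = 0.8 is illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
phi, n = 0.8, 5000
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.standard_normal()

print(np.round(acf(y, nlags=5), 3))         # sample ACF at lags 0..5
print(np.round(phi ** np.arange(6), 3))     # theoretical rho(h) = phi^h
```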

7
Q

What is the purpose of the Dickey-Fuller test in time series analysis?

A

The Dickey-Fuller test is used to test for stationarity by checking if a time series has a unit root (indicating non-stationarity).
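
A minimal example using statsmodels' augmented Dickey-Fuller implementation (the random-walk input is illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
rw = np.cumsum(rng.standard_normal(500))   # random walk: contains a unit root
stat, pvalue = adfuller(rw)[:2]
print(stat, pvalue)
# Large p-value: the unit-root null is not rejected, i.e. the series looks non-stationary.
```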

8
Q

Explain the stationarity condition for a general AR(p) model.

A

The stationarity condition for an AR(p) model requires the roots of the characteristic equation Φ(z) = 1 - Σ(ϕj * z^j), j = 1, ..., p, to lie outside the unit circle in the complex plane.
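
This can be checked numerically; a sketch with numpy (the AR(2) coefficients are illustrative):

```python
import numpy as np

phi1, phi2 = 0.5, 0.3                     # AR(2): Phi(z) = 1 - 0.5 z - 0.3 z^2
roots = np.roots([-phi2, -phi1, 1.0])     # coefficients in descending powers of z
print(np.abs(roots))                      # stationary iff every |root| > 1
print(bool(np.all(np.abs(roots) > 1)))
```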

9
Q

What is the significance of the moving average (MA) coefficients in the autocovariance function?

A

MA coefficients determine the autocovariance up to the lag equal to the MA order (q), beyond which the autocovariance is zero.

10
Q

Why is differencing used in ARIMA models, and how is it applied?

A

Differencing is used to make a non-stationary series stationary by removing trends or seasonality, applied as ∇Xt = Xt - Xt-1 or higher-order differences like ∇²Xt = Xt - 2Xt-1 + Xt-2.

11
Q

How can the autocorrelation and partial autocorrelation functions (PACF) help identify the order of an AR(p) model?

A

The ACF of an AR(p) model tails off gradually, while the PACF cuts off sharply after lag p, indicating the model order.
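
A sketch with statsmodels (assumed available) for a simulated AR(2) series, where the PACF should be near zero beyond lag 2 (coefficients illustrative):

```python
import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(5)
n, phi1, phi2 = 5000, 0.5, 0.3
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + rng.standard_normal()

print(np.round(acf(y, nlags=6), 2))   # tails off gradually
print(np.round(pacf(y, nlags=6), 2))  # cuts off sharply after lag 2
```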

12
Q

Describe how the Box-Jenkins methodology is applied in building ARIMA models.

A

The Box-Jenkins methodology involves model identification (using ACF and PACF), parameter estimation (e.g., maximum likelihood), and model diagnostics (checking residuals for white noise and model fit).
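
A compact sketch of the three steps with statsmodels (assumed available; the simulated ARMA(1,1) data and the chosen order are illustrative):

```python
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

# Identification: in practice, inspect ACF/PACF; here the order is taken as known.
y = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4]).generate_sample(nsample=1000)

# Estimation: fit ARMA(1, 1) by maximum likelihood.
result = ARIMA(y, order=(1, 0, 1)).fit()
print(result.params)

# Diagnostics: residuals should behave like white noise (large Ljung-Box p-values).
print(acorr_ljungbox(result.resid, lags=[10]))
```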

13
Q

Explain the conditions under which the MA(1) model exhibits invertibility and its importance.

A

An MA(1) model is invertible if |θ| < 1. Invertibility ensures a unique representation of the model in terms of past observations and errors.

14
Q

How does the maximum likelihood estimation method work in ARIMA models?

A

Maximum likelihood estimation finds parameter values that maximize the likelihood of observing the given data, assuming the errors follow a Gaussian white noise process.

15
Q

Discuss the impact of overfitting in ARMA/ARIMA models and how it can be avoided.

A

Overfitting occurs when too many parameters are included, leading to poor generalization. It can be avoided using criteria like AIC or BIC for model selection and cross-validation for forecasting accuracy.
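
A sketch of order selection by information criteria with statsmodels (assumed available; the candidate orders and simulated AR(1) data are illustrative):

```python
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.arima.model import ARIMA

y = ArmaProcess(ar=[1, -0.7], ma=[1]).generate_sample(nsample=500)  # true model: AR(1)

for p in range(4):
    fit = ARIMA(y, order=(p, 0, 0)).fit()
    print(p, round(fit.aic, 1), round(fit.bic, 1))
# AIC/BIC should typically be smallest near p = 1; larger p adds parameters
# without a matching improvement in fit.
```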

16
Q

What does the white noise assumption imply about its mean, variance, and covariance?

A

White noise has a mean of 0, a constant variance (σ²), and zero covariance between any two distinct time points (zero autocovariance at every nonzero lag).

17
Q

State the weak stationarity conditions for a time series.

A

Weak stationarity requires a constant mean (E(Xt) = µ), constant variance (Var(Xt) = γ(0)), and autocovariance depending only on the lag (γ(h)).

18
Q

Write the formula for autocovariance and explain its components.

A

γ(h) = Cov(Xt, Xt+h), where Xt and Xt+h are values of the series at times t and t+h, and γ(h) measures the linear relationship at lag h.

19
Q

How is autocorrelation defined, and why is it useful?

A

ρ(h) = γ(h) / γ(0); it measures the relative strength of autocovariance at lag h, helping identify patterns and dependencies in a time series.

20
Q

Write the AR(1) model and its variance formula.

A

AR(1): Yt = ϕ * Yt-1 + εt; Var(Yt) = σ²ε / (1 - ϕ²), assuming |ϕ| < 1.

21
Q

What is the stationarity condition for a general AR(p) model?

A

The roots of the characteristic polynomial Φ(z) = 1 - Σ(ϕj * z^j) must lie outside the unit circle.

22
Q

Write the equation for an MA(1) model and its autocovariance at lag 1.

A

MA(1): Yt = εt - θ * εt-1; γ(1) = -θ * σ²ε.

23
Q

Describe the general structure of an ARMA(p, q) model.

A

ARMA(p, q): Φ(L) * (Yt - µ) = Θ(L) * εt, where Φ(L) and Θ(L) are polynomials in the lag operator L for AR and MA parts, respectively.
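
statsmodels' ArmaProcess mirrors this lag-polynomial form directly; a small sketch (coefficients illustrative), with Φ(L) = 1 - 0.6L and Θ(L) = 1 + 0.4L:

```python
from statsmodels.tsa.arima_process import ArmaProcess

process = ArmaProcess(ar=[1, -0.6], ma=[1, 0.4])    # Phi(L) and Theta(L) coefficients
print(process.isstationary, process.isinvertible)   # True True
y = process.generate_sample(nsample=300)            # a simulated ARMA(1, 1) path
```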

24
Q

Define the difference operator and its purpose in ARIMA models.

A

The difference operator is ∇Xt = Xt - Xt-1, used to transform a non-stationary time series into a stationary one by removing trends.

25
Q

What does the invertibility condition for an MA(q) model ensure?

A

The invertibility condition ensures a unique representation of the model in terms of past observations and errors, requiring the roots of Θ(z) = 0 to lie outside the unit circle.

26
Q

State the formula for the expectation of the AR(1) process when centered.

A

E(Yt) = 0 if the process is centered.

27
Q

What is the formula for the characteristic equation in an AR(p) model?

A

Φ(z) = 1 - Σ(ϕj * z^j); stationarity requires its roots to lie outside the unit circle.

28
Q

Explain the difference between weak white noise and strict white noise.

A

Weak white noise assumes constant mean, constant variance, and zero autocovariance at nonzero lags, while strict white noise additionally assumes the terms are independent (and identically distributed) across all time points.

29
Q

What is the relationship between the ACF and PACF for an MA(q) model?

A

For an MA(q) model, the ACF cuts off after lag q, while the PACF tails off gradually.

30
Q

Write the formula for a second-order difference operator and explain its use.

A

∇²Xt = Xt - 2Xt-1 + Xt-2; used to remove quadratic trends in non-stationary data.

31
Q

Describe how the Dickey-Fuller test assesses stationarity.

A

The Dickey-Fuller test checks whether a unit root exists by testing the null hypothesis ϕ = 1 in the model Yt = ϕ * Yt-1 + εt (equivalently, δ = 0 in the regression ∇Yt = δ * Yt-1 + εt); failing to reject the null indicates non-stationarity.

32
Q

Write the equation for the variance of an MA(1) model.

A

Var(Yt) = σ²ε * (1 + θ²), where θ is the MA(1) parameter and σ²ε is the variance of white noise.

33
Q

How is the forecast error variance calculated for a one-step forecast in ARIMA models?

A

The one-step forecast error variance equals the innovation variance σ²ε when the model is correctly specified; in practice it is estimated from the variance of the residuals.

34
Q

What are the key steps in the Box-Jenkins methodology for ARIMA models?

A

Identification (using ACF/PACF), estimation (e.g., maximum likelihood), and diagnostics (residual analysis and model validation).

35
Q

Write the general formula for the ARIMA(p, d, q) model.

A

ARIMA(p, d, q): Φ(L) * ∇^d(Yt - µ) = Θ(L) * εt, where d is the order of differencing.