Time Series Flashcards

1
Q

What is the definition of a time series? What are the realization (sample path) and the underlying sequence of random variables? Why use time series? What types of time series are there (univariate vs. multivariate)?

A
  1. time series definition: a sequence of observations of a stochastic process over time, usually indexed by t;
    stochastic process definition: a sequence of random variables;
  2. realization/sample path: {y_t}_{t=1}^T = {y_1, y_2, …, y_T};
    the random variables themselves: {Y_t}_{t=1}^T = {Y_1, Y_2, …, Y_T};
  3. why use time series: when observations are not iid, i.e., dependent over time, time series methods are needed to model that dependence;
  4. univariate time series: y_t belongs to R;
    multivariate time series: y_t belongs to R^k (a sketch of both follows below).
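
A minimal numpy sketch of these objects (the length T, the dimension k, the seed, and the Gaussian draws are illustrative assumptions, not part of the definition):

```python
import numpy as np

rng = np.random.default_rng(0)

T = 200
# One realization {y_1, ..., y_T} of the stochastic process {Y_t};
# here the process is Gaussian white noise, purely for illustration.
y = rng.normal(loc=0.0, scale=1.0, size=T)   # univariate: each y_t is in R

# A multivariate series stacks k components at each date t: y_t is in R^k.
k = 3
y_multi = rng.normal(size=(T, k))            # row t is the k-vector y_t
```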
2
Q

Definition of Strict Stationarity?
Definition of Weak/Covariance Stationarity?
Definition of the jth order autocovariance (gamma_j)?

A
  1. Definition of Strict Stationarity? - for any values j1, j2, …, jn, the joint distribution of (Y_t, Y_t+j1, Y_t+j2, …, Y_t+jn) depends only on the intervals separating j1, j2, …, jn, but NOT on t.
  2. Definition of Weak/Covariance Stationarity? - (1) E(Y_t) = u, independent of t, and (2) the jth order autocovariance gamma_j exists, is finite, and depends only on j, not on t.
  3. Definition of the jth order autocovariance (gamma_j): gamma_j = Cov(Y_t, Y_t-j) = E[(Y_t - E(Y_t))(Y_t-j - E(Y_t-j))] (a sample estimator is sketched below).
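
A minimal sketch of the sample analogue of gamma_j (the function name sample_autocov and the white-noise test series are illustrative choices, not standard API):

```python
import numpy as np

def sample_autocov(y, j):
    """gamma_hat_j = (1/T) * sum_{t=j+1}^{T} (y_t - ybar) * (y_{t-j} - ybar)."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    ybar = y.mean()
    return np.sum((y[j:] - ybar) * (y[:T - j] - ybar)) / T

rng = np.random.default_rng(0)
y = rng.normal(size=10_000)   # white noise: gamma_0 ~ 1 and gamma_j ~ 0 for j >= 1
print([round(sample_autocov(y, j), 3) for j in range(4)])
```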
3
Q

Q&A about Stationarity

  1. For a covariance stationary process, can we say something about gamma_-j if we know gamma_j?
  2. Does strict stationarity imply covariance stationarity?
  3. Does covariance stationarity imply strict stationarity?
A
  1. gamma_-j = gamma_j (autocovariances are symmetric in j).
  2. Not automatically: strict stationarity implies covariance stationarity only if the first two moments exist and are finite (a strictly stationary process with Cauchy marginals, for example, has no finite mean).
  3. No. For example, a process can have a time-invariant mean and autocovariances but a time-varying third moment. There is a special case where it does hold: a Gaussian process.
  4. Definition of Gaussian process: {Y_t, Y_t+j1, Y_t+j2, …, Y_t+jn} is jointly Gaussian for any j1, j2, …, jn.
    Property of Gaussian process: a multivariate normal distribution is completely characterized by its mean and covariance matrix, so for a Gaussian process covariance stationarity implies strict stationarity.
4
Q

What are the 4 types of convergence used in asymptotics? How do they extend to vectors/matrices? How are the four types related?

A
  1. 4 types of asymptotics: (1) convergence in probability, (2) almost sure convergence, (3) convergence in mean square, (4) convergence in distribution.
  2. (1), (2), and (3) all extend to vectors and matrices by requiring element-by-element convergence;
    but for (4) convergence in distribution, element-by-element convergence does NOT give joint convergence of the vector/matrix; joint convergence must be established directly (e.g., via the Cramer-Wold device).
  3. (2) almost sure convergence and (3) convergence in mean square each imply (1) convergence in probability; (1) convergence in probability implies (4) convergence in distribution. A simulation of (1) is sketched below.
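
A minimal simulation sketch of convergence in probability for the sample mean (weak LLN); the normal draws, the tolerance eps, and the replication counts are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weak law of large numbers: the sample mean converges in probability to mu,
# i.e. P(|Xbar_n - mu| > eps) -> 0 as n grows.
mu, eps, reps = 0.0, 0.1, 500
for n in (10, 100, 1000, 10_000):
    xbar = rng.normal(loc=mu, size=(reps, n)).mean(axis=1)
    print(n, np.mean(np.abs(xbar - mu) > eps))   # empirical exceedance frequency
```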
5
Q

Definition of Ergodicity

A

Ergodicity: vertical average = horizontal average, i.e., the ensemble average across realizations equals the time average along a single realization, so one sufficiently long sample path reveals the moments of the process.

6
Q

Definition of Ergodicity for the mean?
Definition of Ergodicity for the second moments?

other properties:

A

Definition of Ergodicity for the mean? if (1/T) * sum over t of y_t converges (in probability) to E(Y_t) as T -> infinity
Definition of Ergodicity for the second moments? if (1/(T-j)) * sum over t of (y_t - u)(y_t-j - u) converges to gamma_j for all j

other properties:

  1. a stationary process is ergodic for the mean if its autocovariances are absolutely summable: sum over j of |gamma_j| < infinity
  2. if {Y_t} is a Gaussian process, then the absolute summability of autocovariances implies that it is ergodic for all moments. A simulation of ergodicity for the mean is sketched below.
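
A minimal sketch of this property for an AR(1) process, which is ergodic for the mean because its autocovariances are absolutely summable (the parameter values are illustrative assumptions): the horizontal (time) average over one long path and the vertical (ensemble) average across many paths both approach E(Y_t) = c/(1 - phi).

```python
import numpy as np

rng = np.random.default_rng(0)
phi, c, sigma, T, n_paths = 0.5, 1.0, 1.0, 5000, 2000
mu = c / (1 - phi)                         # stationary mean E(Y_t) = 2.0

def ar1_path(T):
    y = np.empty(T)
    # start from the stationary distribution N(mu, sigma^2 / (1 - phi^2))
    y[0] = mu + rng.normal() * sigma / np.sqrt(1 - phi**2)
    for t in range(1, T):
        y[t] = c + phi * y[t - 1] + sigma * rng.normal()
    return y

# Horizontal (time) average over one realization ...
print(ar1_path(T).mean())                  # ~ 2.0
# ... matches the vertical (ensemble) average across realizations at a fixed t.
print(np.mean([ar1_path(50)[-1] for _ in range(n_paths)]))  # ~ 2.0
```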
7
Q

Definition of White Noise Process?
Definition of Independent White Noise Process?
Definition of Independent and Identically Distributed White Noise Process?
Definition of Gaussian White Noise Process?

A

Definition of White Noise Process?

  • E(epsilon_t) = 0, E(epsilon_t^2) = sigma^2, and E(epsilon_t*epsilon_tau) = 0 for t != tau is called white noise
  • Notation: epsilon_t ~ WN(0, sigma^2)

Definition of Independent White Noise Process?

  • if, additionally, epsilon_t and epsilon_tau are independent for t != tau, then it is an independent white noise process
  • note: Cov(epsilon_t, epsilon_tau) = 0 does NOT imply independence; the draws can still be dependent through higher moments, e.g., correlated second moments (a concrete example is sketched at the end of this card)

Definition of Independent and Identically Distributed White Noise Process?

  • if, additionally, the epsilon_t are identically distributed, then it is an iid white noise process
  • Notation: epsilon_t ~ iid WN(0, sigma^2)

Definition of Gaussian White Noise Process?

  • if, additionally, epsilon_t ~ N(0, sigma^2), then it is a Gaussian white noise process
  • Notation: epsilon_t ~ iid N(0, sigma^2)
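
A concrete sketch of that note, under the assumption epsilon_t = z_t * z_{t-1} with z_t iid N(0,1) (a standard textbook construction, used here purely for illustration): the process is white noise, yet its squares are autocorrelated, so the draws are not independent.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=100_001)
e = z[1:] * z[:-1]            # eps_t = z_t * z_{t-1}: WN(0, 1) but NOT independent

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print(corr(e[1:], e[:-1]))        # ~ 0    -> uncorrelated: qualifies as white noise
print(corr(e[1:]**2, e[:-1]**2))  # ~ 0.25 -> squares correlated: dependent draws
```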
8
Q

Definition of MA(1)?

A

Definition of MA(1)?

Y_t = u + theta_1*epsilon_t-1 + epsilon_t,
where u and theta_1 are constant scalars and epsilon_t is white noise, epsilon_t ~ WN(0, sigma^2).

Y_t is a first-order moving average process, or MA(1).

Mean: E(Y_t) = E(u + theta_1*epsilon_t-1 + epsilon_t) = u [doesn't depend on t]
Variance: gamma_0 = E[(Y_t - u)^2] = (1 + theta_1^2) * sigma^2 [doesn't depend on t]
First-order autocovariance: gamma_1 = theta_1 * sigma^2 [doesn't depend on t]
Higher-order autocovariances: gamma_j = 0 for j >= 2 [don't depend on t]

These moments are verified by simulation below.
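
A minimal simulation sketch checking these four moments (the values u = 2.0, theta_1 = 0.6, sigma = 1 are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, theta1, sigma, T = 2.0, 0.6, 1.0, 200_000

eps = rng.normal(scale=sigma, size=T + 1)
y = mu + theta1 * eps[:-1] + eps[1:]      # Y_t = u + theta_1*eps_{t-1} + eps_t

print(y.mean())                           # ~ u = 2.0
print(y.var())                            # ~ (1 + theta_1^2) * sigma^2 = 1.36
yc = y - y.mean()
print(np.mean(yc[1:] * yc[:-1]))          # gamma_1 ~ theta_1 * sigma^2 = 0.6
print(np.mean(yc[2:] * yc[:-2]))          # gamma_2 ~ 0
```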

9
Q

Q&A about MA(1)?

  1. Is an MA(1) process stationary?
  2. Is it ergodic for the mean?
  3. Can you think of a condition that implies ergodicity for the second moments?

Property of MA(1)?

A

Y_t = u + theta_1*epsilon_t-1 + epsilon_t

1. Is an MA(1) process stationary?
Yes, weakly (covariance) stationary: the mean and autocovariances are finite and time-invariant.
2. Is it ergodic for the mean?
Yes: the sum over j of |gamma_j| = gamma_0 + 2*|gamma_1| is finite (absolute summability of autocovariances).
3. Can you think of a condition that implies ergodicity for the second moments?
If {epsilon_t} is Gaussian white noise, the MA(1) is ergodic for all moments, hence for the second moments.

Property of MA(1):
MA(1) is weakly stationary and ergodic for the mean;
if {epsilon_t} is Gaussian white noise, then MA(1) is also ergodic for the second moments.

10
Q

Definition of Autocorrelation?

Autocorrelation vs Autocovariance?

A

Definition of autocorrelation: the jth order autocorrelation is rho_j = gamma_j / gamma_0, the jth autocovariance divided by the variance.

Autocorrelation vs autocovariance: the autocovariance gamma_j carries the (squared) units of Y_t, while the autocorrelation rho_j is unit-free and bounded, -1 <= rho_j <= 1, with rho_0 = 1.
11
Q

Definition of MA(q)?

Mean:
Variance:
jth order autocovariance (for j <= q):
jth order autocovariance (for j > q): gamma_j = 0

Property of MA(q)?
Why is MA(q) always stationary?
Why is MA(q) always ergodic for the mean?
If epsilon_t is Gaussian white noise, it is ergodic for all moments (why?).

A

Definition of MA(q)?
Y_t = u + theta_1*epsilon_t-1 + theta_2*epsilon_t-2 + … + theta_q*epsilon_t-q + epsilon_t,
where u, theta_1, theta_2, …, theta_q are constant scalars, and epsilon_t ~ WN(0, sigma^2).

Mean: E(Y_t) = u
Variance: gamma_0 = (1 + theta_1^2 + theta_2^2 + … + theta_q^2) * sigma^2
jth order autocovariance (for j <= q): gamma_j = (theta_j + theta_j+1*theta_1 + theta_j+2*theta_2 + … + theta_q*theta_q-j) * sigma^2 [with theta_0 = 1]
jth order autocovariance (for j > q): gamma_j = 0

Property of MA(q)?
Why is MA(q) always stationary? The mean and autocovariances are all finite and don't depend on t.
Why is MA(q) always ergodic for the mean? The sum over j of |gamma_j| is finite, since only finitely many autocovariances are nonzero.
If epsilon_t is Gaussian white noise, it is ergodic for all moments.
Property used: if {Y_t} is a Gaussian process, then the absolute summability of autocovariances implies that it is ergodic for all moments.
Summary: MA(q) is always stationary and ergodic for the mean; if epsilon_t is Gaussian WN, it is ergodic for all moments. (The lag-q cutoff in gamma_j is sketched below.)
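
A minimal sketch of the theoretical MA(q) autocovariances and the cutoff at lag q (the MA(2) coefficients below are illustrative assumptions):

```python
import numpy as np

def ma_autocov(thetas, sigma2, j):
    """gamma_j of an MA(q): sigma^2 * sum_k theta_k * theta_{k+j}, theta_0 = 1."""
    th = np.r_[1.0, thetas]               # (theta_0, theta_1, ..., theta_q)
    q = len(th) - 1
    if j > q:
        return 0.0                        # the cutoff: gamma_j = 0 for j > q
    return sigma2 * np.sum(th[: q - j + 1] * th[j:])

thetas = [0.5, -0.3]                      # an MA(2)
print([ma_autocov(thetas, 1.0, j) for j in range(5)])
# gamma_0, gamma_1, gamma_2 nonzero; gamma_3 = gamma_4 = 0
```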

12
Q

Definition of MA(infinity)? NOTE: MA(infinity) shows up very often and has some surprising properties.

Mean:
Variance:
jth order autocovariance:

Property of MA(infinity)?

A

Definition of MA(infinity)?
Y_t = u + sum over j>=0 of theta_j*epsilon_t-j = u + epsilon_t + theta_1*epsilon_t-1 + theta_2*epsilon_t-2 + …,
with theta_0 = 1 and epsilon_t ~ WN(0, sigma^2).

Mean: E(Y_t) = u
Variance: gamma_0 = (sum over j>=0 of theta_j^2) * sigma^2
jth order autocovariance: gamma_j = (sum over k>=0 of theta_k*theta_k+j) * sigma^2; unlike MA(q), there is no cutoff at a finite lag.

Property of MA(infinity)?
If the MA coefficients are absolutely summable (sum over j of |theta_j| < infinity), then all previous results for MA(q) hold: the process is stationary and ergodic for the mean, and with Gaussian white noise errors it is ergodic for all moments.

13
Q

Definition of AR(1)?

Y_t = c + phi_1*Y_t-1 + epsilon_t,
where epsilon_t ~ WN(0, sigma^2)

Explain why AR(1) can be rewritten as an MA(infinity)?

The property of AR(1)? [we want to show AR(1) is an MA(infinity), so it inherits the MA(infinity) properties]

Why does |phi_1| need to be < 1?

A

Explain why AR(1) can be rewritten as an MA(infinity)?
First approach: recursive backward substitution:
Y_t = c + phi_1*(c + phi_1*Y_t-2 + epsilon_t-1) + epsilon_t = … = c/(1 - phi_1) + sum over j>=0 of phi_1^j*epsilon_t-j,
which converges when |phi_1| < 1 (sketched below).

The property of AR(1)?
If |phi_1| < 1, the AR(1) process is stationary and ergodic for the mean (the MA coefficients phi_1^j are absolutely summable);
if epsilon_t is additionally Gaussian white noise, AR(1) is ergodic for all moments.

Note: |phi_1| < 1 is crucial for these properties.
Note: if |phi_1| > 1, backward substitution diverges and we need to use forward substitution instead.

Why does |phi_1| need to be < 1?

  1. otherwise the stationary variance formula sigma^2/(1 - phi_1^2) gives something negative;
  2. and the stationary mean c/(1 - phi_1) is negative for positive c if phi_1 > 1.
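
A minimal sketch of the backward-substitution result (the parameter values and truncation lag J are illustrative assumptions): the simulated AR(1) value and the truncated MA(infinity) sum agree up to an error of order phi_1^J, which vanishes because |phi_1| < 1.

```python
import numpy as np

rng = np.random.default_rng(0)
c, phi, sigma, T = 1.0, 0.7, 1.0, 500
eps = rng.normal(scale=sigma, size=T)

# Simulate the AR(1) recursion: Y_t = c + phi*Y_{t-1} + eps_t
y = np.empty(T)
y[0] = c / (1 - phi) + eps[0]
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]

# Truncated MA(infinity) from backward substitution:
# Y_t ~= c/(1 - phi) + sum_{j=0}^{J} phi^j * eps_{t-j}
J, t = 60, T - 1
ma_approx = c / (1 - phi) + sum(phi**j * eps[t - j] for j in range(J + 1))
print(y[t], ma_approx)    # nearly identical
```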
14
Q

Autocorrelation of White Noise, MA(1), and AR(1)?

A

Autocorrelation of white noise: rho_j = 0 for all j >= 1.
Autocorrelation of MA(1): the first autocorrelation rho_1 = theta_1/(1 + theta_1^2) is nonzero (for theta_1 != 0); all higher autocorrelations are zero (cutoff after lag 1).
Autocorrelation of AR(1): rho_j = phi_1^j is nonzero at every lag, decaying geometrically. (All three patterns are sketched below.)
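
A minimal sketch comparing the three sample autocorrelation patterns (theta_1 = 0.6 and phi_1 = 0.7 are illustrative assumptions; acf is my own helper, not standard API):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100_000
eps = rng.normal(size=T)

def acf(y, max_lag):
    """Sample autocorrelations rho_hat_1, ..., rho_hat_max_lag."""
    yc = y - y.mean()
    g0 = np.mean(yc * yc)
    return [np.mean(yc[j:] * yc[: len(y) - j]) / g0 for j in range(1, max_lag + 1)]

wn = eps                                     # white noise
ma1 = 0.6 * np.r_[0.0, eps[:-1]] + eps       # MA(1) with theta_1 = 0.6
ar1 = np.empty(T)                            # AR(1) with phi_1 = 0.7
ar1[0] = eps[0]
for t in range(1, T):
    ar1[t] = 0.7 * ar1[t - 1] + eps[t]

print(np.round(acf(wn, 4), 2))    # ~ [0, 0, 0, 0]
print(np.round(acf(ma1, 4), 2))   # ~ [0.44, 0, 0, 0]:  rho_1 = theta/(1 + theta^2)
print(np.round(acf(ar1, 4), 2))   # ~ [0.7, 0.49, 0.34, 0.24]:  rho_j = phi^j
```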

15
Q

Definition of AR(p)?

Deriving the properties of AR(p) directly is cumbersome; what tools could we use?

A

Definition of AR(p)?
Y_t = c + phi_1*Y_t-1 + phi_2*Y_t-2 + phi_3*Y_t-3 + … + phi_p*Y_t-p + epsilon_t,
where epsilon_t ~ WN(0, sigma^2)

Deriving the properties of AR(p) directly is cumbersome; what tools could we use?

  1. difference equations
  2. the lag operator (a stationarity check via the lag polynomial is sketched below)
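
A minimal sketch of the lag-operator approach to stationarity (the helper name ar_is_stationary and the coefficient values are my own illustrative choices): an AR(p) is stationary when every root of the lag polynomial phi(z) = 1 - phi_1*z - … - phi_p*z^p lies outside the unit circle.

```python
import numpy as np

def ar_is_stationary(phis):
    """Check AR(p) stationarity: all roots of
    phi(z) = 1 - phi_1*z - ... - phi_p*z^p must satisfy |z| > 1."""
    coeffs = np.r_[[-p for p in phis[::-1]], 1.0]   # highest power first for np.roots
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(ar_is_stationary([0.5]))        # True:  AR(1) with |phi_1| < 1 (root z = 2)
print(ar_is_stationary([1.2]))        # False: explosive AR(1) (root inside circle)
print(ar_is_stationary([0.5, 0.3]))   # True
print(ar_is_stationary([0.9, 0.5]))   # False: a root lies inside the unit circle
```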