Wronged Questions: Time Series Flashcards

1
Q

The standard deviation of a random walk is ______ than that of its differenced series.

A

Larger

2
Q

Differencing a log-transformed time series will likely result in a time series that is stationary in the _____________ and ___________.

A

Mean, variance

3
Q

A logarithmic transformation will likely result in a time series that is stationary in the ______.

A

Variance

4
Q

Differencing a time series will likely result in a time series that is stationary in the ______.

A

Mean

5
Q

Computing the differences between consecutive observations to make a non-stationary series stationary

A

Differencing
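
A minimal Python sketch (numpy/pandas, with a simulated growth series standing in for real data) of the two filtering steps in cards 1-5: a log transformation to stabilize the variance, then differencing to stabilize the mean.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    t = np.arange(200)
    # Simulated series whose level and spread both grow over time (illustrative only)
    y = pd.Series(np.exp(0.02 * t + rng.normal(0, 0.1, size=200)))

    log_y = np.log(y)                    # log transformation: stabilizes the variance
    diff_log_y = log_y.diff().dropna()   # differencing: stabilizes the mean

    print(diff_log_y.mean(), diff_log_y.std())   # roughly constant level and spread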

6
Q

Properties of stationary time series

A
  • properties do not depend on the time of observation
  • may include cyclic behaviour (cycles without a fixed period)
  • roughly constant variance, no predictable patterns
7
Q

Properties of non-stationary time series

A
  • includes time trends
  • random walks
  • seasonality
8
Q

Model with no trend

A

Y_t = B_0 + e_t

9
Q

Model with linear trend

A

Y_t = B_0 + B_1 t + e_t

10
Q

Model with quadratic trend

A

Y_t = B_0 + B_1 t + B_2 t^2 + e_t

11
Q

Two criteria for a weakly stationary model

A

1) E[Yt] does not depend on t
2) Cov[Ys, Yt] depends only on |t-s|

12
Q

A series has strong stationarity if the entire distribution of Yt is ____ over time.

A

Constant

13
Q

______ _______ can be used to identify stationarity

A

Control charts

14
Q

Function of a filtering procedure

A

Reduces observations to a stationary series

15
Q

Name two filtering techniques

A

1) Differencing
2) Logarithmic transformation

16
Q

White noise process

A

A stationary process that displays no apparent patterns through time; its observations are i.i.d.

17
Q

T/F: For a white noise process, forecasts do not depend on how far into the future we want to forecast.

A

True

18
Q

Random walk is the partial sum of a _____ ______ process.

A

White noise
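
A small numpy sketch of cards 16-18, using simulated data: a random walk is the running (partial) sum of a white noise series, and the l-step forecast of a white noise process is just its sample mean, regardless of the horizon.

    import numpy as np

    rng = np.random.default_rng(1)
    c = rng.normal(size=500)    # white noise series c_1, ..., c_n
    y = np.cumsum(c)            # random walk: partial sums of the white noise series

    # For white noise, the l-step-ahead forecast is the sample mean for every horizon l
    c_bar = c.mean()
    print({l: round(c_bar, 4) for l in (1, 5, 10)})   # same value at every horizon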

19
Q

T/F: ME and MPE detect trend patterns that are not captured by the model

A

True

20
Q

T/F: MSE, MAE, and MAPE can detect fewer trend patterns than ME

A

False

21
Q

T/F: MPE and MAPE examine error relative to the actual value

A

True
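
A hedged sketch of the five accuracy measures referenced in cards 19-21, written as a plain Python function over actuals y and forecasts f on a holdout set (the function name and the simple averaging convention are my own assumptions, not from the deck).

    import numpy as np

    def forecast_errors(y, f):
        """Return ME, MPE, MSE, MAE, MAPE for actuals y and forecasts f."""
        y, f = np.asarray(y, dtype=float), np.asarray(f, dtype=float)
        e = y - f
        return {
            "ME":   e.mean(),                    # signed errors: detects uncaptured trend
            "MPE":  100 * (e / y).mean(),        # signed, relative to the actual value
            "MSE":  (e ** 2).mean(),
            "MAE":  np.abs(e).mean(),
            "MAPE": 100 * np.abs(e / y).mean(),  # absolute, relative to the actual value
        }

    print(forecast_errors([10, 12, 11, 13], [9, 12, 12, 12]))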

22
Q

T/F: For AR(1), the range of possible values for p is 0<=p<=1.

A

False. For AR(1), the range of possible values for p is -1<=p<=1.

23
Q

T/F: For AR(1), the range of possible values for B0 is 0 < B0 < inf.

A

False. The range of possible values for B0 is -inf < B0 < inf.

24
Q

T/F: For AR(1), if B1 = 1, then Yt is a non-stationary time series.

A

True. If B1 = 0, then Yt is a stationary (white noise) process; if B1 = 1, then Yt is a random walk, which is non-stationary.

25
Q

T/F: For AR(1), pk decreases linearly as k increases.

A

False, it decreases geometrically.
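
A quick numpy check of the geometric decay in card 25: for a stationary AR(1) with slope beta1, the lag-k autocorrelation is beta1**k, so it shrinks by a constant factor rather than a constant amount. The simulation parameters are arbitrary.

    import numpy as np

    beta1 = 0.7
    theoretical_acf = [beta1 ** k for k in range(6)]   # 1, 0.7, 0.49, 0.343, ...

    # Compare with the sample ACF of a simulated AR(1) path
    rng = np.random.default_rng(2)
    n = 5000
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = beta1 * y[t - 1] + rng.normal()

    sample_acf = [np.corrcoef(y[:-k], y[k:])[0, 1] if k else 1.0 for k in range(6)]
    print(np.round(theoretical_acf, 3))
    print(np.round(sample_acf, 3))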

26
Q

T/F: An AR(1) process is a meandering process.

A

False. There are cases where AR(1) is not a meandering process. Parameter B1 must be positive and significantly different from 0 (but still less than 1) for it to be a meandering process.

27
Q

Meandering process characteristics

A
  • positive, significant autocorrelation at lag 1
  • B1 needs to be significantly different from 0
  • B1 needs to be less than 1
28
Q

T/F: w=1 results in no smoothing

A

False. w=0 results in no smoothing

29
Q

SS(w)

A

Sum of squared one-step prediction errors

30
Q

T/F: Comparing the SS(w) for different values of w can help in choosing the optimal value of w.

A

True.

31
Q

T/F: When exponential smoothing is used for forecasting, the smoothed estimates are also called discounted least squares estimates.

A

True. Exponential smoothing can be expressed as weighted least squares. The weight used is w_t = w^(n-t), which is why the estimates are also called discounted least squares estimates.
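
A minimal sketch of choosing the smoothing weight w by comparing SS(w), as in cards 28-31. It assumes the recursion s_t = (1 - w)*y_t + w*s_{t-1} with one-step prediction equal to the previous smoothed value; the grid and the simulated data are illustrative.

    import numpy as np

    def ss_w(y, w, s0=None):
        """Sum of squared one-step prediction errors for smoothing weight w."""
        s = y[0] if s0 is None else s0    # initial smoothed value (could also be 0 or the mean)
        ss = 0.0
        for t in range(1, len(y)):
            ss += (y[t] - s) ** 2         # one-step prediction is the previous smoothed value
            s = (1 - w) * y[t] + w * s
        return ss

    rng = np.random.default_rng(3)
    y = np.cumsum(rng.normal(size=100)) + rng.normal(size=100)

    grid = np.arange(0.70, 0.96, 0.05)    # traditional range for w
    best_w = min(grid, key=lambda w: ss_w(y, w))
    print(best_w, ss_w(y, best_w))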

32
Q

T/F: All white noise processes are non-stationary.

A

False

33
Q

T/F: As time, t, increases, the variance of a random walk increases.

A

True

34
Q

T/F: First-order differencing a random walk series results in a white noise series.

A

True
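
A short simulation of the two facts in cards 33-34, with arbitrary parameters: the variance of a random walk grows with t (measured here across many simulated paths), and differencing a random walk recovers the underlying white noise shocks.

    import numpy as np

    rng = np.random.default_rng(4)
    paths = np.cumsum(rng.normal(size=(2000, 300)), axis=1)   # 2000 simulated random-walk paths

    # Variance across paths increases (roughly linearly) with time t
    print(paths[:, 9].var(), paths[:, 99].var(), paths[:, 299].var())

    # First differences of any one path are just the original white noise shocks
    diffs = np.diff(paths[0])
    print(diffs.mean(), diffs.var())   # ~0 mean, ~1 variance, no time dependence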

35
Q

T/F: A white noise process is weakly stationary.

A

True, anything that is strongly stationary is also weakly stationary

36
Q

T/F: Adding a constant alpha to a white noise process {c1, c2, …, ct}, yt = ct + α, results in a random walk.

A

False

37
Q

T/F: pk cannot be negative.

A

False. pk can be negative, indicating an inverse relationship between observations k time units apart

38
Q

T/F: pk is only defined for stationary processes

A

False. Autocorrelation can be calculated for both stationary and non-stationary processes.

39
Q

T/F: pk always increases as k increases.

A

False, pk typically decreases as k increases

40
Q

T/F: pk measures the correlation of the series with itself at different times.

A

True

41
Q

T/F: A process with pk > 0 for all k is non-stationary.

A

False, a process can have positive pk values and be stationary

42
Q

Unit root test

A

Tests whether a time series variable is non-stationary and possesses a unit root

43
Q

Null hypothesis of unit root test

A

The autoregressive polynomial of zt has a root equal to unity (i.e., equal to 1).

Random walk is a good fit for the model.

44
Q

T/F: Unit root tests are primarily used for assessing seasonality in time series data.

A

False. Unit root tests are used to assess fit of a random walk model, not seasonality.

45
Q

T/F: A unit root test evaluates the fit of a white noise process.

A

False. A unit root test evaluates the fit of a random walk model, not a white noise process.

46
Q

T/F: The Dickey-Fuller test is used to detect the presence of a unit root in the time series.

A

True. The Dickey-Fuller test specifically tests for a unit root, helping to determine if a random walk model is a good fit for the observed data.
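
If statsmodels is available, its augmented Dickey-Fuller function illustrates the unit root cards above: the null hypothesis is a unit root (a random walk is a good fit), and a sufficiently negative test statistic rejects it. The simulated input here is a random walk, so we expect not to reject; the data are illustrative only.

    import numpy as np
    from statsmodels.tsa.stattools import adfuller

    rng = np.random.default_rng(5)
    y = np.cumsum(rng.normal(size=250))          # random walk: should look non-stationary

    stat, pvalue, _, _, crit, _ = adfuller(y)
    print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")
    print("critical values:", crit)              # left-tailed: reject H0 only if stat < critical value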

47
Q

T/F: Unit root tests confirm the absence of volatility clustering in financial time series.

A

False. Unit root tests assess the fit of a random walk model, not volatility clustering.

48
Q

T/F: If Φ = 1, then the model reduces to a stationary process.

A

False. If Φ = 1, then the model reduces to a random walk, which is nonstationary.

49
Q

T/F: The moving average technique cannot be used for forecasting.

A

False. Moving averages can be used for forecasting.

50
Q

T/F: Moving averages typically assign higher weights to older observations

A

False. Moving averages typically assign equal weights to all observations

50
Q

T/F: Moving averages are easy to compute but challenging to interpret.

A

False. Moving averages are both easy to compute and easy to interpret.

51
Q

T/F: Moving averages can be expressed as weighted least squares estimates.

A

True. Moving averages can also be expressed as weighted least squares estimates where recent observations within the window are given higher weights than observations that are not in the window.

52
Q

T/F: If the moving average estimate at time t is based on the latest k observations up to and including time t, a larger k results in less smoothing of the time series.

A

False. The choice of k will depend on the amount of smoothing desired; the larger the value of k, the smoother the estimate will be (because more averaging is done).
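
A pandas sketch of the smoothing trade-off described in the moving-average cards above: the estimate at time t averages the latest k observations, and a larger k gives a smoother (but slower-reacting) series. The window sizes and simulated data are arbitrary.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(6)
    y = pd.Series(np.cumsum(rng.normal(size=200)))

    ma_small = y.rolling(window=3).mean()    # k = 3: follows the data closely, little smoothing
    ma_large = y.rolling(window=25).mean()   # k = 25: much smoother, lags behind turning points

    print(ma_small.std(), ma_large.std())    # larger k -> less variable smoothed series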

53
Q

T/F: If bar(y) ≠ 0 for a white noise process, then the time series is nonstationary in the mean.

A

False. A white noise process is stationary; therefore, it is always stationary in the mean.

54
Q

T/F: If s^2 > 0 for a white noise process, then the time series is nonstationary in the variance

A

False. A white noise process is stationary; therefore, it is always stationary in the variance.

55
Q

T/F: Exponential smoothing cannot handle data with a linear trend.

A

False. Exponential smoothing can be adapted for trends. If there is a linear trend in time, this can be handled using double exponential smoothing.

56
Q

T/F: The initial value of the series, y_0, must be chosen to be 0.

A

False. y_0 can be chosen to be 0, y_1, or bar(y); it does not have to be 0.

57
Q

T/F: Exponential smoothing gives equal weights to all past observations.

A

False. Exponential smoothing gives exponentially decreasing weights to older observations.

58
Q

T/F: The weight in exponential smoothing is traditionally chosen between 0.1 and 0.3.

A

False. w is traditionally chosen to be within the interval (0.70, 0.95).

59
Q

T/F: Goodness of fit for exponential smoothing can be assessed using the sum of squared one-step prediction errors.

A

True

60
Q

T/F: Under the null hypothesis of a unit root test, the time series is assumed to follow a stationary process.

A

False. The time series is assumed to follow a random walk (non-stationary) process under the null hypothesis.

61
Q

T/F: The test statistic does not follow the usual t-distribution.

A

True. The test statistic does not follow the usual t-distribution. Rather, it follows a special distribution with critical values developed by Dickey and Fuller.

62
Q

T/F: The Dickey-Fuller test is a two-tailed hypothesis test.

A

False. The Dickey-Fuller test is a left-tailed (i.e., one-tailed) test that has the following null and alternative hypotheses:

H0: Φ = 1
H1: Φ < 1

63
Q

T/F: The disturbance term in the model is assumed to be serially correlated.

A

False. The disturbance term in the model is assumed to be serially uncorrelated.

64
Q

T/F: If the test statistic is statistically significant, we conclude the random walk model is a good fit.

A

False. If the test statistic is statistically significant, we reject the null hypothesis. Rejecting the null hypothesis means that we conclude the random walk model is not a good fit.

65
Q

T/F: Control charts are used to detect non-stationarity in a time series.

A

True. Control charts are useful graphical tools for detecting trends and identifying unusual points. They detect nonstationarity in a time series.

66
Q

T/F: A control chart has superimposed lines called reference limits.

A

False. Control charts have superimposed lines called control limits. Two well-known control limits are the upper control limit and the lower control limit.

67
Q

T/F: An R chart examines the stability of the mean of a time series.

A

False. An R chart helps examine the stability of the variability of a time series.
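
A bare-bones control-chart style check for the cards above, assuming limits set at the baseline mean plus or minus three standard deviations (a common convention, not something the deck specifies): points outside the limits flag possible nonstationarity or unusual observations. Data and the baseline window are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(7)
    y = np.concatenate([rng.normal(0, 1, 80), rng.normal(4, 1, 20)])   # level shift at the end

    baseline = y[:80]                              # assumed in-control baseline window
    center = baseline.mean()
    sd = baseline.std(ddof=1)
    ucl, lcl = center + 3 * sd, center - 3 * sd    # upper / lower control limits

    out_of_control = np.where((y > ucl) | (y < lcl))[0]
    print(lcl, center, ucl, out_of_control)        # the shifted points fall outside the limits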

68
Q

Y_n+l for white noise model

A
69
Q

se y_(n+l) for white noise model

A
70
Q

Wt for random walk forecast

A
71
Q

Y_n+l for random walk model

A
72
Q

se y_(n+l) for random walk model

A
73
Q

Mean Error (ME)

A
74
Q

Mean Percent Error (MPE)

A
75
Q

Mean Squared Error (MSE)

A
76
Q

Mean Absolute Error (MAE)

A
77
Q

Mean Absolute Percent Error

A
78
Q

E[Yt] for AR(1) model

A
79
Q

Var[Yt] AR(1) model

A
80
Q

Pk for AR(1) model

A
81
Q

b1 estimation for AR(1)

A

r1 (autocorrelation lag 1)

82
Q

b0 estimation for AR(1)

A
83
Q

s^2 for AR(1) model

A
84
Q

hat(Var)[Yt]

A
85
Q

se y_(n+l) for AR(1) model

A
86
Q

Single smoothing with moving average

A
87
Q

b0 prediction for single smoothing with moving average

A
87
Q

Yt for single smoothing with moving average

A
88
Q

Yt for double smoothing with moving average

A
89
Q

Double smoothing with moving average

A
90
Q

b0 prediction for double smoothing with moving average

A
91
Q

b1 prediction for double smoothing with moving average

A
92
Q

Y_n+l for double smoothing with moving average

A
93
Q

Yt for single exponential smoothing

A
94
Q

b0 prediction for single exponential smoothing

A
95
Q

hat(S)_t for single exponential smoothing

A
96
Q

Yt for double exponential smoothing

A
97
Q

hat(S)_t for double exponential smoothing

A
98
Q

b0 prediction for double exponential smoothing

A
99
Q

b1 prediction for double exponential smoothing

A
100
Q

Y_n+l for double exponential smoothing

A
101
Q

Seasonal Autoregressive Models, SAR(p)

A
102
Q

ARCH(p) model

A
103
Q

Var[ε_t] for ARCH(p) model

A
104
Q

GARCH(p) model

A
105
Q

Var[ε_t] for GARCH(p) model

A
106
Q

T/F: k = 1 results in no smoothing.

A

True. The larger the k, the smoother the estimate.

107
Q

T/F: It is risky to choose a small value for k because we may lose sight of the real trends due to oversmoothing.

A

False. Oversmoothing results from choosing a large value of k, not a small one; a small k produces little smoothing.

108
Q

T/F: When smoothing with moving averages is used for forecasting, the model is called a globally constant mean model.

A

False. When smoothing with moving averages is used for forecasting, the model is called a locally constant mean model.

109
Q

Autocorrelation Standard Error

A
110
Q

Autocorrelation test statistic

A
111
Q

T/F: Seasonal effects in time series can be captured using categorical variables.

A

True

112
Q

T/F: Both Mean Error and Mean Absolute Error statistics can detect more trend patterns than Mean Square Error can.

A

False. MSE, MAPE, and MAE can detect more trend patterns than ME.

113
Q

T/F: GARCH models assume a constant mean level for the series.

A

True

114
Q

T/F: GARCH models are used to model the conditional variance of the series.

A

True

115
Q

T/F: Volatility clustering can be modeled with GARCH models.

A

True

116
Q

T/F: GARCH models predict volatility based solely on long-term volatility parameters.

A

False. GARCH models predict volatility based on both long-term volatility parameters (the intercept) and short-term effects from past variances and past squared residuals, not solely on long-term parameters.
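
If the third-party arch package is installed, a GARCH(1,1) fit illustrates the GARCH cards above: the mean is modeled as a constant, while the conditional variance responds to past squared residuals and past variances, capturing volatility clustering. The simulated returns and model settings are illustrative assumptions.

    import numpy as np
    from arch import arch_model   # third-party package (pip install arch)

    rng = np.random.default_rng(8)
    returns = rng.standard_t(df=6, size=1000)   # stand-in for a series of percentage returns

    model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1)
    result = model.fit(disp="off")

    print(result.params)                       # omega (long-run level), alpha[1] (past squared residual), beta[1] (past variance)
    print(result.conditional_volatility[:5])   # time-varying conditional volatility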

117
Q

T/F: A weakly stationary process is not stationary in the variance.

A

False. A weakly stationary process is stationary in the variance

118
Q

T/F: Applying logarithmic transformation to a series can help stabilize its mean.

A

False. Applying logarithmic transformation to a series can help stabilize its variance.

119
Q

T/F: The stochastic component of a linear trend in time model is non-stationary.

A

False. The stochastic component of a linear trend in time model is a stationary white noise process, while the stochastic component of a random walk model is itself a non-stationary random walk process (sum of white noise processes).

120
Q

T/F: The sample variance of a white noise process is greater than the sample variance of the differenced series of a white noise process.

A

False. The sample variance of a white noise process is smaller than the sample variance of the differenced series of a white noise process.

121
Q

T/F: ARCH and GARCH models predict the mean level of the series without considering volatility.

A

False. ARCH/GARCH models specifically focus on modeling time-varying volatility.

122
Q

T/F: ARCH models assume constant conditional variance across all time periods.

A

False. ARCH models allow the conditional variance to change over time (even though the unconditional variance remains constant).

123
Q

T/F: GARCH models allow the variance of residuals to depend on past squared residuals and past variances.

A

True.

124
Q

T/F: Volatility clustering is a phenomenon best modeled by fixed seasonal effects models.

A

False. Volatility clustering is addressed by ARCH/GARCH models, not by seasonal models.

125
Q

Phenomenon where periods of high volatility tend to be followed by more periods of high volatility, and periods of low volatility tend to be followed by more periods of low volatility.

A

Volatility clustering

126
Q

T/F: Seasonal autoregressive models use a fixed seasonal effect that does not change over time.

A

False. Seasonal autoregressive models account for seasonality by including terms for specific seasonal lags.

127
Q

T/F: Fixed seasonal effects models cannot represent seasonality as a trigonometric function of time.

A

False. Fixed seasonal effects models can and often do use trigonometric functions to represent seasonal effects.

128
Q

T/F: Seasonal smoothed exponential models allow for the seasonal component to adapt over time.

A

True. Seasonal smoothed exponential models, like the Holt-Winters additive model, adapt the seasonal component based on past data, allowing it to change over time.

129
Q

T/F: The Holt-Winters additive model ignores seasonality in its forecasts.

A

False. The Holt-Winters additive model explicitly includes a seasonal component in its forecasts.
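
If statsmodels is available, its Holt-Winters implementation shows the adaptive seasonal component described in the two cards above: level, trend, and an additive seasonal term are all updated by smoothing as new data arrive. The simulated monthly series and settings are illustrative assumptions.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(9)
    months = np.arange(120)
    # Simulated monthly series: linear trend + additive seasonal pattern + noise
    y = pd.Series(10 + 0.1 * months + 3 * np.sin(2 * np.pi * months / 12)
                  + rng.normal(0, 0.5, 120))

    fit = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=12,
                               initialization_method="estimated").fit()
    print(fit.forecast(12))   # forecasts carry both the trend and the seasonal pattern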

130
Q

T/F: SAR(P) models include observations from non-seasonal periods to model seasonality.

A

False. SAR(P) models specifically focus on seasonal lags, not non-seasonal periods.

131
Q

T/F: Conditional least squares can be used to estimate β0 and β1 for AR(1).

A

True

132
Q

T/F: The residuals are calculated as yt + (b0 + b1yt-1) for AR(1).

A

False. The residuals are calculated as yt - (b0 + b1yt-1).
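
A numpy sketch of the AR(1) estimation and residual cards above: conditional least squares amounts to regressing y_t on y_{t-1}, and the residuals are y_t - (b0 + b1*y_{t-1}). The simulated series and parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(10)
    n, beta0, beta1 = 400, 1.0, 0.6
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = beta0 + beta1 * y[t - 1] + rng.normal()

    # Conditional least squares: regress y_t on y_{t-1}
    X = np.column_stack([np.ones(n - 1), y[:-1]])
    b0, b1 = np.linalg.lstsq(X, y[1:], rcond=None)[0]

    residuals = y[1:] - (b0 + b1 * y[:-1])
    print(b0, b1, residuals.var(ddof=2))   # residual variance is smaller than the variance of the series itself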

133
Q

T/F: Forecast intervals remain constant regardless of the number of steps ahead for AR(1).

A

False. Forecast intervals widen as the number of steps into the future increases.

134
Q

T/F: For stationary AR(1) model, the variance of the residual is greater than the variance of the time series model.

A

False. The variance of the residual is less than the variance of the time series model.

135
Q

T/F: If β1 = 0, then yt is a random walk model.

A

False. If β1 = 0, then yt is a white noise model.

136
Q

T/F: For a stationary AR(1) model, the lag k autocorrelation converges to 0 as k increases.

A

True

137
Q

Mean of random walk

A
138
Q

Variance of random walk

A