midterm 3 Flashcards

1
Q

temporal ordering

A

past values can influence future values, but not vice versa. This natural ordering distinguishes time series data from cross-sectional data.

2
Q

data frequency

A

the interval at which data points are collected: annual, quarterly, monthly, daily

3
Q

static model

A

a regression model where the effect of the explanatory variable Z on the dependent variable is immediate, with no lagged effects.

y_t = \beta_0 + \beta_1 Z_t + u_t
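
A minimal sketch of estimating a static model by OLS, assuming numpy and statsmodels are available; the data here are simulated placeholders.

```python
# Static model: y_t = beta0 + beta1*Z_t + u_t, estimated by OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
z = rng.standard_normal(n)                   # explanatory variable Z_t
y = 2.0 + 0.5 * z + rng.standard_normal(n)   # dependent variable y_t

X = sm.add_constant(z)        # adds the intercept column (beta0)
model = sm.OLS(y, X).fit()
print(model.params)           # [beta0_hat, beta1_hat]
```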

4
Q

finite distributed lag model FDL

A

a regression model that includes both current and lagged values of an explanatory variable to capture delayed effects

y_t = \alpha_0 + \delta_0 Z_t + \delta_1 Z_{t-1} + u_t
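
A minimal FDL(1) estimation sketch with pandas and statsmodels, on simulated placeholder data; it also computes the long-run propensity as the sum of the delta coefficients.

```python
# FDL of order 1: y_t = alpha0 + delta0*Z_t + delta1*Z_{t-1} + u_t
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({"z": rng.standard_normal(n)})
df["y"] = 1.0 + 0.4 * df["z"] + 0.2 * df["z"].shift(1) + rng.standard_normal(n)

df["z_lag1"] = df["z"].shift(1)   # Z_{t-1}
df = df.dropna()                  # the first row has no lagged value

X = sm.add_constant(df[["z", "z_lag1"]])
fit = sm.OLS(df["y"], X).fit()
print(fit.params)
# long-run propensity: the sum of the delta coefficients
print("LRP:", fit.params["z"] + fit.params["z_lag1"])
```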

5
Q

order of FDL

A

the number of lagged values of the explanatory variable included in an FDL model.
if the order is 2, the model includes Z_t, Z_{t-1} and Z_{t-2}

6
Q

impact propensity/ impact multiplier

A

the immediate effect of a one-unit change in the explanatory variable Z on the dependent variable; in an FDL, this is \delta_0

7
Q

long-run propensity

A

the total cumulative effect of a permanent one-unit change in Z on y over time; in an FDL of order q, LRP = \delta_0 + \delta_1 + \cdots + \delta_q

8
Q

trending time series

A

a time series that shows a consistent upward or downward movement over time, such as GDP

9
Q

spurious regression

A

occurs when we ignore a trend and suggest a relationship when there is none; it leads to biased estimates when a common trend affects both the dependent and the independent variables. Adding the appropriate time trend to the regression eliminates the problem.

10
Q

linear trend

A

steady increase or decrease
trend_t = \alpha_0 + \alpha_1 t

11
Q

quadratic trend

A

U-shaped, accelerating or decelerating
trend_t = \alpha_0 + \alpha_1 t + \alpha_2 t^2

12
Q

exponential trend

A

rapid (percentage) growth
trend_t = \alpha_0 e^{\alpha_1 t}

13
Q

detrended variables

A

variables with long-term trends removed, to focus on short-term variations; helps avoid spurious regression results.
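
A minimal detrending sketch, assuming statsmodels: regress the series on a time trend and keep the residuals as the detrended variable.

```python
# Detrending: regress the series on a time trend, keep the residuals.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
t = np.arange(100)
y = 5.0 + 0.3 * t + rng.standard_normal(100)  # linear trend + noise

X = sm.add_constant(t)                 # trend_t = alpha0 + alpha1*t
trend_fit = sm.OLS(y, X).fit()
y_detrended = trend_fit.resid          # short-term variation around the trend
print(y_detrended[:5])
```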

14
Q

hump-shaped (inverted U-shaped) trend

A

a trend where a variable initially increases, reaches a peak and then decreases.

15
Q

U-shaped trend

A

a trend where a variable decreases initially, hits a minimum, and then increases

16
Q

quarterly data

A

time series data collected every 3 months

17
Q

seasonality

A

recurring patterns in a time series tied to specific times of the year

18
Q

deseasonalised data

A

data adjusted to remove seasonal effects, making it easier to analyse trends and other factors; obtained by extending the model with seasonal dummies

19
Q

seasonally adjusted data

A

similar to deseasonalised data, but explicitly adjusted to remove both upward and downward seasonal fluctuations.

20
Q

seasonal dummies

A

binary variables representing specific seasons in regression models, e.g. D1, D2, D3, D4 for quarterly data; one of them serves as the base season
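
A minimal sketch of quarterly seasonal dummies with pandas and statsmodels, on simulated placeholder data; Q1 is the omitted base season here.

```python
# Seasonal dummies for quarterly data; dropping D_1 makes Q1 the base season.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 80
quarter = np.tile([1, 2, 3, 4], n // 4)
seasonal_effect = np.array([0.0, 1.5, -0.5, 2.0])[quarter - 1]
y = 10.0 + seasonal_effect + rng.standard_normal(n)

dummies = pd.get_dummies(quarter, prefix="D", drop_first=True)  # drops D_1
X = sm.add_constant(dummies.astype(float))
fit = sm.OLS(y, X).fit()
print(fit.params)   # intercept = base-season mean; D_2..D_4 relative to Q1
```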

21
Q

base/benchmark season

A

the omitted season when using seasonal dummies. All the other seasonal effects are interpreted relative to this base season

22
Q

contrast variable

A

an alternative to seasonal dummies that avoids the dummy variable trap (perfect collinearity); expresses differences between seasons instead of using absolute levels.

23
Q

standardised OLS coefficient (regression)

A

obtained after standardising all the variables, i.e. converting them to have a mean of 0 and a standard deviation of 1. This makes the coefficients scale-independent, allowing comparison across variables with different units of measurement
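
A minimal sketch of standardised (beta) coefficients on simulated placeholder data; the zscore helper is a hypothetical convenience function, not a library call.

```python
# Standardised coefficients: z-score every variable, then run OLS.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 200
x1 = rng.normal(0, 1, n)       # small-scale regressor
x2 = rng.normal(0, 100, n)     # large-scale regressor
y = 3.0 + 2.0 * x1 + 0.01 * x2 + rng.standard_normal(n)

def zscore(a):
    # hypothetical helper: mean 0, standard deviation 1
    return (a - a.mean()) / a.std()

Xs = np.column_stack([zscore(x1), zscore(x2)])
fit = sm.OLS(zscore(y), Xs).fit()   # no constant needed: all means are zero
print(fit.params)                   # scale-free, directly comparable
```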

24
Q

relative importance measure

A

the higher the absolute value of the standardised OLS coefficient, the more important the particular regressor is in explaining the dependent variable

25
Q

stochastic time series analysis

A

studies time series with inherent randomness, focusing on autoregressive (past levels affect the present) and moving average (learning from past error terms) structures. Appropriate only for short-term forecasting; the series must be stable/stationary

26
Q

self-generative structure

A

the present value is influenced by its own past values and past shocks or errors.

27
Q

autoregressive process (AR)

A

the current value (y_t) depends on its own previous values (y_{t-1}, ..., y_{t-p}) and a random error term (u_t)
p is the lag order

28
Q

moving average process (MA)

A

the current value is influenced by past error terms (u_{t-1}, ..., u_{t-q})
q is the lag order

29
Q

ARMA (p,q) model

A

combines AR and MA components into a single model
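
A minimal sketch of simulating an ARMA(1,1) with statsmodels' ArmaProcess; the coefficient values are illustrative only. Note the sign convention: the AR polynomial is passed as [1, -phi1].

```python
# Simulate an ARMA(1,1): y_t = 0.6*y_{t-1} + u_t + 0.4*u_{t-1}
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

ar = np.array([1, -0.6])    # AR polynomial: 1 - 0.6L
ma = np.array([1, 0.4])     # MA polynomial: 1 + 0.4L
process = ArmaProcess(ar, ma)
y = process.generate_sample(nsample=500)
print(process.isstationary, process.isinvertible)  # True True for these values
```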

30
Q

ARIMA (p,i,q)

A

extends the ARMA model to handle non-stationary processes by differencing the data i times to achieve stationarity
p: order of AR
i: number of differences
q: order of MA
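
A minimal ARIMA-fitting sketch with statsmodels; the order (1, 1, 1) is an illustrative assumption, not a recommendation.

```python
# Fit an ARIMA(p, i, q); i=1 means the series is differenced once internally.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
y = np.cumsum(rng.standard_normal(300))   # an I(1) series (random walk)

model = ARIMA(y, order=(1, 1, 1)).fit()
print(model.summary())
forecast = model.forecast(steps=10)       # out-of-sample forecast
print(forecast)
```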

31
Q

stationary/stable time series

A

a time series is stationary if its mean, variance and autocovariance do not change over time

32
Q

autocovariance

A

the covariance between two values of the time series separated by k time units

33
Q

autocorrelation function ACF

A

measures the correlation between y_t and y_{t-k}, capturing both direct and indirect influences through intermediate values

34
Q

partial autocorrelation function PACF

A

measures the direct correlation between y_t and y_{t-k}, removing the indirect effects of the intermediate values

35
Q

correlogram

A

graph showing ACF and PACF for different lags
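
A minimal correlogram sketch with statsmodels and matplotlib; the AR(1) series is a simulated placeholder.

```python
# Correlogram: ACF and PACF side by side.
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(6)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.7 * y[t - 1] + rng.standard_normal()   # AR(1) simulation

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
plot_acf(y, lags=20, ax=axes[0])    # geometric decay for an AR(1)
plot_pacf(y, lags=20, ax=axes[1])   # single spike at lag 1
plt.show()
```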

36
Q

white noise

A

a random series with a constant expected value and constant variance whose successive values are uncorrelated and independent;
no autocorrelation exists between observations

37
Q

ljung-box test

A

tests whether residuals are white noise.
H0: no autocorrelation (white noise)
H1: there is at least one nonzero autocorrelation
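
A minimal Ljung-Box sketch with statsmodels; the residuals here are simulated stand-ins for model residuals.

```python
# Ljung-Box test: a large p-value means we cannot reject H0 (white noise).
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(7)
resid = rng.standard_normal(200)          # stand-in for model residuals

result = acorr_ljungbox(resid, lags=[10], return_df=True)
print(result)   # columns: lb_stat, lb_pvalue
```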

38
Q

random walk

A

an AR(1) process with a unit root; nonstationary, zero constant.
the current value depends solely on the previous value plus a random error: y_t = y_{t-1} + u_t
its ACF does not collapse to zero
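
A minimal random-walk simulation showing that the first difference recovers the white-noise shocks.

```python
# Random walk y_t = y_{t-1} + u_t: nonstationary, but its first
# difference is white noise.
import numpy as np

rng = np.random.default_rng(8)
u = rng.standard_normal(500)
y = np.cumsum(u)            # random walk: cumulative sum of shocks
dy = np.diff(y)             # first difference recovers the shocks u_t

print(y.var(), dy.var())    # y's dispersion grows with t; dy is stable
```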

39
Q

unit root process

A

a nonstationary process in which shocks persist indefinitely

40
Q

trend stationary process

A

a process stabilised by removing a deterministic trend, turning it into a stationary series

41
Q

difference stationary process

A

a nonstationary process that can be made stationary by taking first (or higher-order) differences, i.e. by differencing

42
Q

augmented dickey-fuller test

A

used to determine whether a time series has a unit root, addressing potential autocorrelation in the residuals by adding lagged differences of the dependent variable (\Delta y_{t-1}, ...).
problem: the lag order (p) is unknown and has to be determined somehow (e.g. with an information criterion)
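
A minimal ADF sketch with statsmodels; autolag="AIC" is one common way to pick the unknown lag order.

```python
# Augmented Dickey-Fuller test for a unit root.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(9)
y = np.cumsum(rng.standard_normal(300))   # random walk: has a unit root

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(y, autolag="AIC")
print(f"ADF stat={stat:.3f}, p-value={pvalue:.3f}, lags used={usedlag}")
# high p-value: cannot reject H0 (unit root) -> series looks nonstationary
```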

43
Q

unit root test

A

checks stationarity by testing for unit roots

44
Q

box-jenkins algorithm

A

for ARIMA modelling
step 1: check the stability (stationarity) of the process
step 2: if needed, stabilise the time series with an appropriate transformation (detrending = subtracting the trend; differencing)
step 3: determine the (p, q) orders for the ARIMA model and run the estimation on the transformed stationary data (Akaike information criterion, or z-tests)
step 4: model diagnostics: the residuals must be white noise (correlogram, Ljung-Box test)
step 5: forecast future out-of-sample values of the time series by inverting all the transformations which were previously applied (a condensed sketch follows below)
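
A condensed, illustrative pass through the steps above with statsmodels; the order grid and the 0.05 threshold are assumptions, not recommendations.

```python
# Condensed Box-Jenkins pass; `y` is a placeholder series.
import numpy as np
from statsmodels.tsa.stattools import adfuller
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(10)
y = np.cumsum(0.5 + rng.standard_normal(300))   # drifting random walk

# steps 1-2: stability check; difference once if a unit root is found
d = 1 if adfuller(y, autolag="AIC")[1] > 0.05 else 0

# step 3: pick (p, q) by AIC over a small grid (z-tests are an alternative)
best = min(
    ((p, q) for p in range(3) for q in range(3)),
    key=lambda pq: ARIMA(y, order=(pq[0], d, pq[1])).fit().aic,
)

# step 4: diagnostics - the residuals should be white noise
fit = ARIMA(y, order=(best[0], d, best[1])).fit()
print(acorr_ljungbox(fit.resid, lags=[10], return_df=True))

# step 5: forecasting inverts the differencing automatically
print(fit.forecast(steps=8))
```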

45
Q

integrated process I(i)

A

a series that becomes stationary after being differenced i times