Week 11: Chapters 10 and 11 Flashcards

1
Q

What are the differences between Time Series and Cross-Sectional data?

A
  • Temporal ordering of observations, where the past affects the future
  • Time series data is one realization (a single draw) of a stochastic process, so the observations are random variables
2
Q

What are the four types of time series models?

A
  • Static models
  • Distributed lag models (DL)
  • Autoregressive models (AR)
  • Autoregressive distributed lag models (ARDL); generic forms for each are sketched below
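
For reference, generic one-variable forms (a sketch; lag lengths are illustrative):
Static: yt = β0 + β1xt + ut
DL(2): yt = β0 + β1xt + β2xt-1 + β3xt-2 + ut
AR(1): yt = β0 + β1yt-1 + ut
ARDL: yt = β0 + β1yt-1 + β2xt + β3xt-1 + ut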
3
Q

What are the two main reasons for using a static model?

A
  • To see if changes in xt immediately affect yt
  • To see what the trade-off is between x and y
4
Q

What does β1 in the DL model represent?

A

The impact propensity (impact multiplier)

5
Q

Write the equation for a DL model

A

yt = β0 + β1xt + β2xt-1 + β3xt-2 + ut
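
A minimal estimation sketch (not part of the card), using simulated data with made-up coefficients:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
# simulated y with a distributed-lag structure (coefficient values are made up)
y = 1.0 + 0.8 * x + 0.5 * np.roll(x, 1) + 0.3 * np.roll(x, 2) + rng.normal(size=n)

df = pd.DataFrame({"y": y, "x": x})
df["x_lag1"] = df["x"].shift(1)   # x_{t-1}
df["x_lag2"] = df["x"].shift(2)   # x_{t-2}
df = df.dropna()                  # drops the first two rows, which lack lags

# OLS estimate of yt = b0 + b1 xt + b2 xt-1 + b3 xt-2 + ut
dl = smf.ols("y ~ x + x_lag1 + x_lag2", data=df).fit()
print(dl.params)                  # the coefficient on x is the impact propensity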

6
Q

Equation for AR model?

A

yt = β0 + β1yt-1 + ut
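
A minimal sketch (not from the card) of estimating an AR(1) by regressing yt on its own lag, with simulated data:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 300
y = np.zeros(n)
for t in range(1, n):
    y[t] = 2.0 + 0.6 * y[t - 1] + rng.normal()  # made-up beta0 = 2.0, beta1 = 0.6

# regress yt on yt-1: yt = b0 + b1 yt-1 + ut
ar1 = sm.OLS(y[1:], sm.add_constant(y[:-1])).fit()
print(ar1.params)  # estimates of (b0, b1)

statsmodels also has an AutoReg class that fits the same model, if preferred.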

7
Q

What does an AR model do?

A

Uses past values of y to explain the contemporaneous value of y

8
Q

What are the first three TS assumptions that result in the OLS estimators being unbiased?

A

TS1: The regression is linear in its parameters
TS2: No perfect collinearity
TS3: Zero Conditional Mean

9
Q

Under TS3: What is the formal definition of STRICT exogeneity?

A

Corr(xsj, ut) = 0, even when s ≠ t

10
Q

Under TS3: What is the formal definition of Contemporaneous exogeneity?

A

Corr(xtj, ut) = 0, for all values of j

11
Q

What are assumptions TS1-TS5 (under which OLS is BLUE)
and TS6?

A
  1. Linear in its Parameters
  2. No perfect collinearity
  3. Zero Conditional Mean
  4. Homoskedasticity
  5. No serial correlation
  6. Normality
12
Q

TS5: No serial correlation definition

A

Corr(ut, us) = 0, for all t ≠ s
- Errors are not correlated over time
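
Two common diagnostics (supplementary, not part of the card), sketched with placeholder residuals:

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(2)
resid = rng.normal(size=100)  # stand-in for OLS residuals u_hat_t

# AR(1) test: regress u_hat_t on u_hat_{t-1}; a significant slope suggests serial correlation
ar_test = sm.OLS(resid[1:], sm.add_constant(resid[:-1])).fit()
print(ar_test.tvalues[1])

# Durbin-Watson statistic: values near 2 are consistent with no first-order serial correlation
print(durbin_watson(resid))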

13
Q

Why was TS5 not a problem in cross-sectional data?

A

Due to MLR2: random sampling ensured that ui and uh were independent across observations

14
Q

In deterministic trends, what do β1 > 0 and β1 < 0 mean?

A

𝛽1 > 0: Upward trend
𝛽1 < 0: Downward trend
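
For reference, β1 here is the slope on time in the usual linear trend model:
yt = β0 + β1t + et, t = 1, 2, ...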

15
Q

What is a spurious regression?

A

It is finding a (statistically significant) relationship between two or more variables simply because each is growing over time
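
A classic illustration (a sketch, not from the card): regressing one independent random walk on another often produces a 'significant' t-statistic and a non-trivial R^2 even though there is no true relationship.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200
y = np.cumsum(rng.normal(size=n))  # random walk 1
x = np.cumsum(rng.normal(size=n))  # random walk 2, independent of the first

spurious = sm.OLS(y, sm.add_constant(x)).fit()
print(spurious.tvalues[1], spurious.rsquared)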

16
Q

Write the final equation for de-trending variables

A

see notes
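
Since the card defers to notes, here is a minimal sketch of the usual two-step idea (regress each series on a time trend, keep the residuals), using simulated data:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 150
t = np.arange(n)
x = 0.5 * t + rng.normal(size=n)                  # made-up trending regressor
y = 2.0 + 0.3 * t + 1.5 * x + rng.normal(size=n)  # made-up trending outcome

trend = sm.add_constant(t)
y_dt = y - sm.OLS(y, trend).fit().fittedvalues  # detrended y (residuals)
x_dt = x - sm.OLS(x, trend).fit().fittedvalues  # detrended x (residuals)

# slope on detrended x equals the slope from regressing y on x and t directly (Frisch-Waugh)
print(sm.OLS(y_dt, sm.add_constant(x_dt)).fit().params)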

17
Q

Why are R^2 values typically higher in time series regressions?

A
  1. Time series data is often in aggregate form, so there is less variance to explain
  2. Trending dependent variables
18
Q

When is a stochastic process stationary?

A

If the probability distribution of the stochastic process does not change over time
- If we take any collection of terms of the process and shift them ahead h time periods, the joint probability distribution is unchanged
- The correlation between adjacent terms is the same across all time periods
- Constant mean, constant variance and no trends/seasonality

19
Q

Draw the graphs for perfect stationarity, non-constant mean, non-constant variance and seasonality

A

notes

20
Q

When does covariance stationarity hold?

A
  • E(xt) is constant
  • Var(xt) is constant
  • for any t, h, Cov(xt, xt+h) depends only on h and not on t
21
Q

What is weak dependence?

A

Places restrictions on how strongly related the random variables xt and xt+h are as h increases

22
Q

A stationary series is weakly dependent if:

A

Xt and Xt+h are 'almost' independent as h increases
- Corr(xt, xt+h) goes to zero as h increases

23
Q

What does a weakly dependent time series look like?
- Xt:

A

xt = et + α1et-1
xt: Moving average process of order one, MA(1)
- a weighted average of et and et-1 (autocorrelations sketched below)
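
Why this is weakly dependent (standard result, with Var(et) = σe^2):
Corr(xt, xt+1) = α1 / (1 + α1^2), and Corr(xt, xt+h) = 0 for h ≥ 2, so the dependence dies out after one period.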

24
Q

What does a weakly dependent time series look like?
- Yt:

A

yt = ρ1yt-1 + et
- The autoregressive process of order one, AR(1)
- When |ρ1| < 1, the AR(1) process is stable and weakly dependent (autocorrelations sketched below)
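
Standard result for the stable, covariance-stationary AR(1):
Corr(yt, yt+h) = ρ1^h, which goes to zero as h increases because |ρ1| < 1.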

25
Q

Can a trending series be weakly dependent?

A

Yes, as long as the time trend is controlled for

26
Q

What is a trend stationary process?

A

A series that is stationary around its time trend and weakly dependent

27
Q

New TS assumptions under asymptotic properties of OLS:

A

TS1’: Linearity and weak dependence
TS2': No perfect collinearity
TS3’: Zero Conditional Mean
TS4’: Homoskedasticity
TS5’: No serial correlation

28
Q

What happens when TS1’ - TS3’ hold?

A

OLS estimators are consistent: plim(b^j) = bj

29
Q

TS3’

A

Zero conditional mean,
Now only contemporaneous exogeneity is required: E(ut | xt) = 0

30
Q

TS4’

A

Homoskedasticity,
Var(ut | xt) = σ^2
Errors are contemporaneously homoskedastic

31
Q

TS5’

A

No serial correlation
Corr(ut, us | xt, xs) = 0, for all t ≠ s

32
Q

What happens when TS1’-5’ HOLD?

A

OLS estimators are asymptotically normally distributed

33
Q

What's the most common violation of weak dependence?

A

Problems will arise from highly persistent data

34
Q

Random walk equation

A

yt = yt-1 + et,
where the et are i.i.d. shocks and y0 is independent of et for all t ≥ 1
- yt is an accumulation of all past shocks and an initial value
- the effect of the shocks will be contained in the series forever
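
A short sketch (not from the card): a simulated random walk keeps sample autocorrelations near one even at long lags, unlike a stable AR(1).

import numpy as np

rng = np.random.default_rng(5)
y = np.cumsum(rng.normal(size=1000))  # random walk: yt = yt-1 + et, y0 = 0

# sample correlation between yt and yt+h stays close to 1 as h grows
for h in (1, 10, 50):
    print(h, round(float(np.corrcoef(y[:-h], y[h:])[0, 1]), 3))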

35
Q

Show that the variance of yt changes over time in a random walk

A

notes
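
A standard derivation sketch (i.i.d. shocks with Var(et) = σe^2, independent of y0):
yt = y0 + e1 + e2 + ... + et, so Var(yt) = Var(y0) + t·σe^2, which grows with t rather than staying constant.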

36
Q

What are the two main implications of the random walk?

A
  • Random walks are not covariance stationary
  • Random walks are not weakly dependent
37
Q

Random walk with a drift

A

yt = α0 + yt-1 + et, where α0 is the drift term
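
Repeated substitution (a standard sketch) shows the drift puts a linear trend in the mean:
yt = α0t + y0 + e1 + ... + et, so E(yt) = α0t + E(y0).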

38
Q

What does an I(0), weakly dependent series mean?

A
  • I(0): integrated of order zero
  • Nothing needs to be done to the series before using it in a regression
  • Their averages satisfy the standard limit theorems
39
Q

What's the benefit of differencing a time series?

A
  • It removes any time trends
  • See notes for more (and the sketch below)
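
A sketch (not from the card): first-differencing a random walk with drift leaves the drift plus an i.i.d. shock, which is I(0) and weakly dependent.

import numpy as np

rng = np.random.default_rng(6)
e = rng.normal(size=500)
y = np.cumsum(0.2 + e)  # random walk with drift (alpha0 = 0.2 is made up)

dy = np.diff(y)  # delta-yt = alpha0 + et: an I(0), weakly dependent series
print(round(float(dy.mean()), 3))                           # near the drift 0.2
print(round(float(np.corrcoef(dy[:-1], dy[1:])[0, 1]), 3))  # near zero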