Tutorial 1 Flashcards

1
Q

What are the implications of strict exogeneity for the relationship between explanatory variables and the error term in a time-series regression?

A
  • the error term is mean independent of the explanatory variables in all time periods (past, present and future), which implies the regressors are uncorrelated with the error term at every lead and lag
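
In symbols (a standard formulation; the exact notation may differ from the lectures), for the time-series regression y_t = x_t'β + u_t, strict exogeneity is

    E(u_t | x_1, ..., x_T) = 0   for all t = 1, ..., T,

which implies Cov(x_s, u_t) = 0 for every s and t.
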
2
Q

What assumption is needed to derive the OLS estimator and why?

A
  • no perfect multicollinearity
  • OLS estimation involves inverting the X'X matrix; under perfect multicollinearity X'X is singular (non-invertible), so the estimator cannot be computed
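
A minimal sketch of why this matters (standard matrix notation for the model y = Xβ + u; not necessarily the course's exact derivation): the OLS estimator is

    β̂ = (X'X)^(-1) X'y

so if one column of X is an exact linear combination of the others, X'X has no inverse and β̂ is not defined.
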
3
Q

Conditions for Unbiased OLS Estimator

A
  • linearity
  • strict exogeneity
  • no perfect multicollinearity
  • random sampling
4
Q

What is the finite-sample sampling distribution of an estimator?

A

the probability distribution of the estimator’s values across different random samples drawn from the same population

5
Q

Why is the sampling distribution of interest?

A
  • allows us to draw statistical inference (conduct hypothesis tests about the unknown population parameter)
6
Q

What assumptions are needed about error terms to derive the finite sampling distribution?

A
  • error terms are normally distributed with mean zero and constant variance
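
In symbols (matching the card above), the assumption is that for each t, u_t | X ~ N(0, σ²): the errors are normally distributed with mean zero and the same variance σ² in every period.
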
7
Q

Consequence of weak exogeneity

A
  • the OLS estimator is no longer unbiased: unbiasedness requires the exogeneity condition to hold with respect to the regressors in all time periods (strict exogeneity), not just with respect to the current period's regressors
8
Q

What is meant by consistency and large sample (asymptotic) sampling distribution?

A
  • consistency means the estimator converges (in probability) to the true parameter value as the sample size increases; it is the large-sample analogue of unbiasedness in finite samples
  • the asymptotic (large-sample) sampling distribution of an estimator is the probability distribution it approaches as the sample size goes to infinity; it serves as an approximation to the true finite-sample sampling distribution when the sample is large
9
Q

Assumptions for consistency and normal asymptotic sampling distribution?

A

for consistency:
  • linearity
  • weak exogeneity
  • no perfect multicollinearity
in addition, for asymptotic normality:
  • homoscedasticity
  • no autocorrelation

10
Q

What is residual autocorrelation?

A
  • the error terms are correlated with one another across observations (typically across time periods), i.e. Cov(u_t, u_s) ≠ 0 for some t ≠ s
11
Q

Test for residual autocorrelation?

A
  • Durbin-Watson test
  • Breusch-Godfrey test
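
A minimal sketch of how these tests might be run in Python with statsmodels (the data are made up purely for illustration; this is not the tutorial's code):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.stattools import durbin_watson
    from statsmodels.stats.diagnostic import acorr_breusch_godfrey

    # Illustrative data
    rng = np.random.default_rng(0)
    x = rng.normal(size=200)
    y = 1.0 + 0.5 * x + rng.normal(size=200)

    X = sm.add_constant(x)      # regressors including an intercept
    res = sm.OLS(y, X).fit()    # fit the regression by OLS

    # Durbin-Watson statistic: values near 2 suggest no first-order autocorrelation
    print(durbin_watson(res.resid))

    # Breusch-Godfrey LM test for autocorrelation up to a chosen lag order (4 here is arbitrary)
    lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=4)
    print(lm_stat, lm_pval)
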
12
Q

How might you deal with residual autocorrelation?

A
  • if the error terms are serially correlated, we can still apply OLS (the estimator remains unbiased), but we need to use Newey-West standard errors, as they are robust to both heteroscedasticity and autocorrelation (see the sketch below)
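
A minimal sketch of Newey-West (HAC) standard errors in statsmodels (illustrative data; the lag length 4 is an arbitrary placeholder):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    x = rng.normal(size=200)
    y = 1.0 + 0.5 * x + rng.normal(size=200)
    X = sm.add_constant(x)

    # The OLS point estimates are unchanged; only the covariance matrix is
    # replaced by the heteroscedasticity- and autocorrelation-consistent (HAC) one.
    res_nw = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
    print(res_nw.summary())
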
13
Q

What is heteroscedasticity?

A
  • the variance of the error term is not constant, conditional on X
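
In symbols, heteroscedasticity means Var(u_i | x_i) = σ_i², with the conditional variance differing across observations, in contrast to the homoscedastic case Var(u_i | x_i) = σ² for all i.
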
14
Q

What are the consequences of heteroscedasticity?

A
  • the OLS estimator remains unbiased but is no longer efficient (it is no longer BLUE)
  • the usual OLS standard errors are biased, so standard inference is invalid
  • To resolve this, apply robust standard errors
15
Q

Heteroscedasticity tests

A
  • Breusch-Pagan test
  • White test
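
A minimal sketch of both tests with statsmodels (illustrative data; each function returns the LM statistic, its p-value, the F statistic and its p-value):

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan, het_white

    rng = np.random.default_rng(2)
    x = rng.normal(size=200)
    y = 1.0 + 0.5 * x + rng.normal(size=200) * (1 + x**2)  # error variance depends on x
    X = sm.add_constant(x)

    res = sm.OLS(y, X).fit()

    bp_lm, bp_pval, bp_f, bp_fpval = het_breuschpagan(res.resid, X)
    w_lm, w_pval, w_f, w_fpval = het_white(res.resid, X)
    print(bp_lm, bp_pval)
    print(w_lm, w_pval)
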
16
Q

How might you deal with heteroscedasticity?

A
  • if the error terms are heteroscedastic, we can still apply OLS (the estimator remains unbiased), but we need to use heteroscedasticity-robust standard errors (see the sketch below)
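
A minimal sketch of heteroscedasticity-robust standard errors in statsmodels (HC1 is just one common choice of robust covariance estimator):

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(3)
    x = rng.normal(size=200)
    y = 1.0 + 0.5 * x + rng.normal(size=200) * (1 + np.abs(x))  # heteroscedastic errors
    X = sm.add_constant(x)

    # Same OLS coefficients as an ordinary fit; only the standard errors change.
    res_robust = sm.OLS(y, X).fit(cov_type="HC1")
    print(res_robust.summary())
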
17
Q

What does weak exogeneity mean?

A
  • if the explanatory variables are weakly exogenous, the error term is mean independent of the current (contemporaneous) explanatory variables, though not necessarily of their values in other periods
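
In the same notation as the strict exogeneity condition above, weak exogeneity can be written as E(u_t | x_t) = 0 for each t, in contrast to the strict condition E(u_t | x_1, ..., x_T) = 0.
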
18
Q

Breusch-Pagan Procedure

A
  • Fit the regression and obtain the residuals
  • Square the residuals and regress them on the explanatory variables (the auxiliary regression)
  • Calculate the LM test statistic from the auxiliary regression
  • Compare it with the critical value from the chi-squared distribution, with degrees of freedom equal to the number of explanatory variables in the auxiliary regression (see the sketch below)
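
A minimal sketch of the procedure carried out step by step in Python (illustrative data; in practice statsmodels' het_breuschpagan performs the same computation):

    import numpy as np
    import statsmodels.api as sm
    from scipy import stats

    rng = np.random.default_rng(4)
    x = rng.normal(size=200)
    y = 1.0 + 0.5 * x + rng.normal(size=200) * (1 + x**2)
    X = sm.add_constant(x)

    # Step 1: fit the regression and obtain the residuals
    res = sm.OLS(y, X).fit()

    # Step 2: regress the squared residuals on the explanatory variables
    aux = sm.OLS(res.resid**2, X).fit()

    # Step 3: LM test statistic, LM = n * R^2 of the auxiliary regression
    n = len(y)
    lm = n * aux.rsquared

    # Step 4: compare with the chi-squared critical value (df = number of
    # explanatory variables in the auxiliary regression, excluding the constant)
    crit = stats.chi2.ppf(0.95, df=X.shape[1] - 1)
    print(lm, crit, lm > crit)  # reject H0 (homoscedasticity) if LM > crit
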
19
Q

Breusch-Pagan Hypotheses

A
  • H0: Homoscedasticity
  • H1: Heteroscedasticity
20
Q

Breusch-Pagan Decision Rule

A
  • If LM > critical value => reject null
  • Otherwise, fail to reject
21
Q

Breusch-Pagan Test Statistic

A

LM = n * R²
  • n = sample size
  • R² = coefficient of determination from the auxiliary regression of the squared residuals on the explanatory variables
  • under H0, LM is asymptotically chi-squared, with degrees of freedom equal to the number of explanatory variables in the auxiliary regression