Advanced Econometrics Flashcards

1
Q

Explain the 2 estimation techniques seen in this course. For each estimation technique, give one economic application.

A

- OLS: estimated parameters are obtained by minimizing the sum of squared residuals.
The OLS estimator: b = (X'X)^(-1) X'Y
Example of application: any classical static linear regression / AR model

- ML: estimated parameters are obtained by maximizing the likelihood function (the probability of observing the data).

Example of application: MA, ARMA, binary response models
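A minimal sketch of both techniques in Python (an illustration on simulated data, not an example from the course): OLS via the closed form b = (X'X)^(-1) X'y, and ML via numerical maximization of a logit log-likelihood (a binary response model).

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])

# OLS: minimize the sum of squared residuals (closed form)
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
b_ols = np.linalg.solve(X.T @ X, X.T @ y)   # b = (X'X)^(-1) X'y

# ML: maximize the logit log-likelihood numerically
p = 1 / (1 + np.exp(-(X @ np.array([0.5, -1.0]))))
d = rng.binomial(1, p)

def neg_loglik(beta):
    # negative logit log-likelihood (minimizing it = maximizing the likelihood)
    xb = X @ beta
    return -np.sum(d * xb - np.log1p(np.exp(xb)))

b_ml = minimize(neg_loglik, x0=np.zeros(2)).x
print("OLS:", b_ols, "  ML (logit):", b_ml)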

2
Q

{yt} is a martingale difference sequence; is {yt} an independent white noise (IWN) process?

A

{yt} is a martingale difference sequence, which implies that it is also a white noise process. However, it is not necessarily an independent white noise, since the second moment may depend on past values of the series (as the sketch below illustrates).
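A minimal sketch, assuming an ARCH(1) process as the illustration (my choice of example, not necessarily the course's): y_t = sigma_t * e_t with sigma_t^2 = w + a*y_{t-1}^2 is an m.d.s. and white noise, yet y_t^2 is autocorrelated, so the process is not IWN.

import numpy as np

rng = np.random.default_rng(1)
n, w, a = 20_000, 0.5, 0.5
y = np.zeros(n)
for t in range(1, n):
    sigma2 = w + a * y[t - 1] ** 2          # conditional variance depends on the past
    y[t] = np.sqrt(sigma2) * rng.normal()

def autocorr(x, lag):
    x = x - x.mean()
    return (x[lag:] * x[:-lag]).sum() / (x ** 2).sum()

print("corr(y_t, y_t-1)     ~", round(autocorr(y, 1), 3))       # ~0: white noise
print("corr(y_t^2, y_t-1^2) ~", round(autocorr(y ** 2, 1), 3))  # >0: not independent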

3
Q

Explain the 3 test statistics: Wald, Lagrange Multiplier and Likelihood ratio test.

A
  • Wald test: If the restriction a(θ) = 0 is valid, a(θ^) should be close to zero, since the MLE θ^ is a consistent estimator. We therefore base the test on a(θ^) and reject the null hypothesis if the estimated restriction is significantly different from zero.
  • Lagrange Multiplier: If the restriction is valid, the restricted estimator should be near the point that maximizes the log-likelihood function, so the slope of the log-likelihood function at the restricted estimate should be close to zero. We base the test on the slope (score) of the log-likelihood function where the function is maximized subject to the restriction.
  • LR: If the restriction is valid, then imposing it should not lead to a large decrease in the log-likelihood function. We base the test on the difference Qn(θ^) − Qn(θ~), where θ^ is the unconstrained estimator and θ~ is the constrained estimator.
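Since the three statistics are asymptotically equivalent under the null, the LR version is the easiest to illustrate. A minimal sketch (a hypothetical logit example, not from the course): 2n[Qn(θ^) − Qn(θ~)] is twice the difference in maximized log-likelihoods, here testing H0: beta_2 = 0.

import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 1000
x1, x2 = rng.normal(size=n), rng.normal(size=n)
d = rng.binomial(1, 1 / (1 + np.exp(-(0.3 + 0.8 * x1 + 0.5 * x2))))

llf_u = sm.Logit(d, sm.add_constant(np.column_stack([x1, x2]))).fit(disp=0).llf  # unrestricted
llf_r = sm.Logit(d, sm.add_constant(x1)).fit(disp=0).llf                         # restricted: beta_2 = 0

LR = 2 * (llf_u - llf_r)                      # ~ chi2(1) under H0
print("LR =", round(LR, 2), " p-value =", round(chi2.sf(LR, df=1), 4))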


4
Q

Explain the general meaning of the autocorrelation coefficients and the partial autocorrelation coefficients. What additional information do we obtain with partial autocorrelation?

A

The j-th autocorrelation coefficient of the covariance-stationary scalar process {yt}, denoted ρj, is ρj = Cov(yt, yt-j) / Var(yt): the correlation between yt and yt-j.

Part of this correlation comes from the effect of the intervening variables yt-1, …, yt-j+1. The j-th partial autocorrelation coefficient (PAC), denoted ajj, is the correlation between yt and yt-j once the effect of these intervening variables has been removed.

Example: in the AR(1) model, Corr(yt, yt-2) = ϕ1^2, which is not equal to zero. Even though yt-2 does not appear in the model, there is a "transmission" effect: Corr(yt, yt-2) = Corr(yt, yt-1) * Corr(yt-1, yt-2).
The autocorrelation coefficient ρj includes these transmission effects, whereas ajj eliminates the effect of the intervening variables.
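A minimal sketch checking this on a simulated AR(1) with statsmodels (illustrative values): the ACF decays geometrically through the transmission effects, while the PACF cuts off after lag 1.

import numpy as np
from statsmodels.tsa.stattools import acf, pacf

rng = np.random.default_rng(3)
n, phi = 5000, 0.7
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()

print("ACF  lags 1-3:", np.round(acf(y, nlags=3)[1:], 3))   # ~ phi, phi^2, phi^3
print("PACF lags 1-3:", np.round(pacf(y, nlags=3)[1:], 3))  # ~ phi, 0, 0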

5
Q

Explain why we use autocorrelation- and heteroscedasticity-robust standard errors.

A

Heteroscedasticity-robust standard errors are the only valid standard errors when the data are heteroscedastic. Even if the null of homoscedasticity is not rejected, the heteroscedasticity-robust variance estimator is still consistent, so little is lost by using it.
When dealing with autocorrelated time series, it is also a more prudent strategy to employ standard errors robust to autocorrelation (in case some autocorrelation remains in the residuals).
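A minimal sketch with statsmodels (hypothetical simulated data): the same OLS fit reported with classical, heteroscedasticity-robust (HC1), and heteroscedasticity-and-autocorrelation-robust (HAC/Newey-West) standard errors.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)
y = 1 + 2 * x + rng.normal(size=n) * (1 + np.abs(x))   # heteroscedastic errors
X = sm.add_constant(x)

classical = sm.OLS(y, X).fit()
hc = sm.OLS(y, X).fit(cov_type="HC1")
hac = sm.OLS(y, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})

print("classical SEs:", np.round(classical.bse, 3))
print("HC1 SEs:      ", np.round(hc.bse, 3))
print("HAC SEs:      ", np.round(hac.bse, 3))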

6
Q

Explain how we obtain the Portmanteau test, what it means, and what it implies for AR(1) estimation results.

A

Here we test the null of no autocorrelation in the residuals of the AR regression up to order p. The statistic is based on the squared residual autocorrelations, e.g. the Ljung-Box statistic Q = n(n+2) Σ_{j=1..p} ρ^j^2 / (n−j), which is asymptotically χ²(p) under the null (with degrees of freedom adjusted for the number of estimated AR parameters). The presence of autocorrelation in the residuals makes the OLS estimated parameters inconsistent, because the lagged dependent variable is then correlated with the error term, so the regressors in the AR equation are not predetermined.
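A minimal sketch (illustrative, assuming statsmodels): fit an AR(1) and run a Ljung-Box portmanteau test on its residuals (H0: no residual autocorrelation up to lag 10).

import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(5)
n = 1000
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + rng.normal()

res = AutoReg(y, lags=1).fit()
print(acorr_ljungbox(res.resid, lags=[10]))   # small p-value => autocorrelation remains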

7
Q

Does it help that we have Newey-West standard errors?

A

No, the autocorrelation-robust standard errors do not help with the inconsistency of the parameter estimates: they correct the standard errors, not the point estimates. Therefore, we absolutely need to remove the autocorrelation from the residuals (for instance by adding lags) when we estimate by OLS.
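A minimal sketch of this point (hypothetical simulation): an AR(1) whose error term is itself autocorrelated. The OLS slope stays away from the true value even in a very large sample, and switching to Newey-West (HAC) standard errors changes the standard errors only, not the estimate.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 100_000
y, u = np.zeros(n), np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal()   # autocorrelated error term
    y[t] = 0.5 * y[t - 1] + u[t]

X = sm.add_constant(y[:-1])
ols = sm.OLS(y[1:], X).fit()
hac = sm.OLS(y[1:], X).fit(cov_type="HAC", cov_kwds={"maxlags": 10})
print("true phi = 0.5, OLS estimate =", round(ols.params[1], 3))   # ~0.8: inconsistent
print("OLS SE =", round(ols.bse[1], 4), " HAC SE =", round(hac.bse[1], 4))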

8
Q

Explain the difference between the assumption of strict exogeneity and the assumption of predetermined regressors.

A

The assumption of strict exogeneity is E(ei | X) = 0, and implies (by the Law of Total Expectations) that E(ei xj) = 0 for all i, j. This means that the error term of individual i is orthogonal to its own regressors, as well as to the regressors of all other individuals. It also means that the regressors of individual i are orthogonal to the error terms of all individuals.
The assumption of predetermined regressors is E(ei xi) = 0 for all i. This assumption is weaker than strict exogeneity, as it only requires that the error term of individual i is orthogonal to its own regressors.
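The implication follows in one line from the Law of Total Expectations; written out in LaTeX (a worked step, using the card's notation):

\begin{align*}
E(e_i x_{jk}) &= E\big[E(e_i x_{jk} \mid X)\big] && \text{(Law of Total Expectations)} \\
              &= E\big[x_{jk}\, E(e_i \mid X)\big] && \text{($x_{jk}$ is a function of $X$)} \\
              &= E[x_{jk} \cdot 0] = 0 && \text{for all } i, j, k.
\end{align*}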

9
Q

What do strict exogeneity, predetermined regressors and the m.d.s. assumption imply?

A

Strict exogeneity implies E(ei yj-1) = 0 for all i, j; each regressor, at any date, is orthogonal to the error term at all dates.

Predetermined regressors means E(ei yi-1) = 0 for all i; the error terms are orthogonal to the current regressors (here yi-1).

The assumption of an m.d.s. for gi = ei yi-1 implies E(ei yj-1) = 0 for all i and all j ≤ i; the error terms are orthogonal to current and past regressors.
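A minimal sketch (hypothetical simulation) showing why the regressor of an AR(1) is predetermined but not strictly exogenous: the error is uncorrelated with the past regressor yt-1, but correlated with yt, which is the regressor of the next observation.

import numpy as np

rng = np.random.default_rng(7)
n = 100_000
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + e[t]

print("corr(e_t, y_t-1) ~", round(np.corrcoef(e[1:], y[:-1])[0, 1], 3))  # ~0: predetermined
print("corr(e_t, y_t)   ~", round(np.corrcoef(e[1:], y[1:])[0, 1], 3))   # >0: strict exogeneity fails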
