Econometrics 4: ML, Logit and Probit Flashcards
What is a parametric model? How does it differ from a non-parametric or semi-parametric model?
In a parametric model, we specify a functional form for the relationship of interest and estimate the finite set of parameters that best fit this form to the data. Examples of parametric estimators include OLS, NLLS, and ML.
Nonparametric models instead estimate the function itself, without imposing a functional form.
Semiparametric models combine the two: part of the model is given a parametric form, while the rest is left unspecified.
What does it mean for a parametric model to be ‘nonlinear in the variables’, as opposed to ‘nonlinear in the parameters’?
If a parametric model can be written in the form f(y) = ∑ β_i g_i(x_1, …, x_n), it is linear in the parameters. If, in addition, f and the g_i are themselves linear functions, the model is also linear in the variables.
Can we use OLS to estimate models that are nonlinear in the variables, but not in the parameters?
Yes. Such a model is still linear in the parameters, so OLS applies directly once the transformed variables (e.g. x^2 or ln x) are constructed as regressors.
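A minimal sketch (simulated data and NumPy assumed; the quadratic model and its parameter values are illustrative) of estimating a model that is nonlinear in the variables but linear in the parameters by OLS:

```python
import numpy as np

# Quadratic model y = 1 + 2x - 0.5x^2 + e: nonlinear in x,
# but linear in the parameters (1, 2, -0.5).
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
y = 1 + 2 * x - 0.5 * x**2 + rng.normal(scale=0.3, size=200)

# OLS applies directly once x^2 is included as an extra regressor.
X = np.column_stack([np.ones_like(x), x, x**2])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # roughly [1, 2, -0.5]
```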
Suppose we estimate the model lny ~ x. How do we interpret β?
When x increases by one unit, y increases by approximately 100β% (exactly, y is multiplied by e^β).
Suppose we estimate the model y ~ ln x. How do we interpret β?
When x increases by 1%, y increases by approximately 0.01β units.
Suppose we estimate ln y ~ ln x. How do we interpret β?
β is the elasticity of y with respect to x: a 1% increase in x is associated with approximately a β% change in y.
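A quick worked derivation for the log-log case (the other two interpretations follow from the same differentiation argument): if ln y = β_0 + β ln x + ε, then d ln y / d ln x = (dy/y) / (dx/x) = β, so a 1% change in x corresponds to roughly a β% change in y.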
What is nonlinear least squares?
Nonlinear least squares is an extremum estimator that estimates the parameters θ by minimising the sum of squared residuals ∑_i (y_i − g(x_i, θ))^2.
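A minimal NLLS sketch with simulated data; the exponential model, its parameter values, and the use of scipy.optimize.least_squares are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical nonlinear model: y = theta_0 * exp(theta_1 * x) + e.
rng = np.random.default_rng(1)
x = rng.uniform(0, 2, size=300)
y = 1.5 * np.exp(0.8 * x) + rng.normal(scale=0.2, size=300)

def residuals(theta):
    # NLLS chooses theta to minimise the sum of these squared residuals.
    return y - theta[0] * np.exp(theta[1] * x)

result = least_squares(residuals, x0=np.array([1.0, 0.5]))
print(result.x)  # roughly [1.5, 0.8]
```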
What is an ML estimator?
Maximum likelihood estimators choose the parameter values that maximise the likelihood function, i.e. the joint probability (or density) of the observed sample viewed as a function of the parameters.
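A minimal ML sketch, assuming a normal sample (the distribution and parameter values are illustrative): estimate the mean and standard deviation by minimising the negative log-likelihood.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
y = rng.normal(loc=3.0, scale=1.5, size=500)

def neg_log_likelihood(params):
    mu, log_sigma = params        # parameterise sigma on the log scale
    sigma = np.exp(log_sigma)     # so the optimiser stays unconstrained
    return np.sum(0.5 * np.log(2 * np.pi * sigma**2)
                  + (y - mu)**2 / (2 * sigma**2))

res = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]))
print(res.x[0], np.exp(res.x[1]))  # roughly 3.0 and 1.5
```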
Why is linear regression poorly suited for binary choice models?
- The error term is heteroscedastic by construction: with a binary outcome, Var(ε|x) = p(x)(1 − p(x)) varies with x.
- Predicted probabilities are not constrained to lie between 0 and 1 (see the sketch below).
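A small simulation sketch of the second point (the logit data-generating process and all numbers are illustrative assumptions): a linear probability model fit by OLS typically produces some fitted "probabilities" outside [0, 1].

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=500)
p = 1 / (1 + np.exp(-(0.5 + 2.5 * x)))         # true probabilities (logit DGP)
y = (rng.uniform(size=500) < p).astype(float)  # binary outcome

# Linear probability model: regress y on a constant and x by OLS.
X = np.column_stack([np.ones_like(x), x])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta_hat

print((fitted < 0).sum(), (fitted > 1).sum())  # typically both nonzero
```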
What is pseudo-R^2?
Pseudo-R^2 = 1 − l(θ̂)/l_0, where l(θ̂) is the maximised log-likelihood of the fitted model and l_0 is the log-likelihood of a model containing only a constant.
Loosely, it measures how much of the fit the model achieves beyond what a constant alone can explain.
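A minimal sketch of computing McFadden's pseudo-R^2, assuming a binary outcome y and fitted probabilities p_hat from some binary choice model (e.g. a logit):

```python
import numpy as np

def mcfadden_pseudo_r2(y, p_hat):
    """McFadden's pseudo-R^2: 1 - l(theta_hat) / l_0."""
    ll_model = np.sum(y * np.log(p_hat) + (1 - y) * np.log(1 - p_hat))
    p_bar = y.mean()  # constant-only model: fitted probability is the sample mean
    ll_null = np.sum(y * np.log(p_bar) + (1 - y) * np.log(1 - p_bar))
    return 1 - ll_model / ll_null
```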
Are the ML estimators of binary choice models asymptotically normal?
Yes.
Explain how Gauss-Newton approximation works.
The first-order conditions of a nonlinear estimation problem such as NLLS form a system of nonlinear equations that usually has no closed-form solution; the Gauss-Newton method approximates the solution numerically.
We take a first-order Taylor approximation of the regression function around an initial estimate, which turns the problem into a linear regression that can be estimated by OLS. The OLS estimate becomes the new expansion point, and we iterate until the estimates converge.
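A minimal Gauss-Newton sketch for the same hypothetical exponential model used in the NLLS example above (all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=300)
y = 1.5 * np.exp(0.8 * x) + rng.normal(scale=0.2, size=300)

theta = np.array([1.0, 0.5])  # initial estimate
for _ in range(25):
    g = theta[0] * np.exp(theta[1] * x)  # regression function g(x, theta)
    # Jacobian of g with respect to (theta_0, theta_1):
    J = np.column_stack([np.exp(theta[1] * x),
                         theta[0] * x * np.exp(theta[1] * x)])
    # Linearised problem: regress the current residuals on J by OLS.
    step, *_ = np.linalg.lstsq(J, y - g, rcond=None)
    theta = theta + step
    if np.linalg.norm(step) < 1e-8:  # stop once the updates have converged
        break
print(theta)  # roughly [1.5, 0.8]
```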
Discuss the asymptotic properties of MLE.
Under fairly general regularity conditions, ML estimators are consistent, asymptotically normal, and asymptotically efficient, with asymptotic variance equal to the inverse of the Fisher information.
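A small simulation sketch of these properties, using the MLE of an exponential rate (1 / sample mean) as an illustrative example:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, n, reps = 2.0, 1000, 5000

# MLE of the exponential rate in each replication: 1 / sample mean.
lam_hat = np.array([1 / rng.exponential(scale=1 / lam, size=n).mean()
                    for _ in range(reps)])

# Asymptotics: sqrt(n) * (lam_hat - lam) -> N(0, lam^2).
z = np.sqrt(n) * (lam_hat - lam) / lam

print(lam_hat.mean())     # close to 2.0 (consistency)
print(z.mean(), z.std())  # close to 0 and 1 (asymptotic normality)
```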