8 | Parameter estimation 2: likelihood-based methods Flashcards
Consider a regression model
y_i = f(z_i; θ) + ε_i.
What are the advantages of using maximum likelihood estimation compared to least squares estimation?
☐ In addition to estimation, the maximum likelihood estimate can be used for model selection.
☐ Maximum likelihood estimation is possible for a larger class of models.
☐ Maximum likelihood estimation works beyond regression models, whereas least squares estimation is restricted to regression models.
☐ Maximum likelihood estimation yields a better estimate θ̂_ML than least squares estimation.
(ungraded quiz)
In addition to estimation, the maximum likelihood estimate can be used for model selection.
Maximum likelihood estimation is possible for a larger class of models.
When does p(y; θ, σ²) = ∏_{i=1}^{n} p(y_i; θ, σ²) hold?
☐ If the errors are normally distributed
☐ If the parameters are normally distributed
☐ If the parameters are independent
☐ If the errors are independent
(ungraded quiz)
If the errors are independent
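To see why independence is what makes the joint density factorize, here is a small numerical sketch (the data and parameter values are illustrative, not from the card): for independent normal errors, the joint density of y equals the product of the per-observation densities.

```python
import numpy as np

# Hypothetical illustrative values (not from the card).
y = np.array([1.2, -0.3, 0.8])   # observations y_i
mu = np.array([1.0, 0.0, 1.0])   # model predictions f(z_i; theta)
sigma2 = 0.5                     # error variance sigma^2
n = y.size

def normal_pdf(x, m, s2):
    """Density of N(m, s2) evaluated at x."""
    return np.exp(-(x - m) ** 2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

# Joint density of n independent N(mu_i, sigma2) observations, written
# directly via the multivariate normal formula with diagonal covariance:
joint = np.exp(-0.5 * np.sum((y - mu) ** 2) / sigma2) / (2 * np.pi * sigma2) ** (n / 2)

# Product of the marginal densities p(y_i; theta, sigma^2):
product = np.prod(normal_pdf(y, mu, sigma2))

assert np.isclose(joint, product)  # equal because the errors are independent
```

With correlated errors the covariance matrix is no longer diagonal and this factorization fails.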
Among the following models, select the ones that are nested:
fA(t;θ) = θ1·t and fB(t;θ) = θ1·t / (t + θ2)
fA(t;θ) = θ1·t / (t + θ2) and fB(t;θ) = θ1·t^θ3 / (t^θ3 + θ2^θ3)
fA(t;θ) = θ1 + θ2·t and fB(t;θ) = (θ1 + θ2·t)²
fA(t;θ) = θ1·e^(−θ2·t) and fB(t;θ) = θ1 + θ2·e^(−θ3·t)
(ungraded quiz)
fA(t;θ) = θ1·t / (t + θ2) and fB(t;θ) = θ1·t^θ3 / (t^θ3 + θ2^θ3)
(we can choose θ = (θ1, θ2, 1) in fB, i.e. θ3 = 1, to obtain fA)
fA(t;θ) = θ1·e^(−θ2·t) and fB(t;θ) = θ1 + θ2·e^(−θ3·t)
(we can choose θ = (0, θ1, θ2) in fB to obtain fA)
Assume three models have been estimated using maximum likelihood estimation, as described in the table below. Which one would you choose according to AIC?
                      Model A   Model B   Model C
log-likelihood             27        30        31
number of parameters        2         3         5
(ungraded quiz)
AICs (using AIC = 2·log L − 2k):
Model A: 2·27 − 2·2 = 50
Model B: 2·30 − 2·3 = 54
Model C: 2·31 − 2·5 = 52
→ Model B (largest value)
AIC?
The Akaike information criterion (AIC) of a model f with k
parameters is defined as
AIC = 2 log L(θ̂_ML, σ̂²_ML) − 2k
(Note: many references define AIC = 2k − 2 log L and select the model with the smallest value; under the sign convention used here, we select the largest.)
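With this definition, the model-selection card from earlier can be scored in a few lines. A sketch using the log-likelihoods and parameter counts from that table (larger AIC is better under this sign convention):

```python
# (log-likelihood, number of parameters) for each model, from the table
models = {"A": (27, 2), "B": (30, 3), "C": (31, 5)}

# AIC as defined on this card: AIC = 2*logL - 2k
aic = {name: 2 * logL - 2 * k for name, (logL, k) in models.items()}
best = max(aic, key=aic.get)

print(aic)   # {'A': 50, 'B': 54, 'C': 52}
print(best)  # B
```

The 2k term penalizes Model C's five parameters enough that Model B wins despite its lower log-likelihood being only one unit below Model C's.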