Lecture 6 Flashcards
When is an estimator unbiased?
An estimator is unbiased if its expected value is equal to the true parameter value.
When is an estimator consistent?
An estimator is consistent if, as sample size increases, it converges in probability to the true parameter
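A minimal sketch of consistency, using the sample mean as the estimator (the example data and distribution parameters are hypothetical): as the sample size grows, the sample mean's error relative to the true mean shrinks, illustrating convergence in probability.

```python
# Sketch: consistency of the sample mean (hypothetical example).
# As n grows, the sample mean of draws from a known distribution
# gets closer to the true mean (law of large numbers).
import random
import statistics

random.seed(0)
true_mean = 5.0

def sample_mean(n):
    # Draw n values from a normal distribution with mean 5, sd 2.
    return statistics.fmean(random.gauss(true_mean, 2.0) for _ in range(n))

errors = [abs(sample_mean(n) - true_mean) for n in (10, 1000, 100000)]
print(errors)  # errors shrink as n increases
```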
When is an estimator efficient? What condition is associated with this characteristic?
Efficiency applies to unbiased estimators only.
One unbiased estimator is more efficient than another if its variance is smaller.
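A small simulation sketch of efficiency (all numbers are hypothetical): for normal data, both the sample mean and the sample median are unbiased for the center, but the mean has the smaller sampling variance, so it is the more efficient estimator.

```python
# Sketch (hypothetical example): comparing the sampling variance of two
# unbiased estimators of the center of a normal distribution.
import random
import statistics

random.seed(1)

def sampling_variance(estimator, n=50, reps=2000):
    # Empirical variance of the estimator over repeated samples.
    estimates = [estimator([random.gauss(0.0, 1.0) for _ in range(n)])
                 for _ in range(reps)]
    return statistics.pvariance(estimates)

var_mean = sampling_variance(statistics.fmean)
var_median = sampling_variance(statistics.median)
print(var_mean < var_median)  # the mean is more efficient here
```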
When is an estimator sufficient?
It is sufficient if it utilizes all of the information in a sample that is relevant to estimating the population parameter.
When is an estimator robust?
It is robust if its sampling distribution is not seriously affected by violations of assumptions
What are the three estimation methods?
Method of least squares
Method of maximum likelihood
Bayesian method
What is the method of least squares?
The estimator is obtained by minimization of the residual variance (least squares); based on data
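A minimal least-squares sketch for a simple linear model y = a + b*x (the data values are hypothetical): the closed-form estimates below minimize the sum of squared residuals.

```python
# Sketch: least-squares estimation of intercept a and slope b
# for the model y = a + b*x, using hypothetical data.
import statistics

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.1, 5.9, 8.2, 9.9]

x_bar = statistics.fmean(x)
y_bar = statistics.fmean(y)

# b = sum((x_i - x_bar)(y_i - y_bar)) / sum((x_i - x_bar)^2)
b = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
    sum((xi - x_bar) ** 2 for xi in x)
a = y_bar - b * x_bar
print(round(a, 3), round(b, 3))  # a = 0.13, b = 1.97
```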
What is the method of maximum likelihood?
The estimator is obtained by maximization of the likelihood function of the data; based on data
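A maximum-likelihood sketch for a Bernoulli probability p (the 0/1 data are hypothetical): scanning the log-likelihood over a grid of candidate values for p shows it peaks at the sample proportion, which is the MLE.

```python
# Sketch: MLE of a Bernoulli probability p via the log-likelihood
# l(p) = k*log(p) + (n - k)*log(1 - p), using hypothetical data.
import math

data = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical 0/1 observations
n, k = len(data), sum(data)

def log_likelihood(p):
    return k * math.log(p) + (n - k) * math.log(1 - p)

grid = [i / 100 for i in range(1, 100)]
p_hat = max(grid, key=log_likelihood)
print(p_hat)  # 0.7, which equals the sample proportion k/n
```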
What is the Bayesian Method?
The estimator is obtained by maximizing the posterior distribution of the parameter given the observed data; based on data and prior information.
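A Bayesian (posterior-maximization) sketch for the same kind of 0/1 data, assuming a Beta prior; all the numbers here are hypothetical. With a Beta(alpha, beta) prior and Bernoulli data, the posterior is Beta(alpha + k, beta + n - k), and maximizing it gives the posterior mode in closed form.

```python
# Sketch: Bayesian estimate as the mode of the posterior distribution,
# assuming a Beta prior for a Bernoulli probability (hypothetical numbers).
alpha, beta = 2.0, 2.0      # prior information: p is probably near 0.5
n, k = 10, 7                # data: 7 successes in 10 trials

# Posterior mode of Beta(alpha + k, beta + n - k):
p_map = (alpha + k - 1) / (alpha + beta + n - 2)
print(round(p_map, 3))  # 0.667 -- pulled from 0.7 toward the prior's 0.5
```

Note how the prior shifts the estimate away from the pure-data MLE (0.7), which is the practical difference from the maximum likelihood method.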
Who discovered the maximum likelihood method and when?
Fisher in 1921
What is the conceptual question behind the likelihood method?
What values of the parameters theta are most likely to yield the observed data sample y?
What are the 6 sample properties of MLE?
Asymptotically unbiased
Sufficient
Consistent
Asymptotically minimum variance
Asymptotically normally distributed
Invariant
What are likelihood ratio tests for? (3 points)
They compare the fit of a full model and a reduced model.
The reduced model has r fewer parameters to be estimated from the data than the full model.
They require a sufficiently large sample size.
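A likelihood ratio test sketch (the data and models are hypothetical): for Bernoulli data, the reduced model fixes p = 0.5 while the full model estimates p freely, so r = 1. The statistic 2*(l_full - l_reduced) is approximately chi-square with r degrees of freedom for a sufficiently large sample; 3.841 is the standard 95% chi-square quantile for 1 degree of freedom.

```python
# Sketch: likelihood ratio test of a reduced model (p fixed at 0.5)
# against a full model (p free), using hypothetical Bernoulli data.
import math

n, k = 100, 62  # hypothetical data: 62 successes in 100 trials

def log_lik(p):
    return k * math.log(p) + (n - k) * math.log(1 - p)

p_hat = k / n                       # MLE under the full model
lr_stat = 2 * (log_lik(p_hat) - log_lik(0.5))
critical = 3.841                    # chi-square 95% quantile, 1 df
print(lr_stat > critical)  # True: reject the reduced model at the 5% level
```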