Chapter 6: Point Estimation Flashcards
Point Estimate
Point Estimator
General Concepts of Point Estimation
General Concepts of Point Estimation (contd.)
Example 2
Example 2 contd.
Accurate Estimator
- For some samples, the estimate will be larger than the true value of the parameter (overestimation), while for other samples it will be smaller (underestimation).
Expected or mean squared error
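For reference on this card, the standard definition of the mean squared error of an estimator (theta-hat) of (theta), together with its usual variance-plus-squared-bias decomposition (standard notation, not copied from a specific card in this deck):

```latex
\mathrm{MSE}(\hat{\theta})
  = E\!\left[(\hat{\theta} - \theta)^2\right]
  = V(\hat{\theta}) + \left[E(\hat{\theta}) - \theta\right]^2
```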
Unbiasedness
Unbiased Estimators
Unbiased Estimators (contd.)
- Suppose (theta-hat) is an unbiased estimator of (theta);
- then if theta = 100, the (theta-hat) sampling distribution is centered at 100;
- if theta = 27.5, then the (theta-hat) sampling distribution is centered at 27.5, and so on.
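In symbols, the standard definition (stated here for reference):

```latex
\hat{\theta} \text{ is an unbiased estimator of } \theta
  \quad\Longleftrightarrow\quad
  E(\hat{\theta}) = \theta \ \text{ for every possible value of } \theta .
```

If the estimator is not unbiased, the difference E(theta-hat) - theta is called the bias of (theta-hat).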
Recognizing unbiasedness without knowing parameter
Unbiasedness Equation Theorem
Principle of Unbiased Estimation
Calculating Unbiased Estimator
Estimators with Minimum Variance
Estimators with Minimum Variance (contd.)
Example 7
Example 7 Solution
Example 7 Solution (contd.)
Reporting a Point Estimate: The Standard Error
Example 9
Example 2 continued…
More on The Standard Error
Point Estimation - to summarize
Point Estimation - to summarize (contd.)
Formulating Estimators
The Method of Moments (MoM)
The basic idea of MoM:
- Equate certain sample characteristics, such as the mean, to the corresponding population expected values.
- Solving these equations for the unknown parameter values then yields the estimators (see the sketch below).
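A minimal sketch of the idea in Python, assuming an exponential population with mean E[X] = 1/lambda; the simulated data, the true rate of 2, and the variable names are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: a sample from an exponential population whose true
# rate is 2 (in practice the parameter would be unknown).
x = rng.exponential(scale=1 / 2, size=1000)

# Method of moments: set the first sample moment (the sample mean) equal
# to the population mean E[X] = 1/lambda, then solve for lambda.
lambda_mom = 1 / x.mean()

print(f"MoM estimate of lambda: {lambda_mom:.3f}")
```

With a single unknown parameter one equation (first moment) suffices; with two unknown parameters the first two moments would be equated, and so on.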
The Method of Moments (MoM) (contd.)
Example 12
Maximum Likelihood Estimation
The method of maximum likelihood was first introduced by R. A. Fisher, a geneticist and statistician, in the 1920s.
Most statisticians recommend this method, at least when the sample size is large, since the resulting estimators have certain desirable efficiency properties.
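A minimal numerical sketch of the idea, again assuming an exponential model (not a specific example from this chapter): the negative log-likelihood is minimized with scipy, and the result is compared with the closed-form MLE 1/x-bar.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
x = rng.exponential(scale=1 / 2, size=1000)  # illustrative data, true rate = 2

def neg_log_lik(lam):
    # Exponential log-likelihood: log L(lam) = n*log(lam) - lam*sum(x).
    return -(len(x) * np.log(lam) - lam * x.sum())

# Maximize the likelihood by minimizing its negative over a bounded interval.
result = minimize_scalar(neg_log_lik, bounds=(1e-6, 100.0), method="bounded")

print(f"numeric MLE: {result.x:.3f}")
print(f"closed-form MLE 1/x-bar: {1 / x.mean():.3f}")
```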
Example 15