Chapter 6: Point Estimation Flashcards
1
Q
Point Estimate
A
A point estimate of a parameter theta is a single number that can be regarded as a sensible value for theta. It is obtained by selecting a suitable statistic and computing its value from the given sample data.
2
Q
Point Estimator
A
The statistic used to obtain a point estimate is called the point estimator of theta. For example, the sample mean X-bar is a point estimator of the population mean mu, and the number x-bar computed from the observed data is the resulting point estimate.
3
Q
General Concepts of Point Estimation
A
4
Q
General Concepts of Point Estimation (contd.)
A
5
Q
Example 2
A
6
Q
Example 2 contd.
A
7
Q
Accurate Estimator
A
- For some samples, the estimator will overestimate the true value of the parameter, whereas for other samples it will underestimate it.
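Not from the slides: a minimal simulation sketch (assuming NumPy, with the sample mean of a normal sample standing in for the estimator) showing that individual estimates scatter above and below the true parameter value.

```python
import numpy as np

rng = np.random.default_rng(0)
true_mu = 10.0  # the parameter being estimated (known here only for illustration)

# Draw several samples and compute the sample mean (the estimator) for each one.
estimates = np.array([rng.normal(true_mu, 2.0, size=25).mean() for _ in range(6)])

print(estimates)             # values scattered around 10
print(estimates - true_mu)   # positive entries overestimate, negative entries underestimate
```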
8
Q
Expected or mean squared error
A
A sensible way to quantify how close theta-hat is to theta is the mean squared error, MSE = E[(theta-hat - theta)^2], which can be decomposed as the variance of theta-hat plus the square of its bias.
9
Q
Unbiasedness
A
A point estimator theta-hat is said to be an unbiased estimator of theta if E(theta-hat) = theta for every possible value of theta. If theta-hat is not unbiased, the difference E(theta-hat) - theta is called the bias of theta-hat.
10
Q
Unbiased Estimators
A
11
Q
Unbiased Estimators (contd.)
A
- Suppose theta-hat is an unbiased estimator of theta;
- then if theta = 100, the sampling distribution of theta-hat is centered at 100;
- if theta = 27.5, the sampling distribution of theta-hat is centered at 27.5, and so on.
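Not part of the slides: a short simulation sketch (assuming NumPy, with the sample mean of an exponential population as the unbiased estimator) showing that the center of the sampling distribution tracks the value of theta.

```python
import numpy as np

rng = np.random.default_rng(1)

def center_of_sampling_distribution(theta, n=30, reps=20_000):
    """Approximate the mean of the sampling distribution of X-bar when the
    population is exponential with mean theta (so X-bar is unbiased for theta)."""
    estimates = rng.exponential(scale=theta, size=(reps, n)).mean(axis=1)
    return estimates.mean()

print(center_of_sampling_distribution(100.0))  # approximately 100
print(center_of_sampling_distribution(27.5))   # approximately 27.5
```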
12
Q
Recognizing unbiasedness without knowing parameter
A
13
Q
Unbiasedness Equation Theorem
A
14
Q
Principle of Unbiased Estimation
A
When choosing among several different estimators of theta, select one that is unbiased.
15
Q
Calculating Unbiased Estimator
A
16
Q
A
17
Q
Estimators with Minimum Variance
A
Among all estimators of theta that are unbiased, choose the one that has minimum variance; the resulting theta-hat is called the minimum variance unbiased estimator (MVUE) of theta.
18
Q
Estimators with Minimum Variance (contd.)
A
19
Q
Example 7
A
20
Q
Example 7 Solution
A
21
Q
Example 7 Solution (contd.)
A
22
Q
Reporting a Point Estimate: The Standard Error
A
The standard error of an estimator theta-hat is its standard deviation, sigma_theta-hat = sqrt(V(theta-hat)); it conveys the magnitude of a typical deviation between an estimate and the value of theta. If the standard error itself involves unknown parameters whose values can be estimated, substituting those estimates yields the estimated standard error.
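Not from the slides: a minimal sketch (assuming NumPy, with hypothetical data, and taking the estimator to be the sample mean X-bar) showing the estimated standard error s / sqrt(n).

```python
import numpy as np

x = np.array([24.1, 25.3, 23.8, 26.0, 24.7, 25.5, 24.9, 23.6])  # hypothetical sample

n = len(x)
x_bar = x.mean()             # point estimate of the population mean
s = x.std(ddof=1)            # sample standard deviation (n - 1 denominator)
std_error = s / np.sqrt(n)   # estimated standard error of X-bar

print(f"estimate = {x_bar:.2f}, estimated standard error = {std_error:.2f}")
```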
23
Q
Example 9
Example 2 continued…
A
24
Q
More on The Standard Error
A
25
Point Estimation - to summarize
26
Point Estimation - to summarize (contd.)
27
Formulating Estimators
28
The Method of Moments (MoM)
The basic idea of MoM:
* Equate certain **_sample_** characteristics, such as the mean, to the corresponding population expected values.
* Then solve these equations for the unknown parameter values; the resulting solutions are the estimators.
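Not part of the slides: a short sketch (assuming NumPy) of moment estimators for an exponential rate and for the gamma shape/scale parameters, obtained by solving the first one or two moment equations.

```python
import numpy as np

def mom_exponential(x):
    """Equate the sample mean to E(X) = 1/lambda and solve: lambda_hat = 1 / x_bar."""
    return 1.0 / np.mean(x)

def mom_gamma(x):
    """Equate the first two sample moments to E(X) = alpha*beta and
    E(X^2) = alpha*(alpha + 1)*beta^2, then solve for alpha and beta."""
    m1 = np.mean(x)
    m2 = np.mean(x ** 2)
    beta_hat = (m2 - m1 ** 2) / m1
    alpha_hat = m1 / beta_hat
    return alpha_hat, beta_hat

rng = np.random.default_rng(2)
print(mom_exponential(rng.exponential(scale=0.5, size=2000)))  # roughly 2 (rate = 1/scale)
print(mom_gamma(rng.gamma(shape=3.0, scale=2.0, size=2000)))   # roughly (3, 2)
```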
29
The Method of Moments (MoM) (contd.)
30
Example 12
31
Maximum Likelihood Estimation
The method of **_maximum likelihood_** was first introduced by R. A. Fisher, a geneticist and statistician, in the 1920s.
Most statisticians recommend this method, at least when the sample size is large, since the resulting estimators have certain desirable efficiency properties.
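Not from the slides: a minimal numerical sketch (assuming NumPy and SciPy) that maximizes the log-likelihood of an exponential sample and compares the result with the closed-form MLE, lambda_hat = 1 / x_bar.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
x = rng.exponential(scale=1 / 0.7, size=500)   # sample with true rate lambda = 0.7

def negative_log_likelihood(lam):
    """Negative log-likelihood of an exponential(rate = lam) sample."""
    return -(len(x) * np.log(lam) - lam * x.sum())

# Maximize the likelihood numerically over a bounded interval.
result = minimize_scalar(negative_log_likelihood, bounds=(1e-6, 10.0), method="bounded")

print(result.x)        # numerical MLE
print(1.0 / x.mean())  # closed-form MLE; the two agree (near 0.7)
```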
32
Example 15