7. MSE and the UMVUE Flashcards
MSE
Mean Squared Error, the most popular criterion for evaluating an estimator; it is the risk under quadratic loss
MSE(T) = E[(T - θ)^2] = V[T] + B^2[T], where B[T] = E[T] - θ is the bias
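A quick simulation can illustrate the decomposition; a minimal sketch (not from the card), assuming X_1, ..., X_n ~ N(θ, 1) and a hypothetical shrunken estimator 0.9 * X̄:

```python
import numpy as np

# Minimal sketch: empirically check MSE = V[T] + B^2[T]
# for the (biased) estimator T = 0.9 * sample mean, X_i ~ N(theta, 1).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

samples = rng.normal(theta, 1.0, size=(reps, n))
T = 0.9 * samples.mean(axis=1)      # deliberately biased estimator

mse = np.mean((T - theta) ** 2)     # E[(T - theta)^2]
var = T.var()                       # V[T]
bias_sq = (T.mean() - theta) ** 2   # B^2[T]

print(mse, var + bias_sq)           # the two values should nearly agree
```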
Unbiased
E[T] = g(θ) for every θ
UMVUE
(optimality) uniformly minimum variance unbiased estimator:
T* is the UMVUE for g(θ) if E[T*] = g(θ) and V[T*] <= V[T] for any unbiased estimator T, uniformly in θ
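To see what "uniformly minimum variance" buys, here is a minimal simulation sketch (an illustration, not from the card) comparing two unbiased estimators of a normal mean: the sample mean, which is the UMVUE in the N(θ, 1) model, versus the sample median:

```python
import numpy as np

# Minimal sketch: for X_i ~ N(theta, 1), both the sample mean and the
# sample median are unbiased for theta, but the mean (the UMVUE here)
# has strictly smaller variance.
rng = np.random.default_rng(1)
theta, n, reps = 0.0, 25, 100_000

samples = rng.normal(theta, 1.0, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

print(means.mean(), medians.mean())   # both close to theta (unbiased)
print(means.var(), medians.var())     # median variance ~ pi/2 times larger
```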
Unbiasedness vs Variability
accuracy vs precision: unbiasedness means accuracy (right on average), low variance means precision (small spread)
Rao-Blackwell th.
Given U an unbiased estimator for g(θ) and T a SUFF stat for θ, E[U|T] is an unbiased estimator for g(θ) with variance less than or equal to that of U.
Any unbiased estimator can be improved (weakly) by conditioning on a SUFF stat
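A concrete instance (an illustration, not part of the original card): for Bernoulli(p) data, U = X_1 is unbiased for p, T = ΣX_i is sufficient, and E[U|T] = T/n = X̄, with visibly smaller variance:

```python
import numpy as np

# Minimal sketch of Rao-Blackwellization for Bernoulli(p):
# U = X_1 is unbiased for p; T = sum(X_i) is sufficient;
# E[U | T] = T / n, i.e. the sample mean (by exchangeability).
rng = np.random.default_rng(2)
p, n, reps = 0.3, 20, 100_000

samples = rng.binomial(1, p, size=(reps, n))
U = samples[:, 0]             # crude unbiased estimator
RB = samples.mean(axis=1)     # E[U | T] = T / n

print(U.mean(), RB.mean())    # both ~ p (unbiasedness preserved)
print(U.var(), RB.var())      # p(1-p) vs p(1-p)/n
```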
Proof of Rao Blackwell
- T* = E[U|T] is a statistic: by sufficiency of T, the conditional distribution of U given T does not depend on θ
- it is unbiased, using the tower property: E[T*] = E[E[U|T]] = E[U] = g(θ)
- its variance is less than or equal to V[U], by the law of total variance (see the identity below)
- equality holds only if T* = U with probability 1
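The variance step, written out (a standard identity, spelled out here for reference):

```latex
% Law of total variance applied to T* = E[U | T]:
\[
  V[U] \;=\; V\bigl(E[U \mid T]\bigr) + E\bigl(V[U \mid T]\bigr)
       \;\ge\; V\bigl(E[U \mid T]\bigr) \;=\; V[T^{*}],
\]
% with equality iff V[U | T] = 0 a.s., i.e. T* = U with probability 1.
```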
Lehmann-Scheffé #2
Given T a COMP SUFF stat for θ and T* = h(T) an unbiased estimator for g(θ), then T* is the UMVUE for g(θ):
The UMVUE is always a function of the min SUFF
Since the UMVUE W must satisfy W = E[W|T] for any SUFF stat T (otherwise Rao-Blackwellizing on T would strictly improve it), taking T to be the min SUFF stat shows that W is a function of the min SUFF
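A classic worked example of the theorem (an illustration, not from the card): estimating g(λ) = e^(-λ) = P(X = 0) from Poisson(λ) data. Start from the crude unbiased U = 1{X_1 = 0}, condition on the complete sufficient T = ΣX_i, and Lehmann-Scheffé says the result ((n-1)/n)^T is the UMVUE:

```python
import numpy as np

# Minimal sketch: UMVUE of g(lambda) = exp(-lambda) for Poisson data.
# U = 1{X_1 = 0} is unbiased; T = sum(X_i) is complete sufficient;
# E[U | T] = ((n - 1) / n) ** T is the UMVUE by Lehmann-Scheffe.
rng = np.random.default_rng(3)
lam, n, reps = 1.5, 10, 100_000

samples = rng.poisson(lam, size=(reps, n))
T = samples.sum(axis=1)

U = (samples[:, 0] == 0).astype(float)   # crude unbiased estimator
umvue = ((n - 1) / n) ** T               # its Rao-Blackwellization

print(np.exp(-lam))                      # target: e^{-lambda}
print(U.mean(), umvue.mean())            # both unbiased
print(U.var(), umvue.var())              # UMVUE has much smaller variance
```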
Proof of Lehmann-Scheffé #2
- UNIQUENESS: suppose A(T) and B(T) are both functions of T and unbiased for g(θ); then E[A(T) - B(T)] = E[A(T)] - E[B(T)] = 0 for every θ. Their difference is again a function of T, so completeness of T gives Pr{A(T) = B(T)} = 1 (formalized below).
- MINIMAL VARIANCE: for any unbiased estimator U, by Rao-Blackwell E[U|T] is unbiased with variance no larger than that of U. By the uniqueness point there is only one unbiased estimator that is a function of T, namely h(T) = E[U|T]. Since U was arbitrary, V[h(T)] <= V[U] for every unbiased U, so h(T) is the UMVUE
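The uniqueness step in symbols (standard argument, spelled out here for reference):

```latex
% If A(T) and B(T) are both unbiased for g(theta), their difference has mean zero:
\[
  E_{\theta}\bigl[A(T) - B(T)\bigr] = g(\theta) - g(\theta) = 0
  \quad \text{for all } \theta,
\]
% and completeness of T forces the zero-mean function A(T) - B(T) to vanish a.s.:
\[
  \Pr_{\theta}\{A(T) = B(T)\} = 1 \quad \text{for all } \theta.
\]
```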