Optimality in Estimation Flashcards
Define the risk function
R(delta, theta) = E_theta[ L(delta(X), theta) ] = integral_X L(delta(x), theta) f_theta(x) dx
What is the risk function for the mean square error loss function L(a,theta) = (a-theta)^2 ?
R(delta, theta) = MSE(delta, theta) = E_theta[ (delta(X) - theta)^2 ] = Var_theta(delta(X)) + bias_theta(delta)^2
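A minimal Monte Carlo sketch of this decomposition (the shrinkage estimator 0.9*Xbar and all parameter values below are illustrative choices, not from the cards):

```python
import numpy as np

# Monte Carlo check that MSE = variance + bias^2 for the (illustrative)
# shrinkage estimator delta(X) = 0.9 * Xbar of a normal mean.
rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.0, 10, 200_000

x = rng.normal(theta, sigma, size=(reps, n))
delta = 0.9 * x.mean(axis=1)

mse = np.mean((delta - theta) ** 2)
bias = np.mean(delta) - theta
var = np.var(delta)

print(f"MSE          = {mse:.5f}")
print(f"var + bias^2 = {var + bias**2:.5f}")  # should agree up to MC error
```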
What are the two main applications of decision theory?
- estimation: delta(X) is an estimator of theta (or of g(theta))
- hypothesis testing: delta(X) is a test; under the 0-1 loss 1{delta(X) =/= theta}, R(delta, theta) is the probability of making a type I or type II error
Define inadmissible
delta(X) is inadmissible if there exists a decision rule delta*(X) such that
- R(delta*, theta) <= R(delta, theta) for all theta
- with strict inequality for some theta
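Illustrative sketch (not on the cards): for X ~ N_p(theta, I_p) with p >= 3, the James-Stein estimator dominates the MLE delta(X) = X under squared error loss, so the MLE is inadmissible; the simulation below is a rough numerical check with an arbitrary theta.

```python
import numpy as np

# Compare risks of the MLE delta(X) = X and the James-Stein estimator
# delta_JS(X) = (1 - (p-2)/||X||^2) X for X ~ N_p(theta, I_p), p >= 3.
rng = np.random.default_rng(1)
p, reps = 10, 100_000
theta = np.full(p, 1.0)          # illustrative parameter value

x = rng.normal(theta, 1.0, size=(reps, p))
norm2 = np.sum(x ** 2, axis=1, keepdims=True)
js = (1.0 - (p - 2) / norm2) * x

risk_mle = np.mean(np.sum((x - theta) ** 2, axis=1))   # close to p
risk_js = np.mean(np.sum((js - theta) ** 2, axis=1))   # strictly smaller

print(f"risk(MLE) ~ {risk_mle:.3f},  risk(JS) ~ {risk_js:.3f}")
```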
Define the pi-Bayes risk and the pi-Bayes decision rule
- R_pi(delta) = integral_Theta R(delta, theta) pi(theta) dtheta = integral_Theta integral_X L(delta(x), theta) f_theta(x) dx pi(theta) dtheta
- delta_pi is any decision rule minimizing R_pi(delta)
Define the posterior risk function
r(delta | x) = E[ L(delta(x), theta) | X = x ] = integral_Theta L(delta(x), theta) pi(theta | x) dtheta
for posterior distribution pi(theta | x)
2 strategies to find the pi-Bayes decision rule:
- find, for each x, the action minimizing the posterior risk; if delta(x) minimizes the posterior risk for every x, then delta minimizes the Bayes risk
- directly minimize Bayes risk
What is the minimizing decision rule of posterior risk for:
- mean squared error loss?
- absolute error loss L(a, theta) = |a - theta|?
- posterior mean
- posterior median
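A minimal sketch in the conjugate Beta-Binomial model (the prior hyperparameters and data below are assumed for illustration): the posterior mean and posterior median are the pi-Bayes rules under squared and absolute error loss, respectively.

```python
from scipy import stats

# Beta(a, b) prior for a binomial success probability; after observing
# k successes in n trials the posterior is Beta(a + k, b + n - k).
a, b = 2.0, 2.0          # illustrative prior hyperparameters
n, k = 20, 6             # illustrative data

posterior = stats.beta(a + k, b + n - k)

bayes_sq = posterior.mean()      # pi-Bayes rule under squared error loss
bayes_abs = posterior.median()   # pi-Bayes rule under absolute error loss

print(f"posterior mean (squared error):    {bayes_sq:.4f}")
print(f"posterior median (absolute error): {bayes_abs:.4f}")
```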
2 strategies for proving admissibility
- show estimator is the unique Bayes estimator for some prior pi
- from scratch (eg. by contradiction)
Define minimax risk
inf_delta sup_theta R(delta,theta)
3 situations when the minimax risk is attained
- Bayes rule with constant risk
- Bayes rule whose Bayes risk equals its worst-case risk: R_pi(delta_pi) = sup_theta R(delta_pi, theta)
(uniqueness of delta_pi => uniqueness of the minimax rule)
- admissible estimator delta with constant risk
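A classical instance of the first situation (sketch; n below is an arbitrary illustrative value): for X ~ Binomial(n, p), the Bayes rule under a Beta(sqrt(n)/2, sqrt(n)/2) prior has constant risk under squared error loss and is therefore minimax.

```python
import numpy as np
from scipy.stats import binom

# Bayes estimator under a Beta(a, a) prior with a = sqrt(n)/2:
# delta(X) = (X + a) / (n + 2a).  Its squared-error risk is constant in p,
# so this Bayes rule with constant risk is minimax.
n = 25
a = np.sqrt(n) / 2
x = np.arange(n + 1)
delta = (x + a) / (n + 2 * a)

for p in np.linspace(0.1, 0.9, 5):
    risk = np.sum(binom.pmf(x, n, p) * (delta - p) ** 2)
    print(f"p = {p:.2f}  risk = {risk:.6f}")   # same value for every p
```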
Define UMVUE
UMVUE g^ is an unbiased estimator of g(theta) such that:
for all theta: Var_theta(g^) <= Var_theta(g~) for any other unbiased estimator g~ of g(theta)
Relation between complete and minimal sufficient statistics
T(X) sufficient and complete => T(X) is minimal sufficient
State Basu’s theorem
if T is sufficient and complete, then any ancillary statistic V is independent of T
(an ancillary statistic has a distribution that does not depend on theta)
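A rough numerical illustration (assumed setup: X_1, ..., X_n iid N(mu, 1) with mu unknown, where Xbar is complete sufficient and S^2 is ancillary); independence implies zero correlation, which the simulation checks.

```python
import numpy as np

# Basu's theorem illustration: for iid N(mu, 1) data, Xbar is complete
# sufficient and S^2 is ancillary (its distribution is free of mu), so the
# two are independent.  A weak check: their sample correlation is near 0.
rng = np.random.default_rng(2)
mu, n, reps = 3.0, 8, 200_000

x = rng.normal(mu, 1.0, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

print(f"corr(Xbar, S^2) ~ {np.corrcoef(xbar, s2)[0, 1]:.4f}")  # near 0
```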
State the Lehmann-Scheffé theorem
if T is sufficient and complete and g~ is an unbiased estimator of g(theta),
then g^(T(X)) = E[ g~(X) | T(X) ] is the UMVUE of g(theta)
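A sketch of the theorem in action (assumed setup: Poisson sample, target g(lambda) = P(X_1 = 0) = exp(-lambda)); Rao-Blackwellizing the naive unbiased estimator 1{X_1 = 0} on the complete sufficient statistic T = sum X_i gives the UMVUE ((n-1)/n)^T.

```python
import numpy as np

# X_1, ..., X_n iid Poisson(lam); target g(lam) = P(X_1 = 0) = exp(-lam).
# Naive unbiased estimator: 1{X_1 = 0}.  Conditioning on T = sum(X_i)
# (complete sufficient) gives E[1{X_1 = 0} | T] = ((n-1)/n)^T, the UMVUE.
rng = np.random.default_rng(3)
lam, n, reps = 1.5, 10, 200_000

x = rng.poisson(lam, size=(reps, n))
naive = (x[:, 0] == 0).astype(float)
umvue = ((n - 1) / n) ** x.sum(axis=1)

print(f"target exp(-lam)    = {np.exp(-lam):.4f}")
print(f"mean naive / UMVUE  = {naive.mean():.4f} / {umvue.mean():.4f}")
print(f"var  naive / UMVUE  = {naive.var():.5f} / {umvue.var():.5f}")
```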