Optimality in Estimation Flashcards

1
Q

Define the risk function

A

R(delta, theta) = E_theta[ L(delta(X), theta) ] = integral_X L(delta(x), theta) f_theta(x) dx
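As a quick sanity check, the risk can be approximated by Monte Carlo. A minimal sketch (not from the source; the normal model, the function names, and the parameter values are illustrative assumptions):

```python
import random

def mc_risk(delta, loss, sample, theta, n=100_000, seed=0):
    """Monte Carlo estimate of R(delta, theta) = E_theta[ L(delta(X), theta) ]."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample(theta, rng)            # draw X ~ f_theta
        total += loss(delta(x), theta)    # accumulate L(delta(x), theta)
    return total / n

# Example: X ~ N(theta, 1), delta(x) = x, squared-error loss.
# The exact risk is Var(X) = 1 for every theta.
r = mc_risk(delta=lambda x: x,
            loss=lambda a, t: (a - t) ** 2,
            sample=lambda t, rng: rng.gauss(t, 1.0),
            theta=2.0)
```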

2
Q

What is the risk function for the squared-error loss L(a, theta) = (a - theta)^2 ?

A

R(delta, theta) = E_theta[ (delta(X) - theta)^2 ] = MSE(delta), the mean squared error of delta at theta

3
Q

What are the two main applications of decision theory?

A
  • estimation: delta(X) is an estimator of theta (or of g(theta))
  • hypothesis testing: delta(X) is a test; with the 0-1 loss 1{delta(X) =/= theta}, R(delta, theta) is the probability of a type I or type II error
4
Q

Define inadmissible

A

delta is inadmissible if there exists a decision rule delta*(X) such that

  • R(delta*, theta) <= R(delta, theta) for all theta
  • with strict inequality for some theta
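For example (an illustrative sketch, not from the source): under squared-error loss with X_1, ..., X_n i.i.d. N(theta, 1), delta(X) = X_1 is dominated by the sample mean — both are unbiased, so the risk equals the variance, which does not depend on theta here. A Monte Carlo check:

```python
import random

def mc_risk(delta, theta, n_obs=5, reps=50_000, seed=1):
    """Monte Carlo risk under squared-error loss for i.i.d. N(theta, 1) data."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xs = [rng.gauss(theta, 1.0) for _ in range(n_obs)]
        total += (delta(xs) - theta) ** 2
    return total / reps

r_first = mc_risk(lambda xs: xs[0], theta=0.7)               # exact risk: 1
r_mean  = mc_risk(lambda xs: sum(xs) / len(xs), theta=0.7)   # exact risk: 1/5
```

Since the strict improvement holds for every theta, X_1 is inadmissible.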
5
Q

Define a pi-Bayes risk and a pi-Bayes decision rule

A
  • R_pi(delta) = integral_Theta R(delta, theta) pi(theta) dtheta = integral_Theta integral_X L(delta(x), theta) f_theta(x) dx pi(theta) dtheta
  • delta_pi is any decision rule minimizing R_pi(delta)
6
Q

Define the posterior risk function

A

r(delta | x) = E[ L(delta(x), theta) | X = x ] = integral_Theta L(delta(x), theta) pi(theta | x) dtheta
for posterior distribution pi(theta | x)

7
Q

2 strategies to find the pi-Bayes decision rule:

A
  • find the minimizer of the posterior risk for each x: if delta(x) minimizes the posterior risk for every x, then delta minimizes the Bayes risk
  • directly minimize Bayes risk
8
Q

What is the minimizing decision rule of posterior risk for:

  • squared-error loss?
  • absolute-error loss?
A
  • posterior mean
  • posterior median
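Both claims can be confirmed numerically. A sketch with a toy discrete posterior (the values are illustrative assumptions, not from any specific model), grid-searching the posterior risk over candidate actions:

```python
# Toy discrete posterior pi(theta | x) over four support points.
thetas  = [0.0, 1.0, 2.0, 10.0]
weights = [0.3, 0.3, 0.3, 0.1]    # posterior probabilities, sum to 1

def posterior_risk(a, loss):
    """Posterior risk of action a: sum_theta L(a, theta) pi(theta | x)."""
    return sum(w * loss(a, t) for t, w in zip(thetas, weights))

grid = [i / 100 for i in range(1101)]   # candidate actions 0.00 .. 11.00
best_sq  = min(grid, key=lambda a: posterior_risk(a, lambda act, t: (act - t) ** 2))
best_abs = min(grid, key=lambda a: posterior_risk(a, lambda act, t: abs(act - t)))

posterior_mean = sum(w * t for t, w in zip(thetas, weights))   # 1.9
# best_sq matches the posterior mean 1.9; best_abs is the posterior median 1.0
```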

9
Q

2 strategies for proving admissibility

A
  • show estimator is the unique Bayes estimator for some prior pi
  • from scratch (e.g. by contradiction)
10
Q

Define minimax risk

A

inf_delta sup_theta R(delta,theta)
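In a finite toy problem the inf-sup can be computed directly: take the worst-case risk of each rule, then the rule with the smallest worst case. A sketch (the rule names and risk values are illustrative assumptions):

```python
# Toy finite decision problem: risks R(delta, theta) for each rule and parameter.
risks = {
    "delta1": {"theta1": 1.0, "theta2": 4.0},
    "delta2": {"theta1": 2.0, "theta2": 2.5},
    "delta3": {"theta1": 3.0, "theta2": 2.0},
}
worst_case   = {d: max(r.values()) for d, r in risks.items()}  # sup_theta R(delta, theta)
minimax_rule = min(worst_case, key=worst_case.get)             # achieves inf_delta
minimax_risk = worst_case[minimax_rule]
```

Here delta1 has the best risk at theta1 but the worst at theta2; the minimax rule is the one that hedges across parameters.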

11
Q

3 situations in which the minimax risk is attained

A
  • Bayes rule with constant risk
  • Bayes rule whose Bayes risk R_pi(delta_pi) = sup_theta R(delta_pi, theta)
    (uniqueness of delta_pi => uniqueness of minimax)
  • admissible estimator delta with constant risk
12
Q

Define UMVUE

A

The UMVUE g^ is an unbiased estimator of g(theta) such that:

for all theta: Var(g^) <= Var(g~) for every other unbiased estimator g~ of g(theta)

13
Q

Relation between complete and minimal sufficient statistics T(X)

A

T(X) sufficient and complete => T(X) minimal sufficient

14
Q

State Basu’s theorem

A

if T is sufficient and complete, then any ancillary statistic V is independent of T
(an ancillary statistic has a distribution that doesn't depend on theta)

15
Q

State the Lehmann-Scheffe theorem

A

if T is sufficient and complete and g~(X) is an unbiased estimator of g(theta),
then g^(T(X)) = E[ g~(X) | T(X) ] is the UMVUE of g(theta)
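A small worked instance (my example, assuming the Bernoulli model): for X_1, X_2, X_3 i.i.d. Bernoulli(theta), g~(X) = X_1 is unbiased for g(theta) = theta and T = X_1 + X_2 + X_3 is complete sufficient. Computing E[ g~ | T = t ] by enumeration recovers the sample mean t/n, with theta dropping out as sufficiency guarantees:

```python
from itertools import product

n = 3

def cond_exp(t, theta=0.4):
    """E[ X_1 | T = t ] by enumerating all outcomes; theta cancels in the ratio."""
    num = den = 0.0
    for x in product([0, 1], repeat=n):
        if sum(x) != t:
            continue
        p = theta ** sum(x) * (1 - theta) ** (n - sum(x))
        num += x[0] * p
        den += p
    return num / den

umvue = [cond_exp(t) for t in range(n + 1)]   # E[X_1 | T=t] = t/3 for t = 0..3
```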

16
Q

Strategies to find a UMVUE

A

If T complete and sufficient:

  • g^(T(X)) = E[ g~(X) | T(X) ] for any unbiased estimator g~(X) of g(theta)
  • any unbiased h(T) is the unique UMVUE

else:
  • an estimator that attains the Cramer-Rao lower bound for all theta

17
Q

Link between the UMVUE and the Cramer-Rao lower bound

A
  • if a UMVUE exists and some unbiased estimator attains the CR bound at theta, then the UMVUE attains the CR bound at theta
  • likewise for attaining the CR bound uniformly in theta