Bayesian inference Flashcards
Define the prior and posterior distributions, and give the formula for the posterior distribution π(θ|x)
The prior distribution π(θ) of θ is the probability distribution of θ before observing the data. It represents our beliefs or uncertainty about the parameter before collecting any data. After observing data X = x, we update the distribution of θ to obtain the posterior distribution π(θ|x), representing our updated beliefs in light of seeing x. By Bayes' theorem,
π(θ|x) = f(x|θ) π(θ) / ∫_Θ f(x|θ') π(θ') dθ' ∝ f(x|θ) π(θ).
pg 31
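A minimal numerical sketch of the prior-to-posterior update, assuming a Beta(a, b) prior and a Binomial likelihood (a standard conjugate pair; the specific numbers below are purely illustrative, not from the notes):

```python
import numpy as np
from scipy import stats

# Illustrative prior: theta ~ Beta(a, b)
a, b = 2.0, 2.0
# Observed data: x successes in n Bernoulli(theta) trials
n, x = 10, 7

# Conjugacy: the posterior is Beta(a + x, b + n - x), since
# pi(theta | x) is proportional to f(x | theta) * pi(theta)
posterior = stats.beta(a + x, b + n - x)
print("conjugate posterior mean:", posterior.mean())

# Cross-check Bayes' theorem by brute-force numerical normalisation
theta = np.linspace(1e-6, 1 - 1e-6, 20001)
dtheta = theta[1] - theta[0]
unnorm = stats.binom.pmf(x, n, theta) * stats.beta.pdf(theta, a, b)
post = unnorm / np.sum(unnorm * dtheta)
print("numerical posterior mean:", np.sum(theta * post * dtheta))
```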
What is an improper prior?
A non-negative prior function π(θ) whose integral over the parameter space is not finite, so it is not a genuine probability density (e.g. the flat prior π(θ) ∝ 1 on ℝ); it can still be used provided the resulting posterior is proper.
Define a Jeffreys prior. Is it always proper?
The prior π(θ) ∝ sqrt[ det( I(θ) ) ], where I(θ) is the Fisher information matrix.
No — e.g. for the location parameter θ of a N(θ, 1), I(θ) is constant, so the Jeffreys prior is flat on ℝ and hence improper.
pg 34
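A small sketch for a single Bernoulli(θ) observation (an assumed example, not from the notes): the Fisher information is I(θ) = 1/(θ(1−θ)), so the Jeffreys prior is proportional to θ^(−1/2)(1−θ)^(−1/2), the Beta(1/2, 1/2) kernel, which integrates to π and is therefore proper in this case:

```python
import numpy as np
from scipy import integrate

# Single Bernoulli(theta) observation: I(theta) = 1 / (theta * (1 - theta))
theta = np.linspace(1e-6, 1 - 1e-6, 200001)
fisher_info = 1.0 / (theta * (1.0 - theta))

# Jeffreys prior: proportional to sqrt(I(theta)) in the scalar case,
# i.e. theta^(-1/2) * (1 - theta)^(-1/2), the Beta(1/2, 1/2) kernel
jeffreys_unnorm = np.sqrt(fisher_info)

# The integral is finite (exactly B(1/2, 1/2) = pi), so this Jeffreys prior
# is proper; for other models (e.g. a location parameter) it need not be.
Z = integrate.trapezoid(jeffreys_unnorm, theta)
print("numerical normalising constant:", Z, "  exact value: pi =", np.pi)
```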
Define a loss function
A non-negative function L(a, θ) giving the cost of taking action a when the true parameter is θ (e.g. squared error loss L(a, θ) = (a − θ)²).
Define the risk function for loss function L and decision rule δ
R(δ, θ) = E_θ[ L(δ(X), θ) ] = ∫ L(δ(x), θ) f(x; θ) dx (a sum over x in the discrete case)
pg 37
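A sketch of evaluating a risk function by Monte Carlo under squared error loss, assuming X1, …, Xn iid N(θ, 1) and comparing the sample mean with an illustrative shrinkage rule (both rules and all numbers are hypothetical choices for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

def risk(delta, theta, n=10, reps=100_000):
    """Monte Carlo estimate of R(delta, theta) = E_theta[(delta(X) - theta)^2]."""
    x = rng.normal(theta, 1.0, size=(reps, n))
    return np.mean((delta(x) - theta) ** 2)

sample_mean = lambda x: x.mean(axis=1)
shrunk_mean = lambda x: 0.9 * x.mean(axis=1)   # illustrative shrinkage rule

for theta in [0.0, 1.0, 3.0]:
    print(theta, risk(sample_mean, theta), risk(shrunk_mean, theta))
# The sample mean has constant risk 1/n; the shrinkage rule does better
# near theta = 0 and worse far from it, so neither rule dominates the other.
```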
When is a decision rule δ inadmissible?
A decision rule δ is inadmissible if there exists another rule δ' with R(δ', θ) ≤ R(δ, θ) for all θ ∈ Θ and R(δ', θ) < R(δ, θ) for at least one θ (i.e. δ' dominates δ).
pg 38
Define the π-Bayes risk for decision rule δ
r(π, δ) = E_π[ R(δ, θ) ] = ∫_Θ R(δ, θ) π(θ) dθ, i.e. the risk averaged over θ with respect to the prior π.
pg 38
The estimator that minimizes this risk is called the Bayes estimator
What is the posterior risk?
The expected loss under the posterior distribution, given the observation X = x: E[ L(δ(x), θ) | X = x ] = ∫ L(δ(x), θ) π(θ|x) dθ
pg 39
Does minimizing the posterior risk also minimize the π-Bayes risk?
Yes (proof on pg 39)
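A small numerical check under squared error loss, reusing the illustrative Beta–Binomial setup from above: the posterior mean minimises the posterior risk E[(a − θ)² | X = x] over all actions a (a standard fact; the specific prior and data are hypothetical):

```python
import numpy as np
from scipy import stats

# Posterior from an illustrative Beta(2, 2) prior and 7 successes in 10 trials
posterior = stats.beta(2 + 7, 2 + 3)

# Posterior risk of action a under squared error loss:
# rho(a | x) = E[(a - theta)^2 | X = x], computed on a grid of theta values
theta = np.linspace(1e-6, 1 - 1e-6, 20001)
w = stats.beta.pdf(theta, 2 + 7, 2 + 3)
w /= w.sum()

def posterior_risk(a):
    return np.sum(w * (a - theta) ** 2)

actions = np.linspace(0.01, 0.99, 99)
best = actions[np.argmin([posterior_risk(a) for a in actions])]
print("minimiser on the grid:", best, "  posterior mean:", posterior.mean())
```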
What is the minimax risk?
The minimax risk is R* = inf_δ sup_{θ ∈ Θ} R(δ, θ): the infimum ('min') over all decision rules δ of the maximal ('max') risk over the whole parameter space Θ
pg 40
What happens if a Bayes rule δ has constant risk in θ?
If a (unique) Bayes rule δ has constant risk in θ then it is (unique) minimax.
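An illustrative check of this fact for X ~ Binomial(n, θ) under squared error loss: the Bayes estimator for the Beta(√n/2, √n/2) prior, δ(X) = (X + √n/2)/(n + √n), has risk that does not depend on θ and is therefore minimax. (This prior is the standard textbook example, not necessarily the one in the notes; the code only verifies the constant risk numerically.)

```python
import numpy as np
from scipy import stats

n = 25
sqrt_n = np.sqrt(n)

def risk(theta):
    """Exact risk E_theta[(delta(X) - theta)^2] for
    delta(X) = (X + sqrt(n)/2) / (n + sqrt(n)), with X ~ Binomial(n, theta)."""
    x = np.arange(n + 1)
    pmf = stats.binom.pmf(x, n, theta)
    delta = (x + sqrt_n / 2) / (n + sqrt_n)
    return np.sum(pmf * (delta - theta) ** 2)

for theta in [0.1, 0.3, 0.5, 0.9]:
    print(theta, risk(theta))
# Every value equals 1 / (4 * (sqrt(n) + 1)^2): the risk is constant in theta,
# so by the flashcard above this Bayes rule is minimax.
print("1 / (4 (sqrt(n) + 1)^2) =", 1 / (4 * (sqrt_n + 1) ** 2))
```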
What is a uniformly minimum variance unbiased estimator?
An unbiased estimator ĝ(X) of g(θ) such that var_θ(ĝ) ≤ var_θ(g̃) for every other unbiased estimator g̃(X) of g(θ), for all θ ∈ Θ
What does it mean to say a statistic is complete for θ?
T is complete for θ if, for every function g, E_θ[ g(T) ] = 0 for all θ implies P_θ( g(T) = 0 ) = 1 for all θ
pg 43
Give an example of a complete statistic for a k-parameter exponential family
The natural statistic T(X) = ( T1(X), …, Tk(X) ) is sufficient and complete, provided the natural parameter space contains a k-dimensional open set; e.g. for N(μ, σ²), T = ( Σ Xi, Σ Xi² )
Can a sufficient, complete statistic be minimal?
If a sufficient statistic T is complete, then it is minimal, but not all minimal sufficient statistics are complete
Define ancillary statistic
A statistic is an ancillary statistic if its distribution does not depend on the parameter θ.
State Basu’s theorem
If T is a complete sufficient statistic for θ, then any ancillary statistic V is independent of T.
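An illustrative Monte Carlo look at the classic consequence of Basu's theorem, assuming X1, …, Xn iid N(θ, 1): the sample mean is complete sufficient for θ and the sample variance is ancillary, so the theorem says they are independent. The simulation only checks a symptom of this (near-zero sample correlation), not independence itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# X_1, ..., X_n iid N(theta, 1) with known variance 1:
# the sample mean is complete sufficient, the sample variance is ancillary.
theta, n, reps = 2.5, 20, 200_000
x = rng.normal(theta, 1.0, size=(reps, n))

xbar = x.mean(axis=1)            # complete sufficient statistic
s2 = x.var(axis=1, ddof=1)       # ancillary statistic (distribution free of theta)

# Basu's theorem: xbar and s2 are independent, so their sample correlation
# should be approximately zero (a necessary, not sufficient, check).
print("sample correlation:", np.corrcoef(xbar, s2)[0, 1])
```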
State the Lehmann-Scheffe theorem.
Let T be a sufficient and complete statistic for θ and g̃(X) be an unbiased estimator of g(θ) with var_θ(g̃) < ∞ for all θ ∈ Θ. If ĝ(T(X)) = E[ g̃(X) | T(X) ], then ĝ is the unique uniformly minimum variance unbiased estimator (UMVUE) of g(θ).
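A sketch of the theorem in action, assuming X1, …, Xn iid Poisson(λ) and target g(λ) = exp(−λ) = P(X1 = 0): start from the simple unbiased estimator g̃ = 1{X1 = 0}, condition on the complete sufficient statistic T = Σ Xi, and the conditional expectation works out to ĝ = ((n−1)/n)^T, which Lehmann–Scheffé identifies as the UMVUE. The code only checks unbiasedness and the variance reduction by simulation (numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

lam, n, reps = 1.3, 8, 500_000
x = rng.poisson(lam, size=(reps, n))

# Naive unbiased estimator of g(lambda) = exp(-lambda) = P(X_1 = 0)
g_tilde = (x[:, 0] == 0).astype(float)

# Rao-Blackwellise: condition on the complete sufficient statistic T = sum(X_i).
# E[ 1{X_1 = 0} | T ] = ((n - 1) / n) ** T, the UMVUE by Lehmann-Scheffe.
T = x.sum(axis=1)
g_hat = ((n - 1) / n) ** T

print("target exp(-lambda):", np.exp(-lam))
print("mean of g_tilde:    ", g_tilde.mean())   # unbiased, high variance
print("mean of g_hat:      ", g_hat.mean())     # unbiased, smaller variance
print("variances:", g_tilde.var(), g_hat.var())
```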
What is the likelihood principle?
The likelihood principle says that all inference about θ should be based on the data only through the likelihood function.
i.e. if two observations (possibly from different experiments) give likelihood functions for θ that are proportional to each other, then they should lead to the same inferences about θ; Bayesian inference (with a fixed prior) and the MLE both depend on the data only through the likelihood.
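A standard illustration (the designs and counts are the usual textbook example, not taken from the notes): observing 9 successes and 3 failures under a fixed-n Binomial design and under a "sample until the 3rd failure" negative-binomial design gives likelihoods proportional to the same kernel θ^9 (1−θ)^3, so the likelihood principle says the two experiments support identical inferences about θ:

```python
import numpy as np
from scipy import stats

theta = np.linspace(0.05, 0.95, 19)

# Design 1: n = 12 trials fixed, observe x = 9 successes
lik_binom = stats.binom.pmf(9, 12, theta)

# Design 2: sample until the 3rd failure, observing 9 successes along the way.
# scipy's nbinom.pmf(k, n, p) is the pmf of k "failures" before the n-th
# "success" with per-trial success probability p; relabelling success/failure
# makes pmf(9, 3, 1 - theta) the likelihood for this stopping rule.
lik_nbinom = stats.nbinom.pmf(9, 3, 1 - theta)

# Both are constant multiples of theta^9 * (1 - theta)^3:
# the ratio is flat in theta (it equals 220 / 55 = 4 everywhere).
print(lik_binom / lik_nbinom)
```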
How can we prove T is not complete?
Exhibit a function g with E_θ( g(T) ) = 0 for all θ but P_θ( g(T) = 0 ) < 1 for some θ, i.e. g(T) is not almost surely 0. E.g. for X1, X2 iid N(θ, 1), T = (X1, X2) is sufficient but not complete, since g(T) = X1 − X2 has mean 0 for every θ yet is not identically 0.