245 Proofs Flashcards
Efficiency definition
Given two unbiased estimators θ^1 and θ^2 of a parameter θ, with variances V(θ^1) and V(θ^2) respectively, the efficiency of θ^1 relative to θ^2, denoted eff(θ^1, θ^2), is defined to be the ratio
eff(θ^1, θ^2) = V(θ^2) / V(θ^1).
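As a hedged illustration (the estimators, sample size, and seed are my own choices, not from the cards), the efficiency of the sample median relative to the sample mean for Normal data can be estimated by simulation; the asymptotic value is 2/π ≈ 0.64.

```python
import numpy as np

# Illustrative sketch (assumed setup): the sample mean and sample median
# are both unbiased for the centre of a Normal(mu, 1) population.
rng = np.random.default_rng(0)
mu, n, reps = 5.0, 51, 20_000

samples = rng.normal(mu, 1.0, size=(reps, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# eff(median, mean) = V(mean) / V(median), per the definition above
eff_median_vs_mean = means.var() / medians.var()
print(round(eff_median_vs_mean, 2))  # near the asymptotic value 2/pi
```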
Likelihood definition
Suppose that the likelihood function depends on k parameters θ1, θ2, …, θk.
Choose as estimates those values of the parameters that maximize the likelihood:
L(y1, y2,…, yn |θ1,θ2,…,θk).
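A minimal numerical sketch of this idea, assuming an Exponential(θ) model of my own choosing: maximise the log-likelihood over a grid of θ values and compare with the analytic MLE 1/ȳ.

```python
import numpy as np

# Illustrative sketch (assumed model): maximise the log-likelihood of an
# Exponential(theta) sample over a grid of theta values; the analytic MLE
# is 1 / ybar, so the grid maximiser should land on top of it.
rng = np.random.default_rng(1)
y = rng.exponential(scale=1 / 2.0, size=500)    # true rate theta = 2

thetas = np.linspace(0.1, 10, 10_000)
# log L(y1,...,yn | theta) = n log(theta) - theta * sum(y)
loglik = len(y) * np.log(thetas) - thetas * y.sum()
theta_hat = thetas[np.argmax(loglik)]

print(round(theta_hat, 2), round(1 / y.mean(), 2))  # the two agree closely
```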
Invariance Property of MLE
If θ is the parameter associated with the distribution of some random variable Y, and θ^ is the maximum likelihood estimator of θ, then for any function t(θ) the MLE of t(θ) is given by (t(θ))^ = t(θ^).
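A short hedged sketch of invariance, using a Normal model of my own choosing (not from the cards): the MLE of σ² is the divide-by-n sample variance, so by invariance the MLE of σ = t(σ²) = √σ² is its square root.

```python
import numpy as np

# Illustrative sketch (assumed model): for Normal(mu, sigma^2) data the MLE
# of sigma^2 is the divide-by-n sample variance; by the invariance property,
# the MLE of t(sigma^2) = sqrt(sigma^2) = sigma is simply the square root.
rng = np.random.default_rng(2)
y = rng.normal(10.0, 3.0, size=1_000)

sigma2_mle = ((y - y.mean()) ** 2).mean()   # MLE of sigma^2
sigma_mle = np.sqrt(sigma2_mle)             # MLE of sigma via invariance
print(round(sigma_mle, 1))                  # near the true sigma = 3
```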
Power definition
Suppose that W is the test statistic and RR is the rejection region for a test of a hypothesis involving the value of a parameter θ. Then the power of the test, denoted by power(θ), is the probability that the test will lead to rejection of H0 when the actual parameter value is θ. That is,
power(θ) = P(W in RR when the parameter value is θ).
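The definition can be computed in closed form for a simple case. As a hedged sketch (the one-sided z-test and all numbers below are my own assumptions), take W to be the z-statistic and RR = {W > z_{1−α}}:

```python
from math import sqrt
from statistics import NormalDist

# Illustrative sketch (assumed test): power of the one-sided z-test of
# H0: theta = 0 versus Ha: theta > 0 with known sigma.
def z_test_power(theta_a, sigma=1.0, n=25, alpha=0.05):
    z = NormalDist()
    cutoff = z.inv_cdf(1 - alpha)        # boundary of the rejection region
    shift = theta_a * sqrt(n) / sigma    # mean of W when theta = theta_a
    return 1 - z.cdf(cutoff - shift)     # P(W in RR when theta = theta_a)

print(round(z_test_power(0.0), 3))   # -> 0.05 (power at theta0 equals alpha)
print(round(z_test_power(0.5), 3))   # power grows as theta_a moves from theta0
```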
Neyman-Pearson Lemma
Suppose that we wish to test the simple null hypothesis H0 : θ = θ0 versus the simple alternative hypothesis Ha : θ = θa , based on a random sample Y1,Y2,…,Yn from a distribution with parameter θ.
Let L(θ) denote the likelihood of the sample when the value of the parameter is θ. Then, for a given α, the test that maximizes the power at θa has a rejection region, RR, determined by
LR = L(θ0)/L(θa) < k
Any other test with significance level less than or equal to α will have power less than or equal to that of the likelihood ratio test.
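A small hedged sketch of the lemma's rejection region, using a Normal(θ, 1) model of my own choosing: the likelihood ratio for simple hypotheses simplifies algebraically, so {LR < k} becomes a threshold on the sample mean.

```python
import numpy as np

# Illustrative sketch (assumed model): for Normal(theta, 1) data, the most
# powerful test of H0: theta = 0 versus Ha: theta = 1 rejects when
# LR = L(theta0)/L(thetaa) < k; algebra reduces this to rejecting for
# large values of the sample mean.
rng = np.random.default_rng(3)
y = rng.normal(0.7, 1.0, size=20)

def log_lik(theta, y):
    return -0.5 * np.sum((y - theta) ** 2)   # Normal(theta, 1) log-likelihood

log_LR = log_lik(0.0, y) - log_lik(1.0, y)
# log LR = n/2 - n * ybar, so {LR < k} is exactly {ybar > c} for some c
print(bool(np.isclose(log_LR, len(y) / 2 - len(y) * y.mean())))  # -> True
```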
Joint Distribution for n=2
Proof
Generalised Likelihood Ratio test
Definition
Conjugate Prior Distributions
Let L(θ|x) be a likelihood function. A class Π of prior distributions is said to form a conjugate family if the posterior density
p(θ|x) ∝ p(θ)L(θ|x)
is in the class Π for all x whenever the prior density is in Π.
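A hedged worked example of conjugacy (the Beta–Binomial pair and all numbers are my own choices, not from the cards): a Beta prior for a Binomial success probability yields a Beta posterior, so the Beta family is conjugate.

```python
# Illustrative sketch (assumed model): a Beta(a, b) prior for a Binomial
# success probability theta is conjugate, since
# p(theta | x) ∝ p(theta) L(theta | x)
#             ∝ theta^(a+x-1) * (1-theta)^(b+n-x-1),
# which is again a Beta density.
a, b = 2.0, 2.0        # prior Beta(a, b)
n, x = 10, 7           # observe x successes in n trials

post_a, post_b = a + x, b + n - x
print(post_a, post_b)                  # -> 9.0 5.0, i.e. a Beta(9, 5) posterior
post_mean = post_a / (post_a + post_b)
print(round(post_mean, 3))             # -> 0.643
```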
Suppose that θ ~ N(θ0, Φ0) and that X = (X1,…,Xn), where the Xi are independent and Xi | θ ~ N(θ, Φ).
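A hedged numerical sketch of this Normal–Normal setup (the numbers are my own; the update used is the standard known-variance result): the posterior for θ is Normal, with precision equal to the sum of the prior and data precisions.

```python
# Illustrative sketch of the Normal update named above (assumed numbers):
# with prior theta ~ N(theta0, phi0) and X_i | theta ~ N(theta, phi), the
# posterior for theta is Normal with precision 1/phi0 + n/phi.
theta0, phi0 = 0.0, 4.0        # prior mean and variance
phi, n, xbar = 1.0, 9, 2.0     # data variance, sample size, sample mean

post_prec = 1 / phi0 + n / phi                  # posterior precision
post_var = 1 / post_prec
post_mean = post_var * (theta0 / phi0 + n * xbar / phi)
print(round(post_mean, 3), round(post_var, 3))  # -> 1.946 0.108
```

The posterior mean is a precision-weighted average, shrinking the sample mean toward the prior mean.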
2.2
Definition
Remark 1 of Thm 2.2
Proof
Remark 2 of Thm 2.2
Proof
Remark 3 of Thm 2.2
Proof
Remark 4 of Thm 2.2
The posterior distribution can be used to calculate probabilities concerning θ. For a given probability 1 − α, an interval A = [a1, a2] can be calculated such that P(θ ∈ A) = 1 − α. Alternatively, for a given interval B = [b1, b2], the probability β = P(θ ∈ B) can be found.
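Both calculations can be sketched for a Normal posterior (the posterior parameters and the interval B below are my own assumptions):

```python
from statistics import NormalDist

# Illustrative sketch of Remark 4 (assumed posterior): for a Normal
# posterior, find an equal-tailed interval A with P(theta in A) = 1 - alpha,
# and the probability of a fixed interval B.
post = NormalDist(mu=1.95, sigma=0.33)    # assumed Normal posterior for theta
alpha = 0.05

a1, a2 = post.inv_cdf(alpha / 2), post.inv_cdf(1 - alpha / 2)
prob_B = post.cdf(2.5) - post.cdf(1.5)    # beta = P(theta in B), B = [1.5, 2.5]
print(round(a1, 2), round(a2, 2))         # the interval A = [a1, a2]
print(round(prob_B, 3))
```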
Suppose that Φ ~ Inv-G(a, b) and that X = (X1,…,Xn), where the Xi are independent and Xi | Φ ~ N(μ, Φ).
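A hedged sketch of this known-mean setup (numbers are my own; the update applied is the standard Inverse-Gamma result for a Normal variance):

```python
import numpy as np

# Illustrative sketch of the Inverse-Gamma update named above (assumed
# numbers): with known mean mu and prior phi ~ Inv-G(a, b), the posterior
# is phi | x ~ Inv-G(a + n/2, b + (1/2) * sum((x_i - mu)^2)).
rng = np.random.default_rng(4)
mu, a, b = 0.0, 3.0, 2.0
x = rng.normal(mu, 2.0, size=50)           # true variance phi = 4

post_a = a + len(x) / 2
post_b = b + 0.5 * np.sum((x - mu) ** 2)
post_mean = post_b / (post_a - 1)          # Inverse-Gamma mean (post_a > 1)
print(round(post_mean, 1))                 # should sit near the true phi = 4
```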
2.3
Proof and Definition
Remark 1 Thm 2.3
Proof