Estimators Flashcards
If E(X) is known or assumed to be μ, what is E(X̄)?
μ (no further assumptions needed)
What is the method of moments?
Estimating a population quantity by its sample counterpart, e.g. using a sample moment as the estimate of the corresponding population moment (also called analogue estimation)
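A minimal sketch of that idea (not from the original cards, demo values assumed): for an Exponential sample E(X) = 1/rate, so equating the first population moment to the sample mean gives the method-of-moments estimate rate ≈ 1/X̄.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: Exponential with rate 2.0, so E(X) = 1/rate = 0.5.
true_rate = 2.0
x = rng.exponential(scale=1 / true_rate, size=10_000)

# Method of moments: equate the population moment E(X) = 1/rate to the
# sample moment x.mean() and solve for the parameter.
rate_hat = 1 / x.mean()
print(rate_hat)  # close to 2.0
```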
How is an estimator usually denoted?
A variable with a hat, e.g. θ̂ denotes an estimator of θ
What is required to make an interval estimate?
Knowing the sampling distribution of the estimator
What is the 95% confidence interval for the estimator of the mean of a random sample from a normal distribution with known variance σ²?
Since √n(μ̂ₙ − μ)/σ ~ N(0, 1), P(−1.96 ≤ √n(μ̂ₙ − μ)/σ ≤ 1.96) = 0.95; rearranging for μ gives the 95% CI μ̂ₙ ± 1.96·σ/√n
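A short numerical sketch of that rearranged interval, assuming σ is known (the values of mu, sigma and n below are made up for the demo):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 5.0, 2.0, 100            # assumed demo values; sigma treated as known

x = rng.normal(mu, sigma, size=n)
mu_hat = x.mean()

# From -1.96 <= sqrt(n) * (mu_hat - mu) / sigma <= 1.96, rearranged for mu:
half_width = 1.96 * sigma / np.sqrt(n)
print(mu_hat - half_width, mu_hat + half_width)  # contains mu ~95% of the time over repeated samples
```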
What is a pivotal quantity?
A function of the data and the unknown parameter(s) whose distribution is fully known (it does not depend on the unknown parameters), e.g. √n(μ̂ₙ − μ)/σ ~ N(0, 1)
What is the bias of an estimator?
The difference between its expected value and true value
bias(θ̂) = E(θ̂) − θ
Why is μ̂ₙ an unbiased estimator of μ?
If Xᵢ, i = 1, …, n, are IID N(μ, σ²), then μ̂ₙ = (1/n) Σᵢ₌₁ⁿ Xᵢ ~ N(μ, σ²/n), so E(μ̂ₙ) = μ and μ̂ₙ is unbiased
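A quick simulation of that fact (demo values assumed): the average of μ̂ₙ over many replicated samples sits close to μ, and its variance is close to σ²/n.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 3.0, 1.5, 30, 50_000        # assumed demo values

# One mu_hat per simulated sample of size n.
mu_hats = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

print(mu_hats.mean())   # close to mu, consistent with E(mu_hat_n) = mu
print(mu_hats.var())    # close to sigma**2 / n
```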
What is an example of an unbiased but implausible estimator?
N̂ = 2X̄ₙ − 1 for a random sample from the discrete uniform distribution with P(X = x) = 1/N for x = 1, 2, …, N
The expectation of the distribution is (N + 1)/2, so E(N̂) = 2(N + 1)/2 − 1 = N and the estimator is unbiased. However, the sample may contain an observation a (so N ≥ a) while the estimator still gives N̂ < a, which is clearly impossible
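A small simulation of that discrete uniform example (N, n and the number of replications are assumed demo values), checking both that N̂ averages to N and how often it falls below a value actually observed in the sample:

```python
import numpy as np

rng = np.random.default_rng(3)
N, n, reps = 100, 5, 100_000                      # assumed demo values

samples = rng.integers(1, N + 1, size=(reps, n))  # uniform on {1, ..., N}
n_hat = 2 * samples.mean(axis=1) - 1              # the unbiased but implausible estimator

print(n_hat.mean())                               # close to N, i.e. unbiased
print((n_hat < samples.max(axis=1)).mean())       # how often N_hat falls below an observed value
```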
What is an efficient estimator?
The unbiased estimator with the lowest variance
What is the mean squared error of an estimator?
MSE(θ̂) = E((θ̂ − θ)²) = Var(θ̂) + (bias(θ̂))²
It quantifies the tradeoff between bias and variability
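A numerical check of that decomposition, using the 1/n sample variance as the estimator of σ² (demo values assumed):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, n, reps = 0.0, 2.0, 10, 200_000        # assumed demo values

x = rng.normal(mu, sigma, size=(reps, n))
s2 = x.var(axis=1, ddof=0)                        # the 1/n sample variance S_n^2

theta = sigma**2
mse = np.mean((s2 - theta) ** 2)                  # E((theta_hat - theta)^2)
var_plus_bias_sq = s2.var() + (s2.mean() - theta) ** 2

print(mse, var_plus_bias_sq)                      # the two agree up to simulation noise
```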
What is the sample variance from a random sample with E(Xᵢ) = μ and Var(Xᵢ) = σ²?
Sₙ²(X₁, X₂, …, Xₙ) = (1/n) Σᵢ₌₁ⁿ (Xᵢ − X̄ₙ)², with X̄ₙ = (1/n) Σᵢ₌₁ⁿ Xᵢ, so Sₙ² = (1/n)(Σᵢ₌₁ⁿ Xᵢ² − n X̄ₙ²)
What is E(Xᵢ²)?
Var(Xᵢ) + (E(Xᵢ))² = σ² + μ²
What is E(Sₙ²)?
E(Sₙ²) = E(Xᵢ²) − E(X̄ₙ²) = (σ² + μ²) − (σ²/n + μ²) = σ²(n − 1)/n, using E(X̄ₙ²) = Var(X̄ₙ) + (E(X̄ₙ))² = σ²/n + μ²
Is the sample variance an unbiased estimator?
No, because its expectation σ²(n − 1)/n is not exactly equal to σ². However, the bias approaches zero as n approaches infinity, so it is asymptotically unbiased
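A simulation of that result (demo values assumed): the 1/n version averages to σ²(n − 1)/n, while dividing by n − 1 instead (ddof=1) removes the bias.

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, n, reps = 0.0, 3.0, 8, 200_000         # assumed demo values

x = rng.normal(mu, sigma, size=(reps, n))

print(x.var(axis=1, ddof=0).mean())   # close to sigma**2 * (n - 1) / n  (biased 1/n version)
print(x.var(axis=1, ddof=1).mean())   # close to sigma**2                (unbiased 1/(n-1) version)
print(sigma**2 * (n - 1) / n)         # theoretical expectation of the biased estimator
```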