Estimators Asymptotics and Comparisons Flashcards
What does a sequence of Bin(n, p/n) RVs converge to
A sequence of Bin(n, p/n) RVs converges in distribution to a
Poisson(p) RV as n increases
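A quick numerical sketch of this card (not part of the original deck): compare the exact Bin(n, p/n) pmf to the Poisson(p) pmf at a fixed point as n grows. The choice of p = 3 and k = 2 is arbitrary, for illustration only.

```python
import math

def binom_pmf(n, p, k):
    """Exact Bin(n, p) probability mass at k."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    """Exact Poisson(lam) probability mass at k."""
    return math.exp(-lam) * lam**k / math.factorial(k)

p = 3.0  # Poisson rate; Bin(n, p/n) needs p/n <= 1, so take n >= p
for n in (10, 100, 10_000):
    # P(X = 2) under Bin(n, p/n) approaches P(Y = 2) under Poisson(p)
    print(n, binom_pmf(n, p / n, 2), poisson_pmf(p, 2))
```

The gap between the two pmf values shrinks as n increases, which is exactly the convergence the card describes.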
Name two types of convergence
Convergence in probability and in distribution
Which type of convergence is stronger
Convergence in probability is stronger than convergence in distribution, so convergence in probability implies convergence in distribution
Which convergence implies another
Convergence in probability implies convergence in distribution
What inequality do we use to show convergence in probability
Chebyshev's inequality
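An empirical sanity check of Chebyshev's bound P(|X − μ| ≥ kσ) ≤ 1/k², not from the original deck. The Exponential(1) distribution (μ = 1, σ = 1) and k = 2 are illustrative choices.

```python
import random

random.seed(0)

# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k**2 for any RV with finite variance.
# Empirical check on Exponential(rate=1), where mu = 1 and sigma = 1.
mu, sigma, k = 1.0, 1.0, 2.0
draws = [random.expovariate(1.0) for _ in range(100_000)]
frac = sum(abs(x - mu) >= k * sigma for x in draws) / len(draws)
print(frac, "<=", 1 / k**2)  # empirical tail probability vs the Chebyshev bound
```

The empirical tail (about e^-3 ≈ 0.05 here) sits well below the bound of 0.25, as expected: Chebyshev is valid for any finite-variance distribution, so it is usually loose for a specific one.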
State the WLLN
Let X1, X2, . . . be a sequence of IID random variables with E [Xi ] = μ and Var (Xi ) = σ^2 .
Then Xbar_n = (1/n)(sum of the Xi up to n)
is convergent in probability to μ
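The WLLN can be sketched by simulation (an illustration added to this card, using Uniform(0, 1) draws with μ = 0.5 and an arbitrary tolerance ε = 0.05): the probability that Xbar_n deviates from μ by more than ε shrinks toward 0 as n grows.

```python
import random

random.seed(0)
mu, eps = 0.5, 0.05  # Uniform(0, 1) has mean 0.5

def deviation_prob(n, trials=2000):
    """Estimate P(|Xbar_n - mu| > eps) by Monte Carlo over repeated samples."""
    count = 0
    for _ in range(trials):
        xbar = sum(random.random() for _ in range(n)) / n
        count += abs(xbar - mu) > eps
    return count / trials

# The deviation probability shrinks toward 0 as n grows -- convergence in probability.
for n in (10, 100, 1000):
    print(n, deviation_prob(n))
```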
What is a consistent estimator
Means the estimator converges in probability to the population parameter. Consistency is generally regarded as bare minimum property an estimator
should satisfy.
• Note: A biased estimator may still be consistent
Is the sample mean consistent
Yes, the sample mean is consistent for μ
Define the variance of an estimator
The variance of an estimator, already defined, measures how variable the
values of the estimator are from sample to sample
Define the standard error
Let ˆθ be an estimator for θ. The standard error of ˆθ is defined as the square
root of its variance
What is the MSE of an estimator in terms of other measures
MSE = Var + Bias^2
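The decomposition MSE = Var + Bias^2 can be checked numerically (an added illustration, using the biased variance estimator that divides by n on Uniform(0, 1) samples):

```python
import random
import statistics

random.seed(0)
n, reps = 10, 20_000
true_var = 1 / 12  # variance of Uniform(0, 1)

def var_hat(xs):
    """Biased variance estimator: divides by n rather than n - 1."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Sampling distribution of the estimator over many repeated samples.
estimates = [var_hat([random.random() for _ in range(n)]) for _ in range(reps)]

mse = sum((e - true_var) ** 2 for e in estimates) / reps
var = statistics.pvariance(estimates)
bias = sum(estimates) / reps - true_var
print(mse, var + bias**2)  # the two sides agree; bias is negative, about -true_var/n
```

Note the agreement is exact (up to floating point): over the simulated sampling distribution, the average squared error decomposes algebraically into spread around the estimator's own mean plus the squared offset of that mean from the target.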
If an estimator is unbiased what can we say about the MSE
MSE = variance of the estimator, since the bias term is zero
If we sum m independent exponential RVs with rate lambda what is their distribution
Gamma(m, lambda)
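A simulation sketch of this card (illustrative values m = 5, lambda = 2, not from the deck): sums of m independent Exponential(lambda) draws should have the Gamma(m, lambda) mean m/lambda and variance m/lambda^2.

```python
import random
import statistics

random.seed(0)
m, lam, reps = 5, 2.0, 50_000

# Each sample is the sum of m independent Exponential(rate=lam) draws.
sums = [sum(random.expovariate(lam) for _ in range(m)) for _ in range(reps)]

# Gamma(m, lam) has mean m/lam and variance m/lam**2.
print(statistics.fmean(sums), m / lam)
print(statistics.pvariance(sums), m / lam**2)
```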