SU3 - Elements of Finite Sample Properties, Asymptotic Theory, Confidence Intervals and Hypothesis Testing Flashcards
An estimate is a ?, but an estimator is a ?
An estimate is a number, but an estimator is a random variable
What are finite sample properties?
Properties of an estimator when the sample size is not arbitrarily large
What is an unbiased estimator?
An estimator that does not over or under-estimate the true population parameter
If we could draw repeated random samples of Y from the population and compute an estimate from each sample, the average of these estimates would be close to θ.
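A minimal simulation sketch of this idea, assuming NumPy and an arbitrarily chosen normal population with true mean θ = 5:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0  # true population mean (assumed for illustration)

# Draw many repeated random samples and compute the sample mean each time.
estimates = [rng.normal(theta, 2.0, size=50).mean() for _ in range(10_000)]

# Unbiasedness: the average of the repeated estimates is close to theta.
avg_estimate = np.mean(estimates)
print(avg_estimate)
```

Each individual estimate misses θ, but their average across repeated samples sits very close to it.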
What is i.i.d?
Independently and identically distributed
Var(Ȳ) = σ²/n, i.e. the population variance divided by the sample size (for an i.i.d. sample)
What happens to Var (Y bar) when sample size (n) increases?
Var(Ȳ) goes towards 0.
Increasing sample size will lead to a decrease in sampling variance of the sample mean
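A quick check of Var(Ȳ) = σ²/n by simulation, assuming NumPy and an arbitrarily chosen population with σ² = 4:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0  # population variance (assumed)

def sampling_variance(n, reps=20_000):
    # Empirical variance of the sample mean over many repeated samples.
    means = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n)).mean(axis=1)
    return means.var()

v10, v100 = sampling_variance(10), sampling_variance(100)
print(v10, v100)  # roughly sigma2/10 and sigma2/100
```

Multiplying n by 10 divides the sampling variance of Ȳ by 10.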
If there are two unbiased estimators, how to see which is more efficient?
Relative efficiency is used. Smaller variance = more efficient
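A sketch of relative efficiency, assuming NumPy and a normal population, where both the sample mean and the sample median are unbiased for μ but the mean has the smaller variance:

```python
import numpy as np

rng = np.random.default_rng(2)

# 20,000 repeated samples of size 100 from N(0, 1).
samples = rng.normal(0.0, 1.0, size=(20_000, 100))

# Two unbiased estimators of mu: sample mean and sample median.
var_mean = samples.mean(axis=1).var()
var_median = np.median(samples, axis=1).var()
print(var_mean, var_median)  # the mean has the smaller variance
```

Since Var(mean) < Var(median) here, the sample mean is the relatively more efficient estimator for normal data.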
What is used to compare estimators that are biased?
Mean Squared Error (MSE), where MSE = Variance + Bias².
A large MSE means the variance, the bias, or both are large.
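A numerical sketch of the MSE decomposition, assuming NumPy and a deliberately biased estimator (the sample mean shrunk by an arbitrary factor 0.9):

```python
import numpy as np

rng = np.random.default_rng(3)
mu = 2.0  # true parameter (assumed)

# A deliberately biased estimator: shrink the sample mean toward zero.
estimates = 0.9 * rng.normal(mu, 1.0, size=(50_000, 25)).mean(axis=1)

bias = estimates.mean() - mu          # roughly 0.9*mu - mu = -0.2
variance = estimates.var()
mse = np.mean((estimates - mu) ** 2)  # equals variance + bias**2
print(bias, variance, mse)
```

The identity MSE = Variance + Bias² holds exactly in the sample, which is why MSE lets us trade off bias against variance when comparing biased estimators.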
what are asymptotic properties?
Statistical properties of an estimator when n is arbitrarily large.
It enables us to answer the following questions:
Does the variance of some unbiased estimator decrease as n↑?
Does the estimator become more precise as n↑?
Does the bias of an estimator shrink towards zero as n↑?
What is the distribution of the estimator when n is large?
What is consistency?
θ̂n is consistent for θ if P(|θ̂n − θ| > ε) → 0 as n → ∞, for every ε > 0.
In other words, if θ̂n is consistent, it becomes ever less probable that the error |θ̂n − θ| exceeds ε as n grows.
If θ̂n is unbiased and Var(θ̂n) → 0 as n → ∞, then θ̂n is consistent.
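A simulation sketch of consistency, assuming NumPy, the sample mean as the estimator, and an arbitrary tolerance ε = 0.1:

```python
import numpy as np

rng = np.random.default_rng(4)
theta, eps = 0.0, 0.1  # true value and tolerance (assumed)

def error_prob(n, reps=5_000):
    # Fraction of repeated samples where the sample mean misses theta
    # by more than eps: an estimate of P(|theta_hat_n - theta| > eps).
    means = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
    return np.mean(np.abs(means - theta) > eps)

probs = [error_prob(n) for n in (10, 100, 1000)]
print(probs)  # shrinks toward 0 as n grows
```

The estimated error probability falls toward zero as n increases, matching P(|θ̂n − θ| > ε) → 0.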
what is plim?
“probability limit” of an estimator. It is the value that the estimator converges to in probability when the sample size becomes arbitrarily large.
What is Slutsky's theorem?
If plim(Tn) = α and plim(Un) = β, then:
– plim(Tn + Un) = plim(Tn) + plim(Un) = α + β
– plim(Tn · Un) = αβ
– plim(Tn / Un) = α/β, provided β ≠ 0.
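A numerical sketch, assuming NumPy, with Tn and Un taken as sample means whose plims are α = 2 and β = 4:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# T_n and U_n are sample means with plims alpha = 2 and beta = 4.
T_n = rng.normal(2.0, 1.0, size=n).mean()
U_n = rng.normal(4.0, 1.0, size=n).mean()

# For large n these sit near alpha + beta, alpha*beta, alpha/beta.
print(T_n + U_n, T_n * U_n, T_n / U_n)
```

At this sample size the sum, product, and ratio are already very close to 6, 8, and 0.5.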
What is the central limit theorem?
the sample average, when standardized, has an asymptotically standard normal distribution, provided the population mean and variance exist
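A CLT sketch, assuming NumPy and a deliberately non-normal Exponential(1) population (mean 1, variance 1):

```python
import numpy as np

rng = np.random.default_rng(6)
n, reps = 200, 50_000

# Exponential(1) population: mean 1, variance 1 -- clearly non-normal.
means = rng.exponential(1.0, size=(reps, n)).mean(axis=1)

# Standardize the sample mean: subtract the mean, divide by sigma/sqrt(n).
z = (means - 1.0) / (1.0 / np.sqrt(n))

# If the CLT holds, z is approximately standard normal.
cov = np.mean(np.abs(z) <= 1.96)
print(z.mean(), z.std(), cov)
```

Even though each draw is skewed, the standardized sample mean has mean ≈ 0, standard deviation ≈ 1, and about 95% of its mass inside ±1.96.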
what is weak law of large numbers?
plim(Ȳn) = μ
In other words, LLN states that the sample average converges in probability to its expectation.
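A running-mean sketch of the LLN, assuming NumPy and an arbitrarily chosen population with μ = 3:

```python
import numpy as np

rng = np.random.default_rng(7)
mu = 3.0  # population mean (assumed)

draws = rng.normal(mu, 2.0, size=100_000)

# Running sample average after 1, 2, ..., 100000 observations.
running_mean = draws.cumsum() / np.arange(1, draws.size + 1)

# The sample average drifts toward mu as n grows.
print(running_mean[9], running_mean[999], running_mean[-1])
```

Early averages bounce around, but the running mean settles onto μ, illustrating plim(Ȳn) = μ.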
Difference between consistency and asymptotic normality?
Consistency refers to the convergence of an estimator in probability to a single point. Asymptotic normality refers to the convergence of the distribution of an estimator to the normal distribution.
What is an estimator?
An estimator of θ is a rule that assigns a value of θ to each possible outcome of the sample. It is a function of random variables; plugging the sample into an estimator gives us an estimate.