Estimation and MLE Flashcards
What characteristics do the r.v.s need to have before we can estimate?
- Observations are identically distributed if they all have the same probability distribution f(x; θ) (with the same parameter).
- The observations are independent if the joint distribution can be written f(x1, …, xn) = f(x1) · … · f(xn); in practice this means that information about the value of one observation does not influence the probability distribution of the others.
If the observations are i.i.d., they form a random sample. All conceivable observations together are called the population. The typical situation is that, based on the information in a random sample X1, X2, …, Xn, we wish to say something about the whole population, for instance by estimating the parameter θ.
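A minimal sketch of this setup (the normal population and the parameter values are illustrative assumptions, not from the cards):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
mu, sigma, n = 5.0, 2.0, 10          # illustrative parameter values
x = rng.normal(mu, sigma, size=n)    # an i.i.d. random sample X1, ..., Xn

# Independence: the joint density is the product of the marginals,
# each with the same f(x; theta) because the sample is identically distributed.
joint = np.prod(norm.pdf(x, loc=mu, scale=sigma))
print(joint)
```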
Three common estimators: the sample mean X̄ (estimates μ), the sample variance S² (estimates σ²), and the sample proportion p̂ (estimates p).
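A short sketch computing these three from data (the samples, and the 0/1 variable used for the proportion, are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(5.0, 2.0, size=20)   # numeric sample
b = rng.integers(0, 2, size=20)     # 0/1 sample for a proportion

xbar = x.mean()                     # sample mean, estimates mu
s2 = x.var(ddof=1)                  # sample variance (divisor n-1), estimates sigma^2
phat = b.mean()                     # sample proportion, estimates p
print(xbar, s2, phat)
```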
When is an estimator unbiased? An estimator θ̂ is unbiased if E[θ̂] = θ for all values of θ, i.e. on average it neither over- nor underestimates the parameter.
What do we wish that an estimator fulfills? It should be unbiased, and among unbiased estimators we prefer the one with the smallest variance; a simulation check of both properties is sketched below.
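A Monte Carlo check under an assumed normal population (all numbers illustrative): unbiasedness means the average of many estimates lands on the true value, and the spread of the estimates is the estimator's variance.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 5.0, 2.0, 10, 100_000   # illustrative values

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)              # sample mean of every sample
s2 = samples.var(axis=1, ddof=1)         # sample variance, divisor n-1

print(xbar.mean())   # ~ 5.0  (unbiased for mu)
print(s2.mean())     # ~ 4.0  (unbiased for sigma^2)
print(xbar.var())    # ~ sigma^2 / n = 0.4 (small variance is desirable)
```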
MLE
- Define the likelihood function: L(θ) = f(x1; θ) · … · f(xn; θ).
- Take ln(): l(θ) = ln L(θ). (Remember that factors common to all n observations end up raised to the power n.)
- Take the derivative w.r.t. θ: l′(θ).
- Set equal to zero and solve w.r.t. θ: l′(θ̂) = 0.
- Check that l″(θ) < 0 for θ = θ̂.
In step 1, remember to take the product over all relevant variables; factors that do not depend on the observation index come out raised to the power n, and a product of exp() factors becomes an ordinary sum inside the exponent. See the worked example below.
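As a worked illustration (the exponential model is my own choice of example), the recipe applied to an i.i.d. sample x1, …, xn from f(x; λ) = λ e^(−λx), λ > 0:

```latex
L(\lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i}
           = \lambda^{n} e^{-\lambda \sum_{i=1}^{n} x_i}
           % the constant factor lambda is raised to the power n,
           % and the product of exp() terms becomes a sum in the exponent

l(\lambda) = \ln L(\lambda) = n \ln \lambda - \lambda \sum_{i=1}^{n} x_i

l'(\lambda) = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0
    \implies \hat{\lambda} = \frac{n}{\sum_{i=1}^{n} x_i} = \frac{1}{\bar{x}}

l''(\lambda) = -\frac{n}{\lambda^{2}} < 0 \text{ for all } \lambda > 0,
    \text{ so } \hat{\lambda} = 1/\bar{x} \text{ is indeed a maximum.}
```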
Uncertainty in estimates
A (point-)estimator θ̂ gives an estimate of an unknown parameter value, but does not give any direct information about the uncertainty in the estimate.
Can we assume that the estimated value is very close to the true value, or is there reason to fear that it may be far off?
We can say something about this for instance by making confidence intervals, but to construct confidence intervals we need to know the distribution of the estimator.
To find the distribution of different estimators (which are always functions of the random variables X1, …, Xn), we need to know something about the distribution of functions of random variables; a simulation sketch is given below.
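A hedged sketch of the idea (normal population, known σ, and the 95% level are illustrative assumptions): for a normal random sample the estimator X̄ has distribution N(μ, σ²/n), and knowing that distribution is exactly what lets us build a confidence interval for μ.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, sigma, n = 5.0, 2.0, 25          # illustrative population parameters

x = rng.normal(mu, sigma, size=n)    # one random sample
xbar = x.mean()                      # the estimator, X-bar ~ N(mu, sigma^2/n)

# 95% confidence interval for mu, using the known sampling distribution
# of X-bar (sigma assumed known for simplicity).
z = norm.ppf(0.975)
half_width = z * sigma / np.sqrt(n)
print(xbar - half_width, xbar + half_width)
```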