Chapter 5 Flashcards
What is the purpose of optimization?
To find the minimum of a scalar cost function
What are the requirements for parameter estimation?
- Observations
- A mathematical model of the measurement
- Assumptions about the stochastic characteristics of the observations
What are properties of estimators and what do they mean?
Bias: Deviation of the estimator's expected value from the true value
Covariance: Dispersion of the estimator around its expected value
Mean Square Error (MSE): Captures the effects of both bias and covariance
Consistency
Asymptotic Normality
Asymptotic Efficiency
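The relationship between MSE, bias, and variance can be checked numerically. This is a minimal sketch, assuming Gaussian samples and the biased (divide-by-n) variance estimator as the example estimator; it verifies the identity MSE = Var + Bias² over repeated trials:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, true_var = 20, 100_000, 1.0

# Apply the biased variance estimator (divides by n) to many independent samples
samples = rng.normal(0.0, 1.0, size=(trials, n))
est = samples.var(axis=1)  # ddof=0 -> divides by n

bias = est.mean() - true_var          # deviation of the mean estimate from truth
variance = est.var()                  # dispersion of the estimator
mse_direct = np.mean((est - true_var) ** 2)

# MSE decomposes exactly into variance plus squared bias
assert np.isclose(mse_direct, variance + bias**2)
```

The biased estimator systematically underestimates the variance (its expectation is (n-1)/n times the true value), so `bias` comes out negative here.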
What is consistency in estimators?
If an estimator converges in probability to the true parameter value as the number of samples increases, it is called consistent. A consistent estimator is asymptotically unbiased.
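Convergence in probability can be illustrated with a Monte Carlo experiment. A sketch, assuming the sample mean of Gaussian data as the estimator and an arbitrary tolerance eps = 0.1: the fraction of runs where the estimate misses the true value by more than eps should shrink as the sample size grows.

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean, eps, trials = 3.0, 0.1, 2_000

# Empirical P(|mean_n - mu| > eps) for increasing sample sizes n
probs = []
for n in (10, 100, 1_000):
    means = rng.normal(true_mean, 1.0, size=(trials, n)).mean(axis=1)
    probs.append(np.mean(np.abs(means - true_mean) > eps))
```

For these settings the miss probability drops from roughly 0.75 at n = 10 to nearly zero at n = 1000.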
What is Asymptotic normality?
“The more samples we use, the smaller the confidence intervals on the parameters will be, i.e. the ‘surer’ one can be about them”
What is asymptotic efficiency?
“An asymptotically efficient estimator makes the most of the available data, i.e. achieves the lowest possible parameter variances of all estimators”
What is the basic idea of the maximum likelihood methods?
Adjust the parameters so that the probability of obtaining the observed set of measurements is maximized
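A minimal sketch of this idea, assuming Gaussian measurements with known standard deviation sigma = 2 and an unknown mean mu: write down the negative log-likelihood of the sample and search for the mu that minimizes it (a crude grid search stands in for a proper optimizer).

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 2.0
data = rng.normal(5.0, sigma, size=1_000)  # hypothetical measurements

# Negative log-likelihood of the whole sample as a function of mu
def nll(mu):
    return (np.sum((data - mu) ** 2) / (2 * sigma**2)
            + len(data) * np.log(sigma * np.sqrt(2 * np.pi)))

# Grid search over candidate parameter values
mus = np.linspace(4.0, 6.0, 2001)
mu_hat = mus[np.argmin([nll(m) for m in mus])]
```

For this particular model the maximum likelihood estimate coincides with the sample mean, which gives a handy sanity check on the numerical result.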
What are the asymptotic properties of the maximum likelihood estimates?
- Asymptotically consistent
- Asymptotically normal
- Asymptotically efficient
What are the Three Estimation Models?
- Bayesian
- Fisher
- Least Squares
For a multivariate cost function, what are the requirements for a vector to be a minimum?
- The gradient (first derivative) is zero
- The Hessian is positive definite
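These two conditions can be checked numerically. A sketch, assuming the hypothetical cost function f(x, y) = x² + 3y²: its gradient (2x, 6y) vanishes at (0, 0), and positive definiteness of the (symmetric) Hessian is equivalent to all eigenvalues being positive.

```python
import numpy as np

# Hessian of f(x, y) = x^2 + 3y^2 (constant, since f is quadratic)
hessian = np.array([[2.0, 0.0],
                    [0.0, 6.0]])

# Positive definite <=> all eigenvalues > 0 (eigvalsh handles symmetric matrices)
eigvals = np.linalg.eigvalsh(hessian)
is_minimum = bool(np.all(eigvals > 0))
```

Both eigenvalues (2 and 6) are positive, so the stationary point at the origin is indeed a minimum.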
What is an estimator?
An estimator is a rule or method for calculating an estimate of a given quantity based on observed data.
What are the parameters and residuals of the Fisher Model?
- \theta is a vector of unknown but constant parameters
- r is a random vector with probability density p(r) and covariance matrix Cov[r] = B
What are the parameters and residuals of the Bayesian Model?
- \theta is a vector of random variables with probability density p(\theta)
- r is a random vector with probability density p(r) and covariance matrix Cov[r] = B
What are the parameters and residuals of the Least-Squares Model?
- \theta is a vector of unknown but constant parameters
- r is a random vector of noise. No assumption is made about p(r)
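A minimal sketch of the least-squares model, assuming a hypothetical linear measurement model y = H\theta + r: the estimator minimizes the squared residual norm and never uses the noise distribution (here deliberately uniform rather than Gaussian).

```python
import numpy as np

rng = np.random.default_rng(3)

# Linear measurement model y = H theta + r
theta_true = np.array([2.0, -1.0])
H = rng.normal(size=(200, 2))
r = rng.uniform(-0.1, 0.1, size=200)  # noise pdf is never used by the estimator
y = H @ theta_true + r

# Least-squares estimate minimizes ||y - H theta||^2
theta_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
```

With 200 measurements and small noise, `theta_hat` lands very close to `theta_true` even though no stochastic assumptions about r were made.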
What is a good way of comparing two estimators?
Comparing the mean square error (MSE)
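Such a comparison can be run numerically. A sketch, assuming Gaussian data and comparing the biased (1/n) and unbiased (1/(n-1)) variance estimators: for Gaussian samples the biased estimator is known to have the smaller MSE, trading a little bias for less variance.

```python
import numpy as np

rng = np.random.default_rng(4)
n, trials, true_var = 10, 200_000, 1.0
x = rng.normal(0.0, 1.0, size=(trials, n))

# MSE of each estimator, averaged over many independent samples
mse_biased = np.mean((x.var(axis=1, ddof=0) - true_var) ** 2)    # divides by n
mse_unbiased = np.mean((x.var(axis=1, ddof=1) - true_var) ** 2)  # divides by n-1
```

So an unbiased estimator is not automatically the better one: the MSE comparison can favor a biased estimator with lower variance.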