Chapter 5 Flashcards

1
Q

What is the purpose of optimization?

A

To find the minimum of a scalar cost function
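
A minimal sketch of this idea (the quadratic cost and the use of scipy.optimize are illustrative assumptions, not from the chapter):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical scalar cost function J(theta): a quadratic bowl
# with its minimum at theta = (1, -2).
def cost(theta):
    return (theta[0] - 1.0) ** 2 + (theta[1] + 2.0) ** 2

# Start from an arbitrary initial guess and search for the minimum.
result = minimize(cost, x0=np.array([0.0, 0.0]))
print(result.x)    # ~ [ 1. -2.]
print(result.fun)  # ~ 0.0 (cost at the minimum)
```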

2
Q

What are the requirements for parameter estimation?

A
  • Observations
  • A mathematical model of the measurement process
  • Assumptions about the stochastic characteristics of the observations
3
Q

What are properties of estimators and what do they mean?

A

Bias: deviation of the estimator's expected value from the true value

Covariance: dispersion of the estimator around its expected value

Mean Square Error: captures the effects of both bias and covariance (see the decomposition below)

Consistency: the estimator converges in probability to the true value as the number of samples increases

Asymptotic Normality: the distribution of the estimate approaches a normal distribution as the number of samples increases

Asymptotic Efficiency: the estimator achieves the lowest possible parameter variances as the number of samples increases
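
The standard scalar decomposition tying bias, variance, and MSE together (a known identity, stated here for reference):

```latex
% MSE splits into variance (dispersion) plus squared bias:
\mathrm{MSE}(\hat\theta)
  = \mathbb{E}\big[(\hat\theta - \theta)^2\big]
  = \mathrm{Var}(\hat\theta) + \big(\mathbb{E}[\hat\theta] - \theta\big)^2
```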

4
Q

What is consistency in estimators?

A

If an estimator converges in probability to the true parameter value as the number of samples increases, it is called consistent. A consistent estimator is asymptotically unbiased.

5
Q

What is asymptotic normality?

A

As the number of samples increases, the distribution of the estimate approaches a normal distribution. In practice: “The more samples we use, the smaller the confidence intervals on the parameters will be, i.e. the ‘surer’ one can be about them.”

6
Q

What is asymptotic efficiency?

A

“An asymptotically efficient estimator makes the most of the available data, i.e. achieves the lowest possible parameter variances of all estimators”

7
Q

What is the basic idea of the maximum likelihood methods?

A

Adjust the parameters so that the probability (likelihood) of obtaining the observed set of measurements is maximized.
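
A minimal numerical sketch of this idea (assuming Gaussian-distributed measurements; the simulated data and scipy calls are illustrative):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
z = rng.normal(loc=3.0, scale=0.5, size=200)  # simulated measurements

# Negative log-likelihood of the data under parameters (mean, std).
def neg_log_likelihood(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    return -np.sum(norm.logpdf(z, loc=mu, scale=sigma))

# Maximizing the likelihood == minimizing the negative log-likelihood.
mle = minimize(neg_log_likelihood, x0=[0.0, 1.0], method="Nelder-Mead")
print(mle.x)  # ~ [3.0, 0.5]
```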

8
Q

What are the asymptotic properties of the maximum likelihood estimates?

A
  1. Asymptotically consistent
  2. Asymptotically normal
  3. Asymptotically efficient
9
Q

What are the Three Estimation Models?

A
  • Bayesian
  • Fisher
  • Least Squares
10
Q

For a multivariate cost function, what are the requirements for a vector to be a minimum?

A
  1. The gradient (first derivative) is zero
  2. The Hessian is positive definite (see the numerical check below)
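
A small numerical check of both conditions (the cost J(x, y) = x^2 + 2y^2 is a made-up example):

```python
import numpy as np

# Hypothetical cost J(x, y) = x^2 + 2y^2 with candidate minimum at the origin.
candidate = np.array([0.0, 0.0])

# Condition 1: the gradient vanishes at the candidate.
grad = np.array([2.0 * candidate[0], 4.0 * candidate[1]])

# Condition 2: the Hessian is positive definite, i.e. all eigenvalues > 0.
hess = np.array([[2.0, 0.0],
                 [0.0, 4.0]])

is_minimum = np.allclose(grad, 0.0) and np.all(np.linalg.eigvalsh(hess) > 0)
print(is_minimum)  # True
```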
11
Q

What is an estimator?

A

An estimator is a rule or method for calculating an estimate of a given quantity based on observed data.

12
Q

What are the parameters and residuals of the Fisher Model?

A
  • \theta is a vector of unknown but constant parameters
  • r is a random vector with probability density p(r) and covariance matrix Cov[r] = B
13
Q

What are the parameters and residuals of the Bayesian Model?

A
  • \theta is a vector of random variables with probability density p(\theta)
  • r is a random vector with probability density p(r) and covariance matrix Cov[r] = B
14
Q

What are the parameters and residuals of the Least-Square Model?

A
  • \theta is a vector of unknown but constant parameters
  • r is a random vector of noise; no assumption is made about p(r)
15
Q

What is a good way of comparing two estimators?

A

Comparing their mean square errors (MSE), since the MSE captures the effects of both bias and covariance.
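
A Monte Carlo sketch of such a comparison (both estimators and the simulation setup are illustrative): the shrunken mean is biased but can still achieve a lower MSE.

```python
import numpy as np

rng = np.random.default_rng(1)
true_mean, n_trials, n_samples = 0.5, 10_000, 10

# Draw many datasets and apply both estimators to each.
data = rng.normal(true_mean, 1.0, size=(n_trials, n_samples))
est_a = data.mean(axis=1)        # unbiased sample mean
est_b = 0.9 * data.mean(axis=1)  # shrunken mean: biased, lower variance

# Empirical MSE = average squared deviation from the true value.
mse_a = np.mean((est_a - true_mean) ** 2)  # ~ 0.10
mse_b = np.mean((est_b - true_mean) ** 2)  # ~ 0.084: the biased estimator wins here
print(mse_a, mse_b)
```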

16
Q

How does the Bayesian Estimator work?

A

The probability densities of the parameters p(\theta) and of the residuals p(r) are assumed to be known a priori; the conditional probability density p(\theta|z) is then maximized.

17
Q

How does the Fisher Estimator work?

A

Estimator based on the concept of the likelihood function; the probability of obtaining the measurements z, given a set of parameters \theta, is maximized.

18
Q

How does the Least Squares Estimator work?

A

The (best) estimate for the parameters is obtained by minimizing a (weighted) sum of squared residuals; no assumptions about the probability densities of \theta and the residuals p(r) are made.
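
A minimal sketch of a weighted least-squares fit (the linear model z = A\theta + r and the equal weights are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Linear measurement model z = A @ theta + r with true theta = (1.0, -0.5).
A = np.column_stack([np.ones(50), np.linspace(0, 1, 50)])
theta_true = np.array([1.0, -0.5])
z = A @ theta_true + rng.normal(0.0, 0.1, size=50)

# Weight matrix W (here: equal weights; in general, e.g., inverse variances).
W = np.eye(50)

# Weighted least squares: minimize (z - A theta)^T W (z - A theta),
# solved via the normal equations (A^T W A) theta = A^T W z.
theta_hat = np.linalg.solve(A.T @ W @ A, A.T @ W @ z)
print(theta_hat)  # ~ [ 1.0, -0.5]
```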

19
Q

What is Bayes' rule?

A

p(\theta|z) = [p(z|\theta) p(\theta)] / p(z)
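
A worked discrete illustration of the rule (the two-hypothesis numbers are made up):

```python
# Discrete Bayes' rule: p(theta | z) = p(z | theta) p(theta) / p(z)
prior = {"theta_1": 0.7, "theta_2": 0.3}       # p(theta)
likelihood = {"theta_1": 0.2, "theta_2": 0.9}  # p(z | theta) for the observed z

# Evidence p(z) = sum over theta of p(z | theta) p(theta)
evidence = sum(likelihood[t] * prior[t] for t in prior)

posterior = {t: likelihood[t] * prior[t] / evidence for t in prior}
print(posterior)  # {'theta_1': ~0.341, 'theta_2': ~0.659}
```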