Chapter 6 Flashcards
What are the four types of minima?
- Global Minimum
- Strict Global Minimum
- Local Minimum
- Strict Local Minimum
What are the three types of cost functions?
- Linear
- Quadratic
- Non-Linear
What are the four types of constraints?
- Unconstrained
- Non-Linear Equality Constr.
- Non-Linear Inequality Constr.
- Box Constraints
What is the feasible set in optimization?
The set of all points that satisfy every constraint; any candidate solution must lie in this region
What is the Newton Raphson Algorithm?
An iterative optimization algorithm that solves the non-linear equation resulting from the first-order condition that the cost-function gradient vanishes at a minimum
What are limitations of the Newton Raphson Algorithm?
- Assumes convexity of cost function
- Strongly dependent on starting values
What are possible remedies to limitations of Newton Raphson?
Step size control and BFGS Hessian approximation
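The remedy of step-size control can be combined with the basic Newton-Raphson iteration; below is a minimal sketch, assuming `grad` and `hess` are user-supplied callables for the cost function's gradient and Hessian (the example cost function at the bottom is illustrative only).

```python
import numpy as np

def newton_raphson(grad, hess, x0, tol=1e-8, max_iter=50):
    """Minimize a cost function by iteratively solving grad(x) = 0,
    with simple step-size halving as a safeguard."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Full Newton step: solve H * dx = -g
        dx = np.linalg.solve(hess(x), -g)
        # Step-size control: halve the step until the gradient norm decreases
        alpha = 1.0
        while alpha > 1e-4 and np.linalg.norm(grad(x + alpha * dx)) >= np.linalg.norm(g):
            alpha *= 0.5
        x = x + alpha * dx
    return x

# Illustrative quadratic cost: f(x) = (x0 - 1)^2 + 2*(x1 + 3)^2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 3.0)])
hess = lambda x: np.diag([2.0, 4.0])
x_min = newton_raphson(grad, hess, [10.0, 10.0])
```

For a convex quadratic cost the full step is accepted and the method converges in one iteration; the halving loop only kicks in when the full step would overshoot.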
What is output sensitivity?
A measure of the influence of a parameter on the model output, i.e. the partial derivative of the output with respect to that parameter
What are the Pros and Cons of Sensitivity Equations?
Pros:
- Exact, no approximation error
- Relatively fast
Cons:
- All analytical derivatives of the model have to be computed
- Only applicable to differentiable model functions
- Not flexible with changes in model structure
- Decrease in computational time might be negligible for complex, non-linear systems
What are the pros and cons of finite differences?
Pros:
- Flexible
- Easy to implement since no analytic derivatives need to be calculated
Cons:
- Inexact due to round off errors
- Big computational burden in case of many parameters
- A bad choice of perturbation size can deteriorate accuracy
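A central-difference sensitivity approximation can be sketched as below; `model` is an assumed callable mapping a parameter vector to an output vector. Note the two cons from the list above: two model evaluations per parameter, and a result that depends on the perturbation size `eps`.

```python
import numpy as np

def finite_diff_sensitivity(model, theta, eps=1e-6):
    """Approximate the output sensitivities d(output)/d(theta_j)
    column by column with central differences."""
    theta = np.asarray(theta, dtype=float)
    y0 = np.asarray(model(theta))
    S = np.empty((y0.size, theta.size))
    for j in range(theta.size):
        dt = np.zeros_like(theta)
        dt[j] = eps
        # Two model evaluations per parameter
        S[:, j] = (np.asarray(model(theta + dt))
                   - np.asarray(model(theta - dt))) / (2.0 * eps)
    return S

# Illustrative model: y = [theta0^2, theta0*theta1]
# Exact Jacobian at (2, 3): [[4, 0], [3, 2]]
model = lambda t: np.array([t[0] ** 2, t[0] * t[1]])
S = finite_diff_sensitivity(model, [2.0, 3.0])
```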
What are the basic steps for implementing maximum likelihood methods?(5)
- Choose starting value
- Compute system response
- Compute residuals and residual covariance matrix
- Compute parameter update with one of the non-linear optimization algorithms
- Iterate until convergence
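The five steps above can be sketched on a toy static model y = theta * u with a Gauss-Newton parameter update; this is a hypothetical scalar example for illustration, not the full dynamic-system machinery.

```python
import numpy as np

def output_error_fit(u, z, theta0, n_iter=20):
    """Five-step maximum likelihood loop for the toy model y = theta * u."""
    theta = float(theta0)                  # 1. choose starting value
    for _ in range(n_iter):
        y = theta * u                      # 2. compute system response
        v = z - y                          # 3. compute residuals ...
        R = np.mean(v ** 2)                #    ... and residual covariance
        S = u                              # output sensitivity dy/dtheta
        # 4. Gauss-Newton parameter update
        #    (R cancels in this scalar update, so it is not used here)
        dtheta = (S @ v) / (S @ S)
        theta += dtheta
        if abs(dtheta) < 1e-10:            # 5. iterate until convergence
            break
    return theta

u = np.linspace(0.0, 1.0, 50)
z = 2.5 * u                                # noise-free data for the demo
theta_hat = output_error_fit(u, z, theta0=0.0)
```

Because the toy model is linear in theta, a single Gauss-Newton step already reaches the optimum; non-linear models need the full iteration.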
What are three methods for computing parameter updates for maximum likelihood methods?
- Output error method
- Filter error method
- Equation error method
What two things are used to maximize the maximum likelihood function?
Parameters and Residual Covariance Matrix
What is the iterative two step approach for maximizing the maximum likelihood function?
- Optimize the residual covariance matrix with the parameters fixed
- Optimize the parameters with the residual covariance matrix fixed
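For the covariance step of the two-step approach there is a closed form: with the parameters held fixed, R = (1/N) * sum(v_k v_k^T) maximizes the likelihood, and the cost minimized in the other step keeps R fixed. A minimal sketch, where `residuals` is an assumed (N, ny) array of output residuals v_k = z_k - y_k:

```python
import numpy as np

def residual_covariance(residuals):
    """Closed-form ML covariance estimate R = (1/N) * sum(v_k v_k^T),
    obtained with the parameters held fixed."""
    v = np.asarray(residuals, dtype=float)
    return v.T @ v / v.shape[0]

def ml_cost(residuals, R):
    """Negative log-likelihood (up to constants) with R held fixed:
    J = 0.5 * sum(v_k^T R^-1 v_k) + 0.5 * N * ln(det(R)).
    The other step minimizes this over the parameters."""
    v = np.asarray(residuals, dtype=float)
    R_inv = np.linalg.inv(R)
    return 0.5 * np.sum((v @ R_inv) * v) + 0.5 * v.shape[0] * np.log(np.linalg.det(R))

# Illustrative two-sample, two-output residual sequence
v = np.array([[1.0, 0.0], [0.0, 1.0]])
R = residual_covariance(v)
```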
What are the two approaches for computing output sensitivities?
- Finite difference approximations
- Solutions to Sensitivity Equations
How are solutions to non-linear model equations computed?
Multi-step integration methods
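The simplest example of such a multi-step scheme is the two-step Adams-Bashforth method, sketched below under the assumption that the model is given as dx/dt = f(t, x); the first step is bootstrapped with forward Euler because the multi-step formula needs two previous derivative evaluations.

```python
import numpy as np

def adams_bashforth2(f, x0, t0, t_end, h):
    """Integrate dx/dt = f(t, x) with the explicit two-step
    Adams-Bashforth method (second-order accurate)."""
    ts = [t0]
    xs = [np.asarray(x0, dtype=float)]
    f_prev = np.asarray(f(t0, xs[0]))
    # Bootstrap: one forward Euler step supplies the second start value
    xs.append(xs[0] + h * f_prev)
    ts.append(t0 + h)
    while ts[-1] < t_end - 1e-12:
        f_curr = np.asarray(f(ts[-1], xs[-1]))
        # AB2 formula: x_{k+1} = x_k + h * (3/2 f_k - 1/2 f_{k-1})
        xs.append(xs[-1] + h * (1.5 * f_curr - 0.5 * f_prev))
        ts.append(ts[-1] + h)
        f_prev = f_curr
    return np.array(ts), np.array(xs)

# Illustrative test problem: dx/dt = -x, x(0) = 1  =>  x(t) = exp(-t)
ts, xs = adams_bashforth2(lambda t, x: -x, [1.0], 0.0, 1.0, 0.01)
```

Multi-step methods reuse past derivative evaluations, so each step costs only one new evaluation of f, which matters when f is an expensive non-linear model.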
What are the assumptions of the output error method?
- No process noise
- Only additive, independent gaussian measurement noise
- Inputs are measurable without error, exogenous, i.e. independent of the system outputs (no feedback controller!), and sufficiently varied to excite the various modes of the dynamical system
What kind of noise does the output error method consider?
additive, white, and Gaussian measurement noise
What can be problematic for output error methods?
- Correlated parameters
- Data with significant process noise
What are the assumptions for the equation error method?
- Inputs measurable without error
- All states measurable without error
- All state derivatives measurable with additive measurement noise
What is the key property of the equation error method?
It is NOT necessary to integrate any state equations; the problem becomes static!