Unconstrained Optimization Flashcards
What is optimization?
Write out the general equation for the gradient of a function f.
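For reference, a minimal sketch of the usual textbook definition (assuming f maps R^n to R; the exact notation used in class may differ):

\[ \nabla f(x) = \begin{bmatrix} \dfrac{\partial f}{\partial x_1} & \dfrac{\partial f}{\partial x_2} & \cdots & \dfrac{\partial f}{\partial x_n} \end{bmatrix}^T \]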
Write the general equation for the Hessian of a function f.
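Again for reference, a sketch of the usual definition: entry (i, j) is a second partial derivative of f, and the matrix is symmetric when f is twice continuously differentiable.

\[ \nabla^2 f(x) = H(x), \qquad H_{ij}(x) = \frac{\partial^2 f}{\partial x_i \, \partial x_j} \]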
What are the general steps to find the minimum (or maximum) of a function of one variable?
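A sketch of the usual first- and second-order checks for one variable (assuming f is twice differentiable; the class recipe may list the steps differently): set f'(x) = 0, solve for the stationary points x*, then classify each one with the second derivative.

\[ f'(x^*) = 0, \qquad f''(x^*) > 0 \;\Rightarrow\; \text{local minimum}, \qquad f''(x^*) < 0 \;\Rightarrow\; \text{local maximum} \]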
What does it mean for a problem to be multimodal?
The problem has multiple local minima, so a local solution is not necessarily the global one.
What are the general steps to find the minimum (or maximum) of a function with multiple variables?
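The analogous sketch for several variables (again assuming a twice-differentiable f; this is the standard textbook version, not necessarily the exact wording from class): solve the stationarity condition, then check the Hessian at each stationary point.

\[ \nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \succ 0 \;\Rightarrow\; x^* \text{ is a local minimizer} \]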
When is x* a global minimizer?
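For reference, the standard definition, sketched in generic notation:

\[ f(x^*) \le f(x) \quad \text{for all } x \in \mathbb{R}^n \]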
When is x* a local minimizer?
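The standard definition, sketched in generic notation (the neighborhood N(x*) is the usual way of saying "for all x near x*"):

\[ f(x^*) \le f(x) \quad \text{for all } x \in \mathcal{N}(x^*) \]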
How can you put the problem into standard form if you wish to maximize a function?
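A sketch of the usual conversion (assuming, as in most texts, that the standard form is a minimization): negate the objective, then negate the optimal value of the new problem to recover the maximum.

\[ \max_x f(x) \quad \Longleftrightarrow \quad \min_x \; -f(x) \]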
When is x* a (weak) local solution?
When is x* a strong local solution?
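For reference, a sketched pair of definitions in generic notation covering this card and the previous one (the strict inequality is what makes the solution strong):

\[ \text{weak: } f(x^*) \le f(x) \text{ for all } x \in \mathcal{N}(x^*), \qquad \text{strong: } f(x^*) < f(x) \text{ for all } x \in \mathcal{N}(x^*),\; x \ne x^* \]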
For any two points x and y, a convex function satisfies…[what is the inequality?]
(Hint: this is Dr. Kennedy’s notation)
For any two points x1 and x2, a convex function satisfies…[what is the inequality?]
(Hint: this is Dr. German’s notation)
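Both of the previous two cards ask for the same inequality; here is a sketch in generic notation (I am not reproducing either professor's exact symbols, so treat the variable names as placeholders): for any two points x_1 and x_2 and any \alpha \in [0, 1],

\[ f\big(\alpha x_1 + (1 - \alpha) x_2\big) \le \alpha f(x_1) + (1 - \alpha) f(x_2) \]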
There are two equivalent conditions to identify convex functions. What are they?
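A sketch of the two conditions most often quoted for differentiable functions (an assumption about which pair the card intends): the first-order condition says the function lies above all of its tangent planes, and the second-order condition says the Hessian is positive semidefinite everywhere.

\[ f(y) \ge f(x) + \nabla f(x)^T (y - x) \quad \text{for all } x, y, \qquad \nabla^2 f(x) \succeq 0 \quad \text{for all } x \]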
For a convex function, _____ optimality implies ______ optimality.
_____ _____ optimality implies _____ _____ optimality.
In Dr. Kennedy’s class, we put the function f(x) into a standard form. What is this form?
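As a hedged placeholder (the class's exact convention may add or omit pieces), the usual unconstrained standard form is a minimization over all of R^n:

\[ \min_{x \in \mathbb{R}^n} f(x) \]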
Let f(x) be a convex function. If x* is a local solution to the unconstrained minimization problem, then x* is a _____ ______ to the unconstrained minimization problem.
What is an Optimization Algorithm?
What are direct search methods?
What are line search methods?
What are trust region methods?
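As a concrete illustration of the line-search idea from the card above (a minimal sketch, not any specific algorithm from class): pick a descent direction, then backtrack on the step length until a sufficient-decrease (Armijo) condition holds. The function names, the test objective, and all parameter values below are illustrative choices, not course material.

import numpy as np

def backtracking_line_search(f, grad, x, direction, alpha0=1.0, rho=0.5, c=1e-4):
    """Shrink the step length until the Armijo sufficient-decrease condition holds."""
    alpha = alpha0
    fx = f(x)
    slope = grad(x) @ direction          # directional derivative along the search direction
    while f(x + alpha * direction) > fx + c * alpha * slope:
        alpha *= rho                     # backtrack: try a smaller step
    return alpha

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=500):
    """Line-search method: direction = -gradient, step length from backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop when the gradient is (approximately) zero
            break
        d = -g                           # steepest-descent direction
        alpha = backtracking_line_search(f, grad, x, d)
        x = x + alpha * d
    return x

# Illustrative use on a simple convex quadratic (hypothetical example)
if __name__ == "__main__":
    f = lambda x: x[0] ** 2 + 4.0 * x[1] ** 2
    grad = lambda x: np.array([2.0 * x[0], 8.0 * x[1]])
    print(steepest_descent(f, grad, [3.0, -2.0]))   # should approach the minimizer [0, 0]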