Unconstrained Optimization Flashcards
What is optimization?

Write out the general equation for the gradient of a function f.

Write the general equation for the Hessian of a function f.

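For reference, in standard vector-calculus notation (which may differ from the course notation), for f: R^n -> R:

\nabla f(x) = \left[ \frac{\partial f}{\partial x_1}, \; \frac{\partial f}{\partial x_2}, \; \ldots, \; \frac{\partial f}{\partial x_n} \right]^T,
\qquad
\nabla^2 f(x) = H(x), \quad H_{ij} = \frac{\partial^2 f}{\partial x_i \, \partial x_j}
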
What are the general steps to find the minimum (or maximum) of a function of one variable?

What does it mean for a problem to be multimodal?
The problem has multiple (local) minima.
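A simple illustrative example (not from the course notes): f(x) = x^4 - 2x^2 is multimodal, with local minimizers at x = -1 and x = +1.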
What are the general steps to find the minimum (or maximum) of a function with multiple variables?

When is x* a global minimizer?

When is x* a local minimizer?

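For reference, the standard definitions in generic notation (the course notation may differ):
x^* is a global minimizer if f(x^*) \le f(x) for all x;
x^* is a local minimizer if there is a neighborhood \mathcal{N} of x^* such that f(x^*) \le f(x) for all x \in \mathcal{N}.
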
How can you put the problem (function) in standard form if you wish to maximize a function?

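A standard identity (the course's exact standard form may differ): maximizing f(x) is equivalent to minimizing -f(x), i.e. \max_x f(x) = -\min_x \bigl(-f(x)\bigr).
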
When is x* a (weak) local solution?

When is x* a strong local solution?

For any two points x and y, a convex function satisfies…[what is the inequality?]
(Hint: this is Dr. Kennedy’s notation)

For any two points x1 and x2, a convex function satisfies…[what is the inequality?]
(Hint: this is Dr. German’s notation)

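For reference, in generic notation (not necessarily either instructor's), f is convex if for any two points x_1 and x_2 and any \lambda \in [0, 1]:

f\bigl(\lambda x_1 + (1-\lambda) x_2\bigr) \le \lambda f(x_1) + (1-\lambda) f(x_2)
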
There are two equivalent conditions to identify convex functions. What are they?

For a convex function, _____ optimality implies ______ optimality.

_____ _____ optimality implies _____ _____ optimality.

In Dr. Kennedy’s class, we put the function f(x) into a standard form. What is this form?

Let f(x) be a convex function. If x* is a local solution to the unconstrained minimization problem, then x* is a _____ ______ to the unconstrained minimization problem.

What is an Optimization Algorithm?

What are direct search methods?

What are line search methods?

What are trust region methods?

Give the equation for a line search according to Dr. German, and explain each of the terms.

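A generic form of the line search update (assumed notation, not necessarily Dr. German's):

x_{k+1} = x_k + \alpha_k p_k

where x_k is the current iterate, p_k is the search direction, and \alpha_k > 0 is the step length chosen along that direction.
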
Give the steps for performing a line search, according to Dr. Kennedy.

Give the steps for performing a line search, according to Dr. German.

List four methods to find a search direction in a line search.
Steepest Descent
Conjugate Gradient (Conjugate Directions)
Newton’s Method (Newton Search Directions)
Quasi-Newton Method
Give an example of a zeroth-order method

Give an example of a first-order method.

Give an example of a second-order method.

Give the equation for a descent direction (either Kennedy’s or German’s notation).

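In generic notation (assumed, not necessarily either instructor's), p_k is a descent direction at x_k if

\nabla f(x_k)^T p_k < 0,

i.e. the directional derivative of f along p_k is negative.
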
Give an example of a method used to find the step size in a line search.
Polynomial Approximations
Golden Section Method
What is the equation for the first Wolfe Condition (Armijo rule, sufficient decrease)?
(either Dr. Kennedy’s or Dr. German’s notation)

What is the equation for the first Strong Wolfe Condition (Armijo rule, sufficient decrease)?
(either Dr. Kennedy’s or Dr. German’s notation)

What is the equation for the second Wolfe Condition (curvature condition)?
(either Dr. Kennedy’s or Dr. German’s notation)

What is the equation for the second Strong Wolfe Condition (curvature condition)?
(either Dr. Kennedy’s or Dr. German’s notation)

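For reference, the Wolfe conditions in a common textbook notation (an assumption about notation, not necessarily the course's). With iterate x_k, descent direction p_k, step length \alpha, and constants 0 < c_1 < c_2 < 1:

Sufficient decrease (Armijo): f(x_k + \alpha p_k) \le f(x_k) + c_1 \alpha \nabla f(x_k)^T p_k
Curvature: \nabla f(x_k + \alpha p_k)^T p_k \ge c_2 \nabla f(x_k)^T p_k
Strong curvature: \lvert \nabla f(x_k + \alpha p_k)^T p_k \rvert \le c_2 \lvert \nabla f(x_k)^T p_k \rvert

The strong Wolfe conditions keep the sufficient-decrease condition unchanged and replace the curvature condition with the absolute-value form.
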
What is the problem with the first Wolfe Condition (if we only enforce this condition)?

How does the curvature condition help prevent step sizes that are too small (Wolfe Conditions)?
Hint: what does it mean for the slope?

Give five examples of direct search algorithms.
Grid Search
Random Walk
Random Search
Compass Search
Coordinate Pattern Search
What is the problem with the second Wolfe Condition?
The second Wolfe Condition can be satisfied by arbitrarily large step sizes
(the opposite problem of the first Wolfe Condition, which can be satisfied by arbitrarily small step sizes).
What is backtracking?
Backtracking is the process of decreasing the step size by a constant percentage.
What does backtracking do for us in terms of the Wolfe Conditions?
It can be shown that it is only necessary to check the first Wolfe Condition (Armijo rule, sufficient decrease) if we begin a line search with a “large” step size and then successively decrease it by a constant percentage, i.e., if we backtrack.
We do not need to check the second Wolfe Condition (curvature condition).
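A minimal Python sketch of backtracking against the Armijo (sufficient decrease) condition, assuming NumPy arrays and illustrative parameter names (f, grad_f, alpha0, rho, c1 are assumptions, not taken from either instructor's notes):

def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c1=1e-4):
    # Shrink the step by a constant factor rho until the Armijo
    # (sufficient decrease) condition is satisfied.
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p      # directional derivative; negative for a descent direction
    while f(x + alpha * p) > fx + c1 * alpha * slope:
        alpha *= rho           # backtrack: cut the step size by a constant percentage
    return alpha
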
Graphically show the Wolfe Conditions (first, second, and strong) on an arbitrary function.

Write out a generic algorithm to perform backtracking on a line search.

What are two disadvantages of using backtracking in a line search?

What is the Golden Section Method (or Golden Section Search)?

What are the steps in the Golden Section Method?

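A minimal sketch of the Golden Section Method in Python for a unimodal function phi(alpha) bracketed on [a, b] (function and variable names are illustrative assumptions):

def golden_section(phi, a, b, tol=1e-6):
    # Shrink the bracket [a, b] by the golden ratio each iteration,
    # reusing one interior function evaluation per step.
    invphi = (5 ** 0.5 - 1) / 2              # ~0.618
    x1 = b - invphi * (b - a)
    x2 = a + invphi * (b - a)
    f1, f2 = phi(x1), phi(x2)
    while (b - a) > tol:
        if f1 < f2:                          # minimizer lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - invphi * (b - a)
            f1 = phi(x1)
        else:                                # minimizer lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + invphi * (b - a)
            f2 = phi(x2)
    return (a + b) / 2                       # estimated minimizer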

What do we use polynomial approximations for?
To estimate the step size in a line search.
What is the main concern regarding the use of steepest descent?
Steepest descent can perform very poorly, exhibiting very slow convergence.
We end up searching in essentially the same direction over many iterations.
Write out a generic algorithm to perform a line search
(Kennedy’s notation)

Write out a generic algorithm to perform steepest descent.
(Kennedy’s notation)

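A generic steepest-descent sketch in Python (a standard formulation, not specifically Kennedy's notation; it reuses the hypothetical backtracking_line_search helper sketched earlier):

import numpy as np

def steepest_descent(f, grad_f, x0, tol=1e-6, max_iter=1000):
    # Step along the negative gradient, choosing the step length with
    # a backtracking line search, until the gradient is (nearly) zero.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:          # convergence test
            break
        p = -g                               # steepest-descent direction
        alpha = backtracking_line_search(f, grad_f, x, p)
        x = x + alpha * p
    return x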

Search directions are always _____ whenever an exact line search is performed.
Search directions are always perpendicular whenever an exact line search is performed.
For an inexact line search, search directions are only _____ _____.
For an inexact line search, search directions are only approximately perpendicular.
What is the main idea behind a polynomial approximation? What types of polynomials are the most common?

If the Hessian is positive definite, give the equation for the Newton Search Direction
(Kennedy’s or German’s notation)

When the Hessian is positive definite, the Newton Direction is a _______ ______.
When the Hessian is positive definite, the Newton Direction is a descent direction.
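In generic notation (assumed; the course notation may differ), the Newton direction solves \nabla^2 f(x_k) \, p_k = -\nabla f(x_k), i.e.

p_k = -\bigl[\nabla^2 f(x_k)\bigr]^{-1} \nabla f(x_k).

When \nabla^2 f(x_k) is positive definite, so is its inverse, hence \nabla f(x_k)^T p_k = -\nabla f(x_k)^T \bigl[\nabla^2 f(x_k)\bigr]^{-1} \nabla f(x_k) < 0 (for \nabla f(x_k) \ne 0), which is exactly the descent-direction condition.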
Concerning Newton Search Directions, name two problems that can arise if the Hessian is NOT positive definite.

Newton’s Method displays _____ convergence.
Newton’s Method displays quadratic convergence.
What are Quasi-Newton Methods? Write an equation for Quasi-Newton Methods.

What is the BFGS Method?

Write the equation for a Newton Direction.
Write the equation for a Quasi-Newton Direction.
(German’s notation)

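For reference, a common generic form (not necessarily German's notation): a quasi-Newton method replaces the exact Hessian with an approximation B_k built from gradient differences, giving the search direction

p_k = -B_k^{-1} \nabla f(x_k).

The BFGS update, with s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k), is

B_{k+1} = B_k - \frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k} + \frac{y_k y_k^T}{y_k^T s_k}.
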
What does it mean for a set of vectors to be conjugate with respect to a matrix?

How many steps will it take for the conjugate gradient (conjugate directions) method to converge?

Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to ____.
Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to the identity matrix.
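For reference (generic notation): directions p_i and p_j are conjugate with respect to a symmetric positive definite matrix A if

p_i^T A \, p_j = 0 \quad \text{for } i \ne j.

For a quadratic objective in n variables, conjugate directions with exact line searches converge in at most n steps.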
What is the condition number?
What is the condition number if A is symmetric positive definite?

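For reference (standard definition): \kappa(A) = \lVert A \rVert \, \lVert A^{-1} \rVert; when A is symmetric positive definite and the 2-norm is used, \kappa(A) = \lambda_{\max}(A) / \lambda_{\min}(A), the ratio of the largest to smallest eigenvalue.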