Unconstrained Optimization Flashcards

1
Q

What is optimization?

A

Optimization is the process of finding the values of the design variables that minimize (or maximize) an objective function, possibly subject to constraints.

2
Q

Write out the general equation for the gradient of a function f.

A

∇f(x) = [∂f/∂x_1, ∂f/∂x_2, …, ∂f/∂x_n]^T, the vector of first partial derivatives of f.

3
Q

Write the general equation for the Hessian of a function f

A

H(x) = ∇²f(x), the n×n matrix of second partial derivatives: [H(x)]_ij = ∂²f/∂x_i∂x_j. It is symmetric when the second derivatives are continuous.

4
Q

What are the general steps to find the minimum (or maximum) of a function of one variable?

A

1. Take the first derivative f'(x) and set it equal to zero.
2. Solve for the critical points x*.
3. Check the second derivative: f''(x*) > 0 gives a minimum, f''(x*) < 0 gives a maximum.

5
Q

What does it mean for a problem to be multimodal?

A

The problem has multiple local minima, so a search can converge to different solutions depending on the starting point.

6
Q

What are the general steps to find the minimum (or maximum) of a function with multiple variables?

A

1. Set the gradient ∇f(x) = 0 and solve for the critical points x*.
2. Evaluate the Hessian at each critical point.
3. If the Hessian is positive definite, x* is a minimum; if negative definite, a maximum; if indefinite, a saddle point.

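A minimal SymPy sketch of these steps, using an assumed illustrative objective f(x, y) = x^2 + xy + y^2 (the function is chosen here for demonstration, not taken from the course):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + x*y + y**2                      # illustrative objective (assumed)

grad = [sp.diff(f, v) for v in (x, y)]     # step 1: form the gradient
crit = sp.solve(grad, (x, y), dict=True)   # step 1: solve grad f = 0
H = sp.hessian(f, (x, y))                  # step 2: form the Hessian

for pt in crit:
    eigs = list(H.subs(pt).eigenvals())    # step 3: positive definite <=> all eigenvalues > 0
    print(pt, eigs)                        # {x: 0, y: 0}, eigenvalues 1 and 3 -> minimum
```
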
7
Q

When is x* a global minimizer?

A

x* is a global minimizer if f(x*) ≤ f(x) for all x in the domain of f.

8
Q

When is x* a local minimizer?

A

x* is a local minimizer if there is a neighborhood N of x* such that f(x*) ≤ f(x) for all x in N.

9
Q

How can you put the problem (function) in standard form if you wish to maximize a function?

A

Minimize the negative of the function: maximizing f(x) is equivalent to minimizing -f(x).

10
Q

When is x* a (weak) local solution?

A

x* is a (weak) local solution if f(x*) ≤ f(x) for all x in a neighborhood of x*; other nearby points may tie its value.

11
Q

When is x* a strong local solution?

A

x* is a strong (strict) local solution if f(x*) < f(x) for all x ≠ x* in a neighborhood of x*.

12
Q

For any two points x and y, a convex function satisfies…[what is the inequality?]

(Hint: this is Dr Kennedy’s notation)

A

f(αx + (1 - α)y) ≤ αf(x) + (1 - α)f(y) for all α ∈ [0, 1]. (Standard form, written with the x and y of the hint; the course symbols may differ slightly.)

13
Q

For any two points x1 and x2, a convex function satisfies…[what is the inequality?]

(Hint: this is Dr German’s notation)

A

f(λx_1 + (1 - λ)x_2) ≤ λf(x_1) + (1 - λ)f(x_2) for all λ ∈ [0, 1]. (Standard form, written with the x_1 and x_2 of the hint.)

14
Q

There are two equivalent conditions to identify convex functions. What are they?

A

For a twice-differentiable function f:
1. First-order condition: f(y) ≥ f(x) + ∇f(x)^T (y - x) for all x, y (the function lies above its tangent planes).
2. Second-order condition: the Hessian ∇²f(x) is positive semidefinite everywhere.

15
Q

For a convex function, _____ optimality implies ______ optimality.

A

For a convex function, local optimality implies global optimality.

16
Q

_____ _____ optimality implies _____ _____ optimality.

A

Strong local optimality implies weak local optimality.

20
Q

In Dr. Kennedy’s class, we put the function f(x) into a standard form. What is this form?

A

minimize f(x) over x ∈ R^n; the problem is always posed as a minimization (a maximization becomes minimization of -f). (Standard form; Dr. Kennedy's exact symbols may differ.)

21
Q

Let f(x) be a convex function. If x* is a local solution to the unconstrained minimization problem, then x* is a _____ ______ to the unconstrained minimization problem.

A

global solution: for convex f, any local solution to the unconstrained minimization problem is a global solution.

22
Q

What is an Optimization Algorithm?

A

An iterative procedure that, starting from an initial guess x_0, generates a sequence of iterates x_1, x_2, … intended to converge to a solution x*, using information about the objective (function values, gradients, and so on) gathered at each iterate.

29
Q

What are direct search methods?

A

Zeroth-order methods that use only objective function values (no gradient or Hessian information) to explore the design space.

30
Q

What are line search methods?

A

Methods that, at each iterate x_k, choose a search direction p_k and then a step size α_k along it, updating x_{k+1} = x_k + α_k p_k.

31
Q

What are trust region methods?

A

Methods that build a local (typically quadratic) model of the objective and minimize it within a region around x_k where the model is trusted; the region grows or shrinks depending on how well the model predicts the actual decrease.

33
Q

Give the equation for a line search according to Dr. German, and explain each of the terms.

A

x_{k+1} = x_k + α_k p_k, where x_k is the current iterate, p_k is the search direction, and α_k > 0 is the step size (step length). (Standard notation; the course symbols may differ slightly.)

34
Q

Give the steps to performing a line search, according to Dr. Kennedy.

A

In outline (standard presentation; course details may differ):
1. At the current iterate x_k, choose a descent direction p_k.
2. Find a step size α_k along p_k that sufficiently decreases f (e.g., satisfies the Wolfe conditions).
3. Update x_{k+1} = x_k + α_k p_k.
4. Check convergence; if not converged, repeat.

35
Q

Give the steps to performing a line search, according to Dr. German.

A

39
Q

List three methods to find a search direction in a line search.

A

Any three of:
Steepest descent
Conjugate gradient (conjugate directions)
Newton's method (Newton search directions)
Quasi-Newton methods

40
Q

Give an example of a zeroth-order method.

A

The Golden Section Method (or any direct search method, such as grid search or compass search): it uses only function values, no derivatives.

41
Q

Give an example of a first-order method.

A

Steepest descent: it uses first derivatives (the gradient) but not the Hessian.

42
Q

Give an example of a second-order method.

A

Newton's method: it uses second derivatives (the Hessian).

43
Q

Give the equation of a descent direction (either Kennedy's or German's notation).

A

p_k is a descent direction at x_k if ∇f(x_k)^T p_k < 0, i.e., moving along p_k initially decreases f.

44
Q

Give an example of a method used to find the step size in a line search.

A

Polynomial approximations
Golden Section Method

45
Q

What is the equation for the first Wolfe Condition (Armijo rule, sufficient decrease)? (either Dr Kennedy's or Dr German's notation)

A

f(x_k + α p_k) ≤ f(x_k) + c_1 α ∇f(x_k)^T p_k, with 0 < c_1 < 1 (a common choice is c_1 = 10^-4).

46
Q

What is the equation for the first Strong Wolfe Condition (Armijo rule, sufficient decrease)? (either Dr Kennedy's or Dr German's notation)

A

The sufficient-decrease condition is unchanged in the Strong Wolfe conditions:
f(x_k + α p_k) ≤ f(x_k) + c_1 α ∇f(x_k)^T p_k.

47
Q

What is the equation for the second Wolfe Condition (curvature condition)? (either Dr Kennedy's or Dr German's notation)

A

∇f(x_k + α p_k)^T p_k ≥ c_2 ∇f(x_k)^T p_k, with 0 < c_1 < c_2 < 1.

48
Q

What is the equation for the second Strong Wolfe Condition (curvature condition)? (either Dr Kennedy's or Dr German's notation)

A

|∇f(x_k + α p_k)^T p_k| ≤ c_2 |∇f(x_k)^T p_k|.

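The four conditions above translate directly into a short numerical check. A sketch in standard notation (the defaults c1 = 1e-4 and c2 = 0.9 are common textbook choices, not course-prescribed):

```python
import numpy as np

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    """Evaluate the Wolfe conditions at step size alpha along direction p."""
    phi0 = f(x)
    dphi0 = grad(x) @ p                 # slope along p at alpha = 0 (negative for descent)
    x_new = x + alpha * p
    dphi = grad(x_new) @ p              # slope along p at alpha

    sufficient_decrease = f(x_new) <= phi0 + c1 * alpha * dphi0   # cards 45/46
    curvature = dphi >= c2 * dphi0                                # card 47
    strong_curvature = abs(dphi) <= c2 * abs(dphi0)               # card 48
    return sufficient_decrease, curvature, strong_curvature
```
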
49
Q

What is the problem with the first Wolfe Condition (if we only enforce this condition)?

A

It is satisfied by arbitrarily small step sizes, so enforcing it alone allows steps so short that the iterates make negligible progress.

50
Q

How does the curvature condition help prevent step sizes that are too small (Wolfe Conditions)?

(Hint: what does it mean for the slope?)

A

It requires the slope of f along p_k at the new point to be no more negative than c_2 times the initial slope. For very small steps the slope is still nearly as steep as at α = 0, so the condition fails; an acceptable step must move far enough that the slope has flattened out (or turned positive).

51
Q

Give three examples of direct search algorithms.

A

Any three of:
Grid search
Random walk
Random search
Compass search
Coordinate pattern search

52
Q

What is the problem with the second Wolfe Condition?

A

The second Wolfe Condition is satisfied by arbitrarily large steps (the opposite problem of the first Wolfe Condition).

53
Q

What is backtracking?

A

Backtracking is the process of decreasing the step size by a constant percentage.

54
Q

What does backtracking do for us in terms of the Wolfe Conditions?

A

It can be shown that if we begin a line search with a "large" step size and then successively decrease it by a constant percentage (i.e., backtrack), we only need to check the first Wolfe Condition (Armijo rule, sufficient decrease). The second Wolfe Condition (curvature condition) need not be checked: the accepted step is at most a constant factor smaller than one that was just rejected, so it cannot be arbitrarily small.

55
Q

Graphically show the Wolfe Conditions (first, second, and strong) on an arbitrary function.

A

56
Q

Write out a generic algorithm to perform backtracking on a line search.

A

1. Choose an initial ("large") step size α = α_0, a reduction factor ρ ∈ (0, 1), and c_1 ∈ (0, 1).
2. While f(x_k + α p_k) > f(x_k) + c_1 α ∇f(x_k)^T p_k, set α ← ρα.
3. Return α_k = α.

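A runnable sketch of this loop (standard backtracking; the defaults alpha0 = 1, rho = 0.5, and c1 = 1e-4 are common choices, not course-prescribed):

```python
import numpy as np

def backtracking_line_search(f, grad, x, p, alpha0=1.0, rho=0.5, c1=1e-4, max_iter=50):
    """Shrink alpha by a constant factor until the Armijo condition holds."""
    phi0 = f(x)
    dphi0 = grad(x) @ p        # must be negative: p is a descent direction
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * p) <= phi0 + c1 * alpha * dphi0:
            return alpha       # sufficient decrease satisfied
        alpha *= rho           # backtrack: cut the step by a constant percentage
    return alpha               # give up after max_iter reductions
```
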
57
Q

What are two disadvantages of using backtracking in a line search?

A

1. The accepted step depends on the somewhat arbitrary choices of the initial step α_0 and the reduction factor ρ.
2. The step size can only shrink, never grow, so a poor α_0 can cost many function evaluations or yield a shorter step than necessary.

58
Q

What is the Golden Section Method (or Golden Section Search)?

A

A zeroth-order (derivative-free) line search that brackets the minimizer of a unimodal function and shrinks the bracket by a constant factor each iteration, placing interior points using the golden ratio so that one function evaluation can be reused at the next iteration.

59
Q

What are the steps in the Golden Section Method?

A

1. Start with a bracket [a, b] known to contain the minimizer.
2. Place two interior points using the golden ratio τ ≈ 0.618: x_1 = b - τ(b - a) and x_2 = a + τ(b - a), and evaluate f at both.
3. If f(x_1) < f(x_2), the minimizer lies in [a, x_2]; otherwise it lies in [x_1, b]. Shrink the bracket, reuse the interior point that remains inside it, and evaluate f at one new point.
4. Repeat until the bracket is smaller than the desired tolerance.

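Those steps as a runnable sketch (standard algorithm; the tolerance and the test function are assumed for illustration):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Shrink a bracket [a, b] around the minimizer of a unimodal f."""
    tau = (math.sqrt(5) - 1) / 2              # golden ratio conjugate, ~0.618
    x1, x2 = b - tau * (b - a), a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                           # minimizer in [a, x2]
            b = x2
            x2, f2 = x1, f1                   # reuse the surviving interior point
            x1 = b - tau * (b - a)
            f1 = f(x1)
        else:                                 # minimizer in [x1, b]
            a = x1
            x1, f1 = x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# e.g. golden_section(lambda t: (t - 2.0)**2, 0.0, 5.0) returns ~2.0
```
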
62
Q

What do we use polynomial approximations for?

A

To estimate the step size in a line search.

63
Q

What is the main concern regarding the use of steepest descent?

A

Steepest descent can converge very slowly: on poorly conditioned problems the iterates zigzag, searching in essentially the same direction at multiple iterations.

64
Q

Write out a generic algorithm to perform a line search. (Kennedy's notation)

A

In outline (standard algorithm; exact symbols may differ from Dr. Kennedy's):
1. Choose a starting point x_0; set k = 0.
2. Compute a descent direction p_k at x_k.
3. Find a step size α_k (e.g., by backtracking until sufficient decrease holds).
4. Update x_{k+1} = x_k + α_k p_k.
5. If ||∇f(x_{k+1})|| is below tolerance, stop; otherwise set k ← k + 1 and go to step 2.

65
Q

Write out a generic algorithm to perform steepest descent. (Kennedy's notation)

A

The generic line search algorithm above with the search direction fixed to the negative gradient, p_k = -∇f(x_k); see the sketch below.

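A sketch combining both cards, reusing the backtracking_line_search function from card 56's sketch (the quadratic test function is assumed for illustration):

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Line search algorithm with p_k fixed to the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:    # first-order optimality reached
            break
        p = -g                         # steepest descent direction
        alpha = backtracking_line_search(f, grad, x, p)
        x = x + alpha * p
    return x

# e.g. an elongated quadratic bowl f(x) = x_0^2 + 10 x_1^2:
# steepest_descent(lambda x: x[0]**2 + 10 * x[1]**2,
#                  lambda x: np.array([2 * x[0], 20 * x[1]]),
#                  [2.0, 1.0])   # converges to ~[0, 0], slowly (zigzagging)
```
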
67
Q

Search directions are always _____ whenever an exact line search is performed.

A

Search directions are always perpendicular whenever an exact line search is performed: the exact step zeroes the directional derivative, making the new gradient orthogonal to the previous search direction.

68
Q

For an inexact line search, search directions are only _____ _____.

A

For an inexact line search, search directions are only approximately perpendicular.

69
Q

What is the main idea behind a polynomial approximation? What types of polynomials are the most common?

A

Fit a low-order polynomial to known values of f (and possibly its slope) along the search direction, then take the step size that minimizes the polynomial. Quadratic and cubic polynomials are the most common.

70
Q

If the Hessian is positive definite, give the equation for the Newton Search Direction. (Kennedy's or German's notation)

A

p_k = -[∇²f(x_k)]^-1 ∇f(x_k), usually computed by solving the linear system ∇²f(x_k) p_k = -∇f(x_k).

71
Q

When the Hessian is positive definite, the Newton Direction is a _____ _____.

A

When the Hessian is positive definite, the Newton Direction is a descent direction.

72
Q

Concerning Newton Search Directions, name two problems that can arise if the Hessian is NOT positive definite.

A

1. The Newton system may be singular (or badly conditioned), so the direction is undefined or enormous.
2. Even when it exists, the Newton direction may not be a descent direction; it can point toward a saddle point or a maximum.

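A minimal Newton's method sketch that guards against a non-positive-definite Hessian; falling back to the steepest descent direction is one common remedy among several (e.g., Hessian modification), assumed here for illustration:

```python
import numpy as np

def newton_method(grad, hess, x0, tol=1e-8, max_iter=50):
    """Newton iteration: solve H p = -g, with a fallback when H is not PD."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        try:
            np.linalg.cholesky(H)          # succeeds only if H is positive definite
            p = np.linalg.solve(H, -g)     # Newton direction (card 70)
        except np.linalg.LinAlgError:
            p = -g                         # fallback: steepest descent direction
        x = x + p                          # full Newton step; no line search shown
    return x
```
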
73
Q

Newton's Method displays _____ convergence.

A

Newton's Method displays quadratic convergence (locally, near a solution with positive definite Hessian).

74
Q

What are Quasi-Newton Methods? Write an equation for Quasi-Newton Methods.

A

Methods that replace the exact Hessian with an approximation B_k built from gradient differences and updated every iteration. The search direction is p_k = -B_k^-1 ∇f(x_k), where B_k ≈ ∇²f(x_k).

75
Q

What is the BFGS Method?

A

The Broyden-Fletcher-Goldfarb-Shanno method, the most widely used quasi-Newton method. It updates the (inverse) Hessian approximation each iteration from s_k = x_{k+1} - x_k and y_k = ∇f(x_{k+1}) - ∇f(x_k), and keeps the approximation symmetric positive definite as long as y_k^T s_k > 0.

76
Q

Write the equation for a Newton Direction. Write the equation for a Quasi-Newton Direction. (German's notation)

A

Newton: p_k = -[∇²f(x_k)]^-1 ∇f(x_k). Quasi-Newton: p_k = -B_k^-1 ∇f(x_k), with B_k an approximation to the Hessian. (Standard symbols; Dr. German's may differ.)

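A sketch of the textbook BFGS update of the inverse-Hessian approximation (standard formula, e.g. Nocedal and Wright; not tied to either professor's notation):

```python
import numpy as np

def bfgs_update(Hinv, s, y):
    """Update the inverse-Hessian approximation given s = x_{k+1} - x_k and
    y = grad f(x_{k+1}) - grad f(x_k); requires y @ s > 0 to preserve
    positive definiteness."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ Hinv @ V.T + rho * np.outer(s, s)

# the quasi-Newton search direction is then p = -Hinv_new @ g,
# mirroring p_k = -B_k^{-1} grad f(x_k) with Hinv approximating B_k^{-1}
```
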
77
Q

What does it mean for a set of vectors to be conjugate with respect to a matrix?

A

Nonzero vectors p_0, …, p_{n-1} are conjugate with respect to a symmetric positive definite matrix A if p_i^T A p_j = 0 for all i ≠ j.

78
Q

How many steps will it take for the conjugate method to converge?

A

At most n steps for an n-dimensional convex quadratic objective (with exact line searches and exact arithmetic).

80
Q

Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to _____.

A

Mutually perpendicular directions (such as the coordinate directions) are conjugate with respect to the identity matrix.

81
Q

What is the condition number? What is the condition number if A is symmetric positive definite?

A

κ(A) = ||A|| · ||A^-1||. If A is symmetric positive definite, κ(A) = λ_max/λ_min, the ratio of its largest to smallest eigenvalue.

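A quick numerical check of the symmetric positive definite case (the matrix here is an assumed example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # symmetric positive definite (assumed example)

eigs = np.linalg.eigvalsh(A)      # eigenvalues of a symmetric matrix
print(eigs.max() / eigs.min())    # lambda_max / lambda_min
print(np.linalg.cond(A, 2))       # 2-norm condition number; equal for SPD A
```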