Background 2 Flashcards

Algorithms

1
Q

Describe line search algorithms:

A
In each iteration, choose a direction p_k and then search along this direction from the current iterate x_k for a new iterate with a lower function value, by approximately solving the one-dimensional problem

min_{α>0} f(x_k + αp_k)
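A minimal sketch of this loop in Python (NumPy assumed; f, grad_f, and step_length are hypothetical placeholders, with the steepest descent direction standing in for the direction choice):

```python
import numpy as np

def line_search_method(f, grad_f, step_length, x0, tol=1e-8, max_iter=1000):
    """Generic line search iteration: choose a direction p_k, then a step α_k.

    step_length is any routine that approximately solves
    min_{α>0} f(x_k + αp_k), e.g. backtracking (card 12)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:           # gradient ≈ 0: stop
            break
        p = -g                                # direction choice (here: steepest descent)
        alpha = step_length(f, grad_f, x, p)  # approximate 1-D minimization
        x = x + alpha * p                     # x_{k+1} = x_k + α_k p_k
    return x
```

Any of the direction choices on the later cards (Newton, quasi-Newton, conjugate gradient) can replace p = -g without changing the surrounding loop.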
2
Q

Describe trust region methods:

A

Construct a model function m_k whose behaviour near the current iterate x_k is similar to that of the objective f.

The search for a minimizer of m_k is restricted to some region around x_k (the trust region).

The candidate step is an approximate minimizer of the model in this region.
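A sketch of one trust-region iteration in Python; f, m, and solve_subproblem are hypothetical placeholders, and the ratio thresholds (0.25, 0.75, η) are typical textbook choices, not from the card:

```python
import numpy as np

def trust_region_step(f, m, solve_subproblem, x, delta,
                      eta=0.15, delta_max=10.0):
    """One trust-region iteration: approximately minimize the model m
    inside the region, then accept/reject the step and resize the region
    according to the agreement ratio ρ between f and m."""
    p = solve_subproblem(m, x, delta)             # approx. min m(x, p), ‖p‖ ≤ Δ_k
    actual = f(x) - f(x + p)                      # actual reduction in f
    predicted = m(x, np.zeros_like(p)) - m(x, p)  # reduction predicted by the model
    rho = actual / predicted
    if rho < 0.25:                                # model fits poorly: shrink region
        delta *= 0.25
    elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
        delta = min(2.0 * delta, delta_max)       # model fits well: expand region
    x_new = x + p if rho > eta else x             # accept the step only if ρ > η
    return x_new, delta
```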

3
Q

What is the steepest descent direction?

What is the characteristic of this direction?

A

The steepest descent direction is the negative of the gradient: p_k = −∇f(x_k).

This direction is orthogonal to the contours (level sets) of the function.

4
Q

What is the steepest descent method?

What is its convergence rate?

A

A line search method that moves along the negative of the gradient, p_k = −∇f(x_k), at every step.

It converges at a linear rate; on ill-conditioned problems the convergence factor r is close to 1, so progress is very slow.
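A sketch illustrating both points on a quadratic f(x) = ½xᵀQx − bᵀx, where the exact step length has a closed form (NumPy assumed; the ill-conditioned Q is a made-up example):

```python
import numpy as np

def steepest_descent_quadratic(Q, b, x0, tol=1e-10, max_iter=100000):
    """Steepest descent on f(x) = 0.5 xᵀQx − bᵀx with the exact step length.

    For a quadratic, min_α f(x + αp) has the closed form α = gᵀg / (gᵀQg),
    which exposes the linear rate: the worse the conditioning of Q, the
    closer the contraction factor is to 1 (zigzagging iterates)."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = Q @ x - b                  # gradient ∇f(x_k) = Qx_k − b
        if np.linalg.norm(g) < tol:
            return x, k
        alpha = (g @ g) / (g @ Q @ g)  # exact one-dimensional minimizer
        x = x - alpha * g              # step along −∇f(x_k)
    return x, max_iter

# Ill-conditioned example: condition number 1e3 makes convergence slow.
Q = np.diag([1.0, 1e3])
b = np.zeros(2)
x_star, iters = steepest_descent_quadratic(Q, b, np.array([1.0, 1.0]))
```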

5
Q

Advantages and disadvantages of the steepest descent method?

A

Advantages:
- it only requires calculation of the gradient, not second derivatives

- the negative gradient direction is guaranteed to produce a decrease in f (for a sufficiently small step)

Disadvantages:
- the method may converge very slowly on ill-conditioned (poorly scaled) problems, zigzagging toward the solution

6
Q

What is the Newton direction?

A

It is derived from the second-order Taylor-series approximation of f around x_k.

Assuming the Hessian ∇²f_k is positive definite, the Newton direction is

p_k^N = −(∇²f_k)⁻¹ ∇f_k
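A minimal sketch of computing p_k^N in Python (NumPy assumed); solving the linear system via a Cholesky factorization both exploits and checks the positive-definiteness assumption:

```python
import numpy as np

def newton_direction(grad, hess):
    """Newton direction p_k^N = −(∇²f_k)⁻¹ ∇f_k, via ∇²f_k = L Lᵀ.

    np.linalg.cholesky raises LinAlgError unless hess is positive
    definite — exactly the assumption under which p_k^N is a descent
    direction. Solving two triangular systems avoids forming the inverse."""
    L = np.linalg.cholesky(hess)
    y = np.linalg.solve(L, -grad)   # forward substitution: L y = −∇f_k
    return np.linalg.solve(L.T, y)  # back substitution:    Lᵀ p = y
```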

7
Q

Advantages and disadvantages of the Newton method? What is the convergence rate?

A
Advantages:
- avoids slow convergence on ill-conditioned problems
- fast rate of local convergence

Disadvantages:
- the Hessian is expensive to compute

Convergence rate: quadratic (locally, near a solution).

8
Q

What is the quasi-Newton direction?

A

Use an approximation B_k of the Hessian instead of computing it exactly.

The approximation is updated using the knowledge about f gained from the gradients at each step; the direction is p_k = −B_k⁻¹∇f_k.
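The card does not name a specific update; the classic example is BFGS. A sketch of its inverse-Hessian update (NumPy assumed):

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation H_k.

    s = x_{k+1} − x_k and y = ∇f_{k+1} − ∇f_k encode the curvature
    information gained from the gradients, so no Hessian is ever formed.
    Requires the curvature condition yᵀs > 0."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)

# The quasi-Newton search direction is then p_k = -H_k @ grad_f(x_k).
```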

9
Q

Advantages of the quasi-Newton method?

And its convergence rate?

A
- does not require computation of the Hessian

- still attains a superlinear rate of convergence

10
Q

Conjugate gradient strategies vs. steepest descent?
Search direction?
Convergence rates?
Advantages?

A
- more effective than the steepest descent direction
- each new search direction depends only on data from the previous step
- does not attain the fast convergence rates of Newton or quasi-Newton methods
- advantage of not requiring storage of matrices (see the sketch below)
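A sketch of the Fletcher–Reeves direction update (one standard nonlinear CG variant, not named on the card), which shows the one-step memory and the absence of stored matrices:

```python
import numpy as np

def fletcher_reeves_direction(g_new, g_old, p_old):
    """Fletcher–Reeves conjugate gradient direction update.

    Only the previous gradient and direction are needed — no matrices
    are stored, which is the memory advantage mentioned above."""
    beta = (g_new @ g_new) / (g_old @ g_old)  # β_{k+1} = ‖∇f_{k+1}‖² / ‖∇f_k‖²
    return -g_new + beta * p_old              # p_{k+1} = −∇f_{k+1} + β_{k+1} p_k
```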
11
Q

Golden section search

A

A derivative-free method for one-dimensional minimization. It maintains an interval [a, b] that brackets a minimizer of a unimodal function and shrinks it by a constant factor (≈ 0.618, derived from the golden ratio) per iteration, comparing function values at two interior points placed so that one point can be reused in the next iteration.
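A minimal sketch in Python (assuming f is unimodal on [a, b]):

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Shrink a bracketing interval [a, b] around a minimizer of a
    unimodal f by the constant factor 1/φ ≈ 0.618 per iteration."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # ≈ 0.618
    c = b - inv_phi * (b - a)               # interior points, placed so that
    d = a + inv_phi * (b - a)               # one is reused after each shrink
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                         # minimizer lies in [a, d]
            b, d, fd = d, c, fc
            c = b - inv_phi * (b - a)
            fc = f(c)
        else:                               # minimizer lies in [c, b]
            a, c, fc = c, d, fd
            d = a + inv_phi * (b - a)
            fd = f(d)
    return 0.5 * (a + b)
```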

12
Q

Backtracking line search

A

Start with a fairly large initial step length (e.g. α = 1 for Newton-type methods) and repeatedly shrink it by a contraction factor ρ ∈ (0, 1) until the sufficient decrease (Armijo) condition holds:

f(x_k + αp_k) ≤ f(x_k) + cα∇f_kᵀp_k,  for some c ∈ (0, 1)
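A minimal sketch in Python; the constants ρ = 0.5 and c = 10⁻⁴ are typical textbook defaults, not from the card:

```python
def backtracking_line_search(f, grad_f, x, p, alpha0=1.0, rho=0.5, c=1e-4):
    """Armijo backtracking: shrink α until sufficient decrease holds.

    p must be a descent direction, so the directional derivative
    slope = ∇f_kᵀp_k is negative and the loop terminates."""
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ p  # directional derivative ∇f_kᵀp_k (< 0)
    while f(x + alpha * p) > fx + c * alpha * slope:
        alpha *= rho       # contract the step
    return alpha
```

This routine can serve as the step_length placeholder in the line search sketch on card 1.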

13
Q

Trust region linear model?

A

m_k(p) = f_k + ∇f_kᵀp, minimized subject to ‖p‖ ≤ Δ_k.

Its minimizer is the steepest descent step scaled to the trust region boundary: p_k = −Δ_k ∇f_k / ‖∇f_k‖.

14
Q

Trust region quadratic model?

A

m_k(p) = f_k + ∇f_kᵀp + ½pᵀB_kp, where B_k is the Hessian ∇²f_k or an approximation to it.

The step is chosen by approximately minimizing m_k subject to ‖p‖ ≤ Δ_k.
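A cheap approximate minimizer of this model inside the region is the Cauchy point, the minimizer of m_k along −∇f_k; a sketch (NumPy assumed):

```python
import numpy as np

def cauchy_point(g, B, delta):
    """Cauchy point: minimizer of the quadratic model m_k along the
    steepest descent direction −∇f_k, inside the region ‖p‖ ≤ Δ_k."""
    gnorm = np.linalg.norm(g)
    gBg = g @ B @ g
    if gBg <= 0:       # model not convex along −g: step to the boundary
        tau = 1.0
    else:              # unconstrained 1-D minimizer, capped at the boundary
        tau = min(gnorm**3 / (delta * gBg), 1.0)
    return -tau * (delta / gnorm) * g
```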
