Direct Search Flashcards

1
Q

What are direct search methods?

A
  • Direct search methods are algorithms that do not attempt to approximate gradients.
  • They are derivative-free optimization methods: only function values are used.
2
Q

Finite Differencing

A

Estimate the gradient from function values using finite differences, e.g. the forward difference (f(x + h·e_i) − f(x)) / h in each coordinate direction (see the sketch below).
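A minimal sketch of a forward-difference gradient estimate, assuming a smooth objective `f` that takes a NumPy array; the function name `fd_gradient` and the step `h` are illustrative choices, not a fixed recipe.

```python
# Minimal sketch (illustrative names): forward-difference gradient estimate.
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Approximate grad f(x) with forward differences (n extra f-evaluations)."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

# Example: grad of f(x) = x0^2 + 3*x1^2 at (1, 2) is approximately (2, 12).
print(fd_gradient(lambda v: v[0]**2 + 3 * v[1]**2, [1.0, 2.0]))
```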

3
Q

Disadvantages of finite differencing?

A
  • Step-size problem: too large a step gives truncation error, too small a step amplifies rounding error (see the small demo below).
  • Higher computational cost: extra function evaluations are needed for every gradient estimate.
  • Not accurate when the function values are noisy.
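A tiny demo of the step-size problem; the function sin and the sample values of h are just illustrative. The forward-difference error is large both when h is too big (truncation error) and when h approaches machine precision (cancellation/rounding error).

```python
# Illustrative demo of the step-size problem for f(x) = sin(x) at x = 1.
import numpy as np

f, x, exact = np.sin, 1.0, np.cos(1.0)
for h in (1e-1, 1e-8, 1e-15):
    fd = (f(x + h) - f(x)) / h          # forward-difference estimate of f'(x)
    print(f"h = {h:.0e}   |error| = {abs(fd - exact):.1e}")
```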
4
Q

What are model-based derivative-free methods?

A
  • Build an interpolation model of the objective from sampled function values and use a trust-region method to compute the step (a minimal sketch follows below).
  • Switch from a linear to a quadratic model once enough interpolation points have been evaluated and become available.
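An illustrative sketch of a single linear-model, trust-region-style step, assuming we are willing to spend n extra function evaluations to build the model; `linear_model_step` and its simple accept/reject rule are hypothetical simplifications, not a full model-based DFO algorithm.

```python
# Illustrative sketch of one linear-model, trust-region-style step
# (linear_model_step is a hypothetical name, not a standard routine).
import numpy as np

def linear_model_step(f, x, radius):
    """Interpolate f at x and x + radius*e_i to get a linear model
    m(s) = f(x) + g.s, then minimize m over the ball ||s|| <= radius."""
    n = x.size
    fx = f(x)
    g = np.array([(f(x + radius * np.eye(n)[i]) - fx) / radius for i in range(n)])
    if np.linalg.norm(g) == 0.0:
        return x, fx
    s = -radius * g / np.linalg.norm(g)   # minimizer of the linear model on the ball
    f_new = f(x + s)
    # Accept the step only if the true objective actually decreased.
    return (x + s, f_new) if f_new < fx else (x, fx)

x, fval = linear_model_step(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                            np.array([0.0, 0.0]), radius=0.5)
print(x, fval)
```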

5
Q

What is coordinate search?

A
  • Cycles through the n coordinate directions e_1, ..., e_n.
  • Obtains each new iterate by performing a line search along the current coordinate direction in turn (see the sketch below).
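A minimal sketch of coordinate search, assuming SciPy is available for the one-dimensional line search; the fixed cycle count and the quadratic test function are illustrative.

```python
# Minimal sketch of coordinate search; minimize_scalar does the 1-D line search.
import numpy as np
from scipy.optimize import minimize_scalar

def coordinate_search(f, x0, cycles=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(cycles):
        for i in range(x.size):            # cycle through the n coordinate directions
            e = np.zeros_like(x)
            e[i] = 1.0
            alpha = minimize_scalar(lambda a: f(x + a * e)).x   # line search along e_i
            x = x + alpha * e
    return x

# Illustrative convex quadratic with minimizer (1, 1).
print(coordinate_search(lambda v: (v[0] - 1)**2 + 10 * (v[1] - v[0])**2, [0.0, 0.0]))
```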

6
Q

Disadvantages of coordinate search?

Advantage?

A
  • Can iterate indefinitely without ever approaching a stationary point.
  • Usually slower than the steepest descent method.

Advantage:
  • No derivatives are needed.

7
Q

Pattern search

A

Generalizes coordinate search: at each iterate the objective is evaluated at points lying on a pattern (stencil) around the current point, defined by a set of search directions; the step length is reduced when no point in the pattern gives a decrease (see the sketch below).
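A minimal sketch of compass search, the simplest pattern search: poll the 2n points x ± step·e_i and contract the step length when none of them improves the objective. The function name, step-halving factor and tolerance are illustrative choices.

```python
# Minimal sketch of compass search (the simplest pattern search).
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for d in np.vstack([np.eye(x.size), -np.eye(x.size)]):   # pattern: +/- e_i
            trial = x + step * d
            f_trial = f(trial)
            if f_trial < fx:
                x, fx, improved = trial, f_trial, True
                break
        if not improved:
            step *= 0.5        # no decrease on the pattern: contract the step length
    return x, fx

print(compass_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2, [5.0, 5.0]))
```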

8
Q

Advantages and disadvantages of pattern search?

A

Disadvantage:

  • Noise in the function evaluations degrades performance.

Advantage:

  • Derivative-free: only function values are needed.

9
Q

Explain the Nelder-Mead simplex method.

A
  • Keeps a simplex of n+1 points in n dimensions.
  • Applies different operations to the simplex (reflection, expansion, contraction) to find a better point that replaces the worst point of the simplex.
  • If none of these operations finds a better point, the simplex is shrunk toward its best point (see the usage sketch below).
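A usage sketch of SciPy's Nelder-Mead implementation on the Rosenbrock function; the tolerances in `options` are illustrative.

```python
# Usage sketch: SciPy's Nelder-Mead on the Rosenbrock function (no derivatives).
import numpy as np
from scipy.optimize import minimize

rosenbrock = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2

result = minimize(rosenbrock, x0=np.array([-1.2, 1.0]), method="Nelder-Mead",
                  options={"xatol": 1e-8, "fatol": 1e-8})
print(result.x)   # close to the true minimizer (1, 1)
```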
10
Q

Advantage and disadvantage of the simplex method.

A

Disadvantages:

  • Can converge to non-stationary points.
  • Performs poorly on ill-conditioned problems.

Advantage:

  • Requires only function values, and each iteration is cheap (typically one or two function evaluations).
11
Q

What is implicit filtering?

A
  • A variant of steepest descent.
  • Uses a finite-difference estimate of the gradient.
  • Suited to noisy problems, especially when the noise level decreases as the iterates approach the optimum; the finite-difference increment is reduced as the iteration progresses (see the sketch below).
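An illustrative sketch of the implicit-filtering idea: steepest descent with a finite-difference gradient whose increment h is halved whenever the backtracking search can no longer make progress at the current h. All names, constants, and the noisy test function are assumptions, not the reference algorithm.

```python
# Illustrative sketch of implicit filtering: steepest descent with a
# finite-difference gradient whose increment h is reduced when the current
# h can no longer produce a decrease (names and constants are assumptions).
import numpy as np

def fd_grad(f, x, h):
    fx = f(x)
    return np.array([(f(x + h * np.eye(x.size)[i]) - fx) / h for i in range(x.size)])

def implicit_filtering(f, x0, h=0.5, h_min=1e-6, max_inner=100):
    x = np.asarray(x0, dtype=float)
    while h > h_min:
        for _ in range(max_inner):
            g = fd_grad(f, x, h)
            t, fx = 1.0, f(x)
            while t > 1e-10 and f(x - t * g) >= fx:   # backtracking line search
                t *= 0.5
            if t <= 1e-10:
                break              # no progress with this h: stop the inner loop
            x = x - t * g
        h *= 0.5                   # decrease the finite-difference increment
    return x

noisy = lambda v: (v[0] - 1)**2 + (v[1] + 2)**2 + 1e-3 * np.random.rand()
print(implicit_filtering(noisy, [4.0, 4.0]))
```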