Optimisation Flashcards

1
Q

Golden Section Search: Description

A

Assume f is unimodal on [a, b]. Comparing f(x1) and f(x2) at two interior test points lets us discard the subinterval (x2, b] or [a, x1) each iteration. The minimum of the function lies in the remaining interval.

2
Q

Golden Section Search: Method

A

Iterative algorithm using
T = (sqrt(5) - 1)/2 ≈ 0.618
x1 = a + (1 - T)(b - a)
x2 = a + T(b - a)
Evaluate f(x1) and f(x2), discard the losing subinterval, and repeat until the bracket is small enough (see the sketch below).
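
A minimal Python sketch of this procedure, assuming a unimodal f on [a, b]; the function name and tolerance are illustrative, not from the cards:

import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimise a unimodal function f on [a, b] by golden section search."""
    T = (math.sqrt(5) - 1) / 2          # golden ratio constant ~ 0.618
    x1 = a + (1 - T) * (b - a)
    x2 = a + T * (b - a)
    f1, f2 = f(x1), f(x2)
    while (b - a) > tol:
        if f1 < f2:                     # minimum lies in [a, x2]: discard (x2, b]
            b, x2, f2 = x2, x1, f1      # reuse x1 as the new x2
            x1 = a + (1 - T) * (b - a)
            f1 = f(x1)
        else:                           # minimum lies in [x1, b]: discard [a, x1)
            a, x1, f1 = x1, x2, f2      # reuse x2 as the new x1
            x2 = a + T * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Example: minimum of (x - 2)^2 is at x = 2
print(golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0))

Note that each iteration reuses one of the two test points, so only one new function evaluation is needed per step.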

3
Q

Golden Section Search: Robustness

A

Safe; converges slowly (linearly) with constant C ≈ 0.618.

One of the two test points is always reused in the next iteration, so only one new function evaluation is needed per step. This is beneficial when the function value is computationally expensive to compute.

4
Q

Steepest Descent: Description

A

Find a local minimum by taking descending steps from a given starting point, each step in the direction of the negative gradient.
Step size too small: slow descent.
Step size too large: may overshoot or diverge from the minimum, or "ping-pong" (zig-zag) across the valley.

5
Q

Steepest Descent: Method

A

xi+1 = xi - alpha * Nabla f(xi)

Choose the step size alpha (e.g. by a line search minimising f(xi - alpha * Nabla f(xi))), then substitute into the update equation. See the sketch below.
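
A minimal Python sketch, assuming a fixed step size alpha rather than a line search; all names and values are illustrative:

import numpy as np

def steepest_descent(grad_f, x0, alpha=0.1, tol=1e-6, max_iter=1000):
    """Minimise f by stepping in the direction of the negative gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:     # stop when the gradient is (nearly) zero
            break
        x = x - alpha * g               # x_{i+1} = x_i - alpha * grad f(x_i)
    return x

# Example: f(x, y) = x^2 + 2*y^2, gradient (2x, 4y), minimum at (0, 0)
print(steepest_descent(lambda x: np.array([2 * x[0], 4 * x[1]]), [3.0, -1.0]))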

6
Q

Newton’s Method:

A

Subtract the inverse Hessian applied to the gradient from xi; in practice, compute the step si and substitute it in.
xi+1 = xi - Hf(xi)^(-1) Nabla f(xi)
si = xi+1 - xi

7
Q

Derive Newton's Method from the Taylor Series

A
Write out the Taylor expansion of f(x + h) up to second order.
Differentiate f(x + h) with respect to h.
Minimum (stationary point) when df(x + h)/dh = 0, giving h = -f'(x)/f''(x).
Remember xi+1 = xi + h (see the worked derivation below).
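
A compact worked version of the derivation described above, in LaTeX form, for a single-variable f:

\begin{align*}
f(x_i + h) &\approx f(x_i) + f'(x_i)\,h + \tfrac{1}{2} f''(x_i)\,h^2 \\
\frac{d}{dh} f(x_i + h) &\approx f'(x_i) + f''(x_i)\,h = 0
\quad\Rightarrow\quad h = -\frac{f'(x_i)}{f''(x_i)} \\
x_{i+1} &= x_i + h = x_i - \frac{f'(x_i)}{f''(x_i)}
\end{align*}
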
8
Q

Newton's Method: Multidimensional

A

Replace f' and f'' with the gradient and the Hessian:
f'(x) -> Nabla f(x)
f''(x) -> Hf(x)
Update by solving Hf(xi) si = -Nabla f(xi) for the step si = xi+1 - xi (see the sketch below).
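
A minimal Python sketch of the multidimensional update, assuming the gradient and Hessian are supplied as functions; all names and the example problem are illustrative:

import numpy as np

def newtons_method(grad_f, hess_f, x0, tol=1e-8, max_iter=50):
    """Minimise f using Newton steps: solve Hf(x) s = -grad f(x), then x <- x + s."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        s = np.linalg.solve(hess_f(x), -g)   # step si = x_{i+1} - x_i
        x = x + s
    return x

# Example: f(x, y) = x^2 + 2*y^2 + x*y, gradient and Hessian written out by hand
grad = lambda x: np.array([2 * x[0] + x[1], 4 * x[1] + x[0]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 4.0]])
print(newtons_method(grad, hess, [5.0, -3.0]))   # converges to (0, 0)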

9
Q

Newton’s Method: Convergence and Robustness

A

Much faster than steepest descent, since it uses second-order (curvature) information rather than first-order information only. However, computing the Hessian is expensive!
Quadratic convergence.
Not robust: must start relatively close to the solution for convergence to be guaranteed.
