Optimisation Flashcards
Golden Section Search: Description
Assume f is unimodal on [a, b]. Comparing f(x1) and f(x2) allows us to discard a subinterval (x2, b] or [a, x1) each iteration. The minimum of the function will be in the remaining interval.
Golden Section Search: Method
Algorithm involving
T = (sqrt(5) - 1) / 2 ≈ 0.618
x1 = a + (1 - T)(b - a)
x2 = a + T(b - a)
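A minimal Python sketch of the method under these definitions (the function name and the test function are illustrative, not from the source):

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimise a unimodal f on [a, b] by golden section search."""
    T = (math.sqrt(5) - 1) / 2            # ~0.618
    x1 = a + (1 - T) * (b - a)
    x2 = a + T * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:
            # Minimum cannot lie in [a, x1): discard that subinterval.
            a, x1, f1 = x1, x2, f2        # reuse x2 as the new x1
            x2 = a + T * (b - a)
            f2 = f(x2)
        else:
            # Minimum cannot lie in (x2, b]: discard that subinterval.
            b, x2, f2 = x2, x1, f1        # reuse x1 as the new x2
            x1 = a + (1 - T) * (b - a)
            f1 = f(x1)
    return (a + b) / 2

# Example: the minimum of (x - 2)^2 on [0, 5] is at x = 2.
print(golden_section_search(lambda x: (x - 2) ** 2, 0.0, 5.0))
```

Only one new function evaluation is needed per iteration, because one test point is always reused (see the Robustness card).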
Golden Section Search: Robustness
Safe but slow: converges linearly with constant C ≈ 0.618.
Each iteration reuses one of the two test points, so only one new function evaluation is needed; this is beneficial when the function value is computationally expensive to compute.
Steepest Descent: Description
Find a local minimum by taking steps in the direction of steepest descent (the negative gradient) from a given starting point.
Step size too small: slow descent.
Step size too large: may overshoot or diverge from the minimum, or zig-zag (ping-pong effect).
Steepest Descent: Method
x_{i+1} = x_i - α ∇f(x_i)
Calculate the step size α (e.g. by a line search minimising f(x_i - α ∇f(x_i)) over α), then sub into the equation.
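A minimal Python sketch with a fixed step size α (the card's "calculate α" would typically be a line search; the fixed value here is an assumption for simplicity, and the function names and test gradient are illustrative):

```python
import numpy as np

def steepest_descent(grad_f, x0, alpha=0.1, tol=1e-6, max_iter=10_000):
    """Fixed-step steepest descent: x_{i+1} = x_i - alpha * grad_f(x_i)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:       # gradient ~ 0: stop at a stationary point
            break
        x = x - alpha * g                 # step along the negative gradient
    return x

# Example: f(x, y) = (x - 1)^2 + 2y^2 has gradient (2(x - 1), 4y) and minimum (1, 0).
print(steepest_descent(lambda v: np.array([2 * (v[0] - 1), 4 * v[1]]), [0.0, 2.0]))
```

Too large an `alpha` here reproduces the overshoot/ping-pong behaviour described above; too small an `alpha` makes the descent very slow.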
Newton’s Method: Method
This is the one where you subtract f'(x_i)/f''(x_i) (in higher dimensions, H_f(x_i)^{-1} ∇f(x_i)) from x_i. Calculate the step s_i and sub it in.
x_{i+1} = x_i - f'(x_i) / f''(x_i)
s_i = x_{i+1} - x_i
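A minimal Python sketch of the one-dimensional update (function names and the quartic test function are illustrative):

```python
def newton_1d(df, d2f, x0, tol=1e-10, max_iter=50):
    """Newton's method for minimisation: x_{i+1} = x_i - f'(x_i) / f''(x_i)."""
    x = x0
    for _ in range(max_iter):
        s = -df(x) / d2f(x)               # Newton step s_i
        x = x + s                          # x_{i+1} = x_i + s_i
        if abs(s) < tol:
            break
    return x

# Example: f(x) = x^4 - 3x^2 has f'(x) = 4x^3 - 6x and f''(x) = 12x^2 - 6;
# starting at x = 1 it converges to the local minimum at sqrt(3/2) ~ 1.2247.
print(newton_1d(lambda x: 4 * x**3 - 6 * x, lambda x: 12 * x**2 - 6, 1.0))
```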
Derive Newton’s Method from Taylor
Write out the Taylor expansion of f(x+h). Differentiate f(x+h) with respect to h. The stationary point (minimum of the quadratic model) is where df(x+h)/dh = 0, giving h = -f'(x)/f''(x). Remember x_{i+1} = x_i + h.
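A sketch of the derivation, truncating the Taylor series after the second-order term:

```latex
f(x+h) \approx f(x) + f'(x)\,h + \tfrac{1}{2}f''(x)\,h^{2}
\quad\Rightarrow\quad
\frac{d}{dh}f(x+h) \approx f'(x) + f''(x)\,h = 0
\quad\Rightarrow\quad
h = -\frac{f'(x)}{f''(x)},
\qquad
x_{i+1} = x_i + h = x_i - \frac{f'(x_i)}{f''(x_i)}
```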
Newton’s Method: Multidimensional
Replace f' and f'' with the gradient ∇f and the Hessian H_f:
f'(x) = ∇f(x)
f''(x) = H_f(x)
x_{i+1} = x_i - H_f(x_i)^{-1} ∇f(x_i)
s_i = x_{i+1} - x_i (in practice, obtained by solving H_f(x_i) s_i = -∇f(x_i))
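A minimal NumPy sketch of the multidimensional step (names and the quadratic test problem are illustrative); solving the linear system avoids forming the Hessian inverse explicitly:

```python
import numpy as np

def newton_nd(grad_f, hess_f, x0, tol=1e-10, max_iter=50):
    """Multidimensional Newton: solve H_f(x_i) s_i = -grad f(x_i), then x_{i+1} = x_i + s_i."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = np.linalg.solve(hess_f(x), -grad_f(x))   # Newton step s_i
        x = x + s
        if np.linalg.norm(s) < tol:
            break
    return x

# Example: f(x, y) = (x - 1)^2 + 2(y + 3)^2 has its minimum at (1, -3);
# one Newton step suffices because f is quadratic.
grad = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])
print(newton_nd(grad, hess, [0.0, 0.0]))
```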
Newton’s Method: Convergence and Robustness
Much faster than steepest descent, as it uses second-order information (the Hessian) rather than first-order only. However, computing the Hessian is expensive!
Quadratic Convergence.
Not robust, as it must start relatively close to the solution to converge.