Lecture 23 Flashcards
Bisection method
- Continuous function on an interval [a,b] with sign(f(a)) != sign(f(b)), so [a,b] contains a root
- evaluate the midpoint m = (a+b)/2
- check signs: a = m if sign(f(m)) == sign(f(a)), else b = m
Interval is halved every iteration
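The steps above can be sketched as a small loop (function and tolerance are illustrative, not from the lecture):

```python
def bisect(f, a, b, tol=1e-8):
    # minimal bisection sketch: keep the half-interval where f changes sign
    fa, fb = f(a), f(b)          # two evaluations on the first iteration
    assert fa * fb < 0, "f must change sign on [a, b]"
    while (b - a) > tol:
        m = (a + b) / 2          # midpoint
        fm = f(m)                # only one new evaluation per iteration
        if (fm > 0) == (fa > 0): # same sign as f(a): root lies in [m, b]
            a, fa = m, fm
        else:                    # same sign as f(b): root lies in [a, m]
            b, fb = m, fm
    return (a + b) / 2

root = bisect(lambda x: x**3 - x - 1, 1, 2)   # root of x^3 - x - 1 on [1, 2]
```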
Convergence iterative method
ek = |xk - x*| (error at iteration k, x* the true solution); the method converges at rate r if lim(k→∞) ||e(k+1)|| / ||ek||^r = C for some constant C; r=1 linear, r=2 quadratic
Bisection method convergence
Linear, r=1 (the error bound halves each step, so C = 1/2)
Bisection method operations
two function evaluations on the first iteration (f(a), f(b)), then only one per iteration (f(m))
Bisection length interval, # iterations for tolerance
length = (b-a)/2^k
length <= tol requires k >= log2((b-a)/tol)
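A quick numerical check of the two formulas above (the interval and tolerance are illustrative):

```python
import math

a, b, tol = 1.0, 2.0, 1e-6
# smallest k with (b - a)/2**k <= tol is k = ceil(log2((b - a)/tol))
k = math.ceil(math.log2((b - a) / tol))
length = (b - a) / 2**k      # interval length after k bisections
# here k = 20 and length ~ 9.5e-7, which meets the tolerance
```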
Newton’s method
Local convergence (may fail to converge to a solution if the initial guess is poor); x0 initial guess, iterate xk+1 = xk - f(xk)/f'(xk)
Newton’s method convergence
Typically quadratic r=2
Newton’s method operations
2 evaluations per iteration (function and first derivative)
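The iteration on the Newton card can be sketched directly (stopping rule and iteration cap are illustrative choices):

```python
def newton(f, fprime, x0, tol=1e-10, maxit=50):
    # minimal Newton sketch: x_{k+1} = x_k - f(x_k)/f'(x_k)
    x = x0
    for _ in range(maxit):
        fx = f(x)
        if abs(fx) < tol:            # stop when the residual is small
            return x
        x = x - fx / fprime(x)       # one f and one f' evaluation per step
    return x

root = newton(lambda x: x**3 - x - 1, lambda x: 3 * x**2 - 1, x0=1.0)
```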
Secant method
Newton’s method with the first derivative replaced by a finite-difference approximation (convergence is still only local).
Needs two starting guesses xk, xk-1.
f’(xk) ≈ (f(xk) - f(xk-1))/(xk - xk-1)
Secant method operations
Only one function evaluation per iteration
Secant method convergence
superlinear, r = (1+sqrt(5))/2 ≈ 1.618 (the golden ratio)
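Substituting the finite-difference slope into Newton's update gives a short sketch (starting guesses and tolerance are illustrative):

```python
def secant(f, x0, x1, tol=1e-10, maxit=50):
    # minimal secant sketch: slope from the last two iterates replaces f'
    f0, f1 = f(x0), f(x1)
    for _ in range(maxit):
        if abs(f1) < tol:
            return x1
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)  # Newton step with approx. slope
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)                    # one new evaluation per iteration
    return x1

root = secant(lambda x: x**3 - x - 1, 1.0, 2.0)
```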
Bisection method python
import scipy.optimize as opt
def f(x): return x**3 - x - 1
root = opt.bisect(f, a=1, b=2)
Newton’s method python
import scipy.optimize as opt
def f(x): return x**3 - x - 1
def fprime(x): return 3 * x**2 - 1
root = opt.newton(f, x0=1, fprime=fprime)
Secant method python
import scipy.optimize as opt
def f(x): return x**3 - x - 1
root = opt.newton(f, x0=1)
(same as Newton’s without fprime)
A step in Newton’s method can always be carried out for any smooth function and any value of the current guess xk.
False. The step requires f'(xk) != 0: if the derivative vanishes at xk, the update xk - f(xk)/f'(xk) is undefined (and a near-zero derivative produces a huge, unreliable step).
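A minimal counterexample sketch: f(x) = x^2 - 1 is smooth everywhere, yet the Newton step at x0 = 0 divides by f'(0) = 0 and cannot be taken (function chosen for illustration):

```python
f = lambda x: x**2 - 1
fprime = lambda x: 2 * x   # vanishes at x = 0

x0 = 0.0
try:
    x1 = x0 - f(x0) / fprime(x0)
except ZeroDivisionError:
    x1 = None              # the step is undefined when f'(x0) = 0
```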