Lecture 4 - Roots of functions Flashcards
Regularizing, Bisection, Newton's method, Secant method
Basics
x* is called a root or zero of the function f if f(x*) = 0
In order to use root-finding algorithms, we need to add REGULARIZING assumptions:
- Continuity - a function f is continuous if f(x1) - f(x2) vanishes as x1 -> x2
- Differentiability - a function is differentiable if its derivative f' exists for all x
BISECTION
Easiest one, but not the fastest!
Suppose we are given as input a continuous function f(x) and two values: l and r.
f(l) and f(r) have opposite signs, i.e. f(l) * f(r) < 0
By the intermediate value theorem somewhere between l and r there is a root.
This property suggests a bisection algorithm for finding x*: divide the interval [l, r] in half recursively, each time keeping the half in which a root is known to exist
Bisection exhibits linear convergence. Although it's slow, it is guaranteed to converge to a root of any continuous function
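A minimal Python sketch of the bisection loop (the names bisect, tol and max_iter are my own assumptions, not from the lecture):

def bisect(f, l, r, tol=1e-10, max_iter=100):
    # assumes f is continuous and f(l) * f(r) < 0 (a sign-changing bracket)
    if f(l) * f(r) >= 0:
        raise ValueError("f(l) and f(r) must have opposite signs")
    for _ in range(max_iter):
        m = (l + r) / 2.0            # midpoint of the current interval
        if f(l) * f(m) <= 0:         # sign change in [l, m] -> keep the left half
            r = m
        else:                        # otherwise the root is in [m, r]
            l = m
        if r - l < tol:              # interval is small enough -> stop
            break
    return (l + r) / 2.0

# example: the root of x^2 - 2 on [1, 2] is sqrt(2) ~ 1.41421356
print(bisect(lambda x: x**2 - 2, 1.0, 2.0))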
NEWTON’S method
We start the algorithm with an initial guess x0 (ideally close to the root x*)
We build the tangent line to f at x0: y = f(x0) + f'(x0)*(x - x0)
Setting y = 0 => x1 = x0 - f(x0)/f'(x0), assuming f'(x0) != 0
Finally we obtain the recursive formula: xk+1 = xk - f(xk)/f'(xk)
So basically, we build a tangent line at the current iterate; the root of that tangent line becomes the next iterate, and so on and so forth
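A minimal Python sketch of this iteration (the names newton, fprime and tol are my own assumptions, not from the lecture):

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    # iterates xk+1 = xk - f(xk)/f'(xk), starting from the initial guess x0
    x = x0
    for _ in range(max_iter):
        dfx = fprime(x)
        if dfx == 0:                 # horizontal tangent: the Newton step is undefined
            raise ZeroDivisionError("f'(x) = 0, Newton step undefined")
        x_new = x - f(x) / dfx       # Newton update
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# example: root of x^2 - 2 starting from x0 = 1.5
print(newton(lambda x: x**2 - 2, lambda x: 2*x, 1.5))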
PROPERTIES:
-> converges if x0 is sufficiently close to x*
-> requires only a single initial guess
-> convergence is quadratic (near a simple root)
Drawbacks?
it requires evaluating the derivative f', which might be computationally demanding
SECANT method
In contrast to Newton's method, it DOESN'T require computing the derivative
f’ is approximated like so:
f'(xk) ≈ [ f(xk) - f(xk-1) ] / [ xk - xk-1 ]
Result:
xk+1 = xk - [ f(xk) * (xk - xk-1) ] / [ f(xk) - f(xk-1) ]
Convergence rate lies between bisection's linear and Newton's quadratic (superlinear, of order about 1.618)!
It starts with two points x0 = a and x1 = b, similarly to the interval [a, b] in the bisection method
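A minimal Python sketch of the secant iteration (the names secant and tol are my own assumptions, not from the lecture):

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    # iterates xk+1 = xk - f(xk)*(xk - xk-1) / (f(xk) - f(xk-1))
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 - f0 == 0:                        # horizontal secant: the step is undefined
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)    # secant update
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1                          # shift the two most recent points
        x1, f1 = x2, f(x2)
    return x1

# example: root of x^2 - 2 with x0 = 1, x1 = 2
print(secant(lambda x: x**2 - 2, 1.0, 2.0))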