# Solving f(u)=0 by Newton’s Method

• The sciences, are small power; because not eminent; and therefore, not acknowledged in any man; nor are at all, but in a few; and in them, but of few things. For science is of that nature, as none can understand it to be, but such as in a good measure have attained it. (Thomas Hobbes in Leviathan Chapter X 14.)
• Arts of public use, as fortifications, making of engines, and other instruments of war; because they confer to defence, and victory, are power: and though the true mother of them, be science namely the mathematics; yet, because they are brought into the light, by hand of the artificer, they be esteemed (the midwife passing with vulgar for the mother,) as his issue. (Thomas Hobbes in Leviathan Chapter X 15.)

We now consider a variant of time stepping for solving $f(u)=0$ with faster convergence by invoking the (inverse of the) derivative $f^\prime (u)$, referred to as Newton’s Method.

Let us then start with $N=1$ and let $f:R\rightarrow R$ be a differentiable function. Consider the following Fixed Point Iteration:

• $u^{n+1}=u^n-\frac{f(u^n)}{f^\prime (u^n)}=g(u^n)$     (1)

with corresponding function

• $g(u)=u-\frac{f(u)}{f^\prime (u)}$

assuming that $f^\prime (u)\neq 0$. Computing the derivative $g^\prime(u)$, we get

• $g^\prime (u)=1-\frac{f^\prime (u)}{f^\prime (u)}+\frac{f(u)f^{\prime\prime}(u)}{(f^\prime (u))^2}=\frac{f(u)f^{\prime\prime}(u)}{(f^\prime (u))^2}= 0$,

if $f(u)=0$. Thus we may expect that $\vert g^\prime (u)\vert$ is small near a root, and hence fast convergence.

The iteration (1) is called Newton’s Method for computing a solution of the equation $f(u)=0$. Compare with the choice $g(u) = u - f(u)$ in the setting of the Contraction Mapping, which thus uses $f(u)$ instead of $\frac{f(u)}{f^\prime (u)}$.

Newton’s method directly generalizes to $f: R^N\rightarrow R^N$ in the form

• $u^{n+1}=u^n-(f^\prime (u^n))^{-1}f(u^n)$     (Newton’s Method)

where $(f^\prime (u^n))^{-1}$ is the inverse of the $N\times N$ matrix $f^\prime (u^n)$ (thus assuming that $f^\prime (u^n)$ is non-singular).
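A minimal sketch of the vector form for $N=2$, applied to a hypothetical example system (the circle $x^2+y^2=4$ intersected with the hyperbola $xy=1$, chosen only for illustration), where the Jacobian system is solved by an explicit $2\times 2$ inverse:

```python
# Newton's method u^{n+1} = u^n - (f'(u^n))^{-1} f(u^n) for a 2x2 system.
# The example system f1 = x^2 + y^2 - 4, f2 = x*y - 1 is a hypothetical
# illustration, not taken from the text.

def f(x, y):
    return (x**2 + y**2 - 4.0, x*y - 1.0)

def jacobian(x, y):
    # f'(u): rows of partial derivatives
    return ((2*x, 2*y),
            (y,   x))

def newton2d(x, y, tol=1e-12, max_iter=50):
    for _ in range(max_iter):
        f1, f2 = f(x, y)
        if abs(f1) + abs(f2) < tol:
            break
        (a, b), (c, d) = jacobian(x, y)
        det = a*d - b*c              # assuming f'(u^n) is non-singular
        # inverse of [[a, b], [c, d]] is [[d, -b], [-c, a]] / det
        x -= ( d*f1 - b*f2) / det
        y -= (-c*f1 + a*f2) / det
    return x, y

x, y = newton2d(2.0, 0.3)
print(x, y)    # a point with x^2 + y^2 = 4 and x*y = 1
```

For larger $N$ one would of course solve the linear system $f^\prime (u^n)\,\delta = f(u^n)$ rather than form the inverse explicitly.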

One can show that $\vert e^{n+1}\vert\sim \vert e^n\vert^2$, if the initial guess is close enough to the root, which means that the number of correct digits may double each iteration step.
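The digit-doubling behaviour can be observed directly. A small sketch, using the hypothetical example equation $\cos x - x = 0$ (its root is the well-known Dottie number) and printing the error in each step:

```python
import math

# Quadratic convergence in practice: Newton's method on the example
# equation f(x) = cos(x) - x. The error exponent roughly doubles per step.

f  = lambda x: math.cos(x) - x
fp = lambda x: -math.sin(x) - 1.0

root = 0.7390851332151607    # known root of cos(x) = x, for reference
x = 1.0                      # initial guess
for n in range(5):
    x = x - f(x) / fp(x)     # iteration (1)
    print(n, abs(x - root))  # error shrinks roughly quadratically
```

The printed errors drop from about $10^{-2}$ to machine precision within a handful of steps.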

# Wellposed and Illposed Roots

Suppose $u$ is an approximate root of an equation $f(u)=0$ with small residual $f(u)\approx 0$, and let $\bar u$ be the exact root satisfying $f(\bar u)=0$. We have for small $\vert u-\bar u\vert$

• $f(u)-f(\bar u)\approx f^\prime (u)(u -\bar u)$,

still assuming for simplicity $N=1$. This shows that

• $\vert u-\bar u\vert \approx \frac{\vert f(u)\vert}{\vert f^\prime (u)\vert}$

indicating that the residual error $\vert f(u)\vert$ translates to the root error $\vert u-\bar u\vert$ with the stability factor

• $S =\frac{1}{\vert f^\prime (u)\vert}$,

that is

• $\vert u-\bar u\vert \approx S\vert f(u)\vert$

In other words, if $\vert f^\prime (u)\vert$ is not small, so that $S$ is not large, then the root is well defined or wellposed, while if $\vert f^\prime (u)\vert$ is small, so that $S$ is large, then the root is illposed or not well defined.

For a wellposed root u the curve $x\rightarrow f(x)$ crosses the x-axis at x=u with a definite slope, which makes the crossing point well determined. For an illposed root the curve is almost tangent to the x-axis which makes the crossing point difficult to pin down.
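A numerical sketch of this effect, using two hypothetical example functions with the same root $\bar u = 1$: one with slope $1$ at the root (wellposed) and one with a small slope $\epsilon$ (nearly illposed):

```python
# Residual vs root error: the same root error gives very different
# residuals, consistent with |u - ubar| ~ S |f(u)|, S = 1/|f'(u)|.
# Both example functions below are hypothetical illustrations.

eps = 1e-4                               # small slope at the root
f = lambda x: x - 1.0                    # wellposed: f'(1) = 1, S = 1
g = lambda x: (x - 1.0)**3 + eps*(x - 1.0)  # nearly illposed: g'(1) = eps

u = 1.0 + 1e-6      # approximate root, root error 1e-6 in both cases
print(abs(f(u)))    # residual ~1e-6: error ~ 1 * residual
print(abs(g(u)))    # residual ~1e-10: S = 1/eps = 1e4 amplifies it
```

A tiny residual for $g$ thus hides a root error $10^4$ times larger, exactly the illposed situation where the curve is almost tangent to the x-axis.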

# Newton’s Method Requires Good Initial Guess

Newton’s method converges very quickly towards a root, if the starting value is close enough to the root. If not, the iterations may diverge and then give rise to complex fractal patterns, as shown in the figure below with large basins of convergence around the roots separated by fractal boundary zones.

• \hyperref[chapterfixedpoint]{Fixed point iteration.}
• \hyperref[chapternewton]{Newton’s method}

• How to compute $\sqrt{2}$ by solving $x^2=2$?
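One way to answer the question above: applying iteration (1) to $f(x)=x^2-2$, with $f^\prime (x)=2x$, gives the classical iteration $x \leftarrow \frac{1}{2}\left(x+\frac{2}{x}\right)$. A minimal sketch:

```python
# Newton's method for f(x) = x^2 - 2: the update
# x - (x^2 - 2)/(2x) simplifies to (x + 2/x)/2.

x = 1.0                      # initial guess
for _ in range(6):
    x = 0.5 * (x + 2.0 / x)
print(x)                     # approximately 1.41421356...
```

Starting from $x=1$, the iterates reach $\sqrt{2}$ to machine precision in about five steps, again illustrating the doubling of correct digits.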