# Solving f(u)=0 by Newton’s Method

*The sciences, are small power; because not eminent; and therefore, not acknowledged in any man; nor are at all, but in a few; and in them, but of few things. For science is of that nature, as none can understand it to be, but such as in a good measure have attained it.* (Thomas Hobbes in Leviathan, Chapter X 14)

*Arts of public use, as fortifications, making of engines, and other instruments of war; because they confer to defence, and victory, are power: and though the true mother of them, be science, namely the mathematics; yet, because they are brought into the light, by the hand of the artificer, they be esteemed (the midwife passing with the vulgar for the mother,) as his issue.* (Thomas Hobbes in Leviathan, Chapter X 15)

We now consider a variant of time stepping for solving $f(u) = 0$ with faster convergence, obtained by invoking the (inverse of the) derivative $f'(u)$, referred to as *Newton's Method*.

Let us then start with the case $f:\mathbb{R}\rightarrow\mathbb{R}$ and let $f$ be a differentiable function. Consider the following Fixed Point Iteration:

- $u_{k+1} = u_k - \dfrac{f(u_k)}{f'(u_k)}$ for $k = 0, 1, 2, \ldots$ (1)

with corresponding function

$g(u) = u - \dfrac{f(u)}{f'(u)},$

assuming that $f'(u) \neq 0$. Computing the derivative $g'(u)$, we get

- $g'(u) = 1 - \dfrac{f'(u)^2 - f(u)\,f''(u)}{f'(u)^2} = \dfrac{f(u)\,f''(u)}{f'(u)^2}$,

if $f'(u) \neq 0$. Thus we may expect that $|g'(u)|$ is small near a root $\bar{u}$ where $f(\bar{u}) = 0$, that is that $g'(\bar{u}) = 0$, implying fast convergence.

The iteration (1) is called *Newton's Method* for computing a solution $\bar{u}$ of the equation $f(u) = 0$. Compare with the choice $g(u) = u - \alpha f(u)$ in the setting of the Contraction Mapping, thus with $\alpha = \frac{1}{f'(u_k)}$ varying with $k$ instead of a constant $\alpha$.
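The scalar iteration (1) can be sketched in a few lines of Python; the function name `newton` and the example equation $u^2 - 2 = 0$ are illustrative choices, not taken from the text:

```python
def newton(f, df, u0, tol=1e-12, max_iter=50):
    """Newton's Method: iterate u_{k+1} = u_k - f(u_k)/f'(u_k)."""
    u = u0
    for _ in range(max_iter):
        fu = f(u)
        if abs(fu) < tol:       # residual small enough: accept u as root
            return u
        u = u - fu / df(u)      # Newton update, assumes f'(u) != 0
    return u

# Example: solve u^2 - 2 = 0, i.e. compute sqrt(2), starting from u0 = 1
root = newton(lambda u: u**2 - 2, lambda u: 2*u, 1.0)
print(root)
```

Starting from $u_0 = 1$ the iterate settles at $\sqrt{2} \approx 1.41421356$ within a handful of steps.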

Newton’s method directly generalizes to $f:\mathbb{R}^n\rightarrow\mathbb{R}^n$ in the form

- $u_{k+1} = u_k - f'(u_k)^{-1} f(u_k)$ for $k = 0, 1, 2, \ldots$ (Newton’s Method)

where $f'(u_k)^{-1}$ is the inverse of the Jacobian matrix $f'(u_k)$ (thus assuming that $f'(u_k)$ is non-singular).
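A minimal sketch of the vector form, assuming NumPy is available; solving the linear system $f'(u_k)\,d = f(u_k)$ avoids forming the inverse Jacobian explicitly. The example system (intersecting a circle with a line) is an illustrative assumption:

```python
import numpy as np

def newton_system(f, jac, u0, tol=1e-12, max_iter=50):
    """Newton's Method for f: R^n -> R^n.

    Solves jac(u) d = f(u) at each step instead of inverting
    the Jacobian, which is cheaper and more stable."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        fu = f(u)
        if np.linalg.norm(fu) < tol:
            return u
        d = np.linalg.solve(jac(u), fu)   # assumes jac(u) non-singular
        u = u - d
    return u

# Example: intersect the circle u0^2 + u1^2 = 4 with the line u1 = u0
f = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[1] - v[0]])
jac = lambda v: np.array([[2*v[0], 2*v[1]], [-1.0, 1.0]])
root = newton_system(f, jac, [1.0, 0.5])
print(root)  # converges to (sqrt(2), sqrt(2))
```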

One can show that $|u_{k+1} - \bar{u}| \leq C\,|u_k - \bar{u}|^2$ for some constant $C$, if the initial guess $u_0$ is close enough to the root $\bar{u}$, which means that the number of correct digits may double in each iteration step.
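The doubling of correct digits can be observed numerically; the equation $u^2 - 2 = 0$ and the starting value $u_0 = 1$ below are illustrative assumptions:

```python
import math

# Track the error |u_k - sqrt(2)| for Newton's method on f(u) = u^2 - 2:
# each error is roughly the square of the previous one (quadratic
# convergence), so the number of correct digits roughly doubles per step.
u = 1.0
errors = []
for _ in range(5):
    u = u - (u**2 - 2) / (2 * u)        # Newton step
    errors.append(abs(u - math.sqrt(2)))

for e in errors:
    print(f"{e:.1e}")
```

The printed errors drop roughly like $10^{-1}$, $10^{-3}$, $10^{-6}$, $10^{-12}$, down to machine precision.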

# Wellposed and Illposed Roots

Suppose $\hat{u}$ is an approximate solution with residual $f(\hat{u}) = r$, or approximate *root*, of an equation $f(u) = 0$ with exact root $\bar{u}$ satisfying $f(\bar{u}) = 0$. We have for $|\hat{u} - \bar{u}|$ small

- $r = f(\hat{u}) \approx f(\bar{u}) + f'(\bar{u})(\hat{u} - \bar{u}) = f'(\bar{u})(\hat{u} - \bar{u})$,

still assuming for simplicity $f'(\bar{u}) \neq 0$. This shows that $\hat{u} - \bar{u} \approx \frac{r}{f'(\bar{u})}$,

indicating that the residual error $|r|$ translates to the root error $|\hat{u} - \bar{u}|$ with the *stability factor*

- $S = \dfrac{1}{|f'(\bar{u})|}$,

that is

- $|\hat{u} - \bar{u}| \approx S\,|r|$.

In other words, if $|f'(\bar{u})|$ is not small so that $S$ is not large, then the root is well defined or *well posed*, while if $|f'(\bar{u})|$ is small so that $S$ is large, then the root is *ill posed* or not well defined.

For a wellposed root the curve $y = f(u)$ crosses the $u$-axis at $u = \bar{u}$ with a definite slope, which makes the crossing point well determined. For an illposed root the curve $y = f(u)$ is almost tangent to the $u$-axis, which makes the crossing point difficult to pin down.
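The effect of the stability factor $S$ can be checked numerically; the two functions below (a wellposed root of $u^2 - 2$ and an illposed double root of $(u - 1)^2$) are illustrative assumptions. Both approximate roots have the same tiny residual $r$, but very different root errors:

```python
# Same residual r for both equations, compared against the exact roots.
r = 1e-8

# Wellposed: f(u) = u^2 - 2, root sqrt(2), f'(root) = 2*sqrt(2) != 0.
# Solving f(u) = r exactly gives u = sqrt(2 + r).
u_f = (2 + r) ** 0.5
err_f = abs(u_f - 2 ** 0.5)      # ~ S * r = r / (2*sqrt(2)), same order as r

# Illposed: g(u) = (u - 1)^2, double root 1, g'(root) = 0 so S blows up.
# Solving g(u) = r exactly gives u = 1 + sqrt(r).
u_g = 1 + r ** 0.5
err_g = abs(u_g - 1)             # sqrt(r) = 1e-4, vastly larger than r

print(err_f, err_g)
```

The wellposed root error is about $3.5 \times 10^{-9}$, of the same order as the residual, while the illposed one is $10^{-4}$: the same residual says much less about an illposed root.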

# Newton’s Method Requires Good Initial Guess

Newton’s method converges very quickly towards a root if the starting value is close enough to the root. If not, the iterations may diverge, or give rise to complex fractal patterns, as shown in the figure below: big basins of convergence around the roots are separated by fractal boundary zones.
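Such basin pictures can be computed directly; the sketch below (the example equation $z^3 = 1$ and the coarse grid are assumptions for illustration) labels each complex starting point by the cube root of unity that Newton's method converges to:

```python
import cmath

# Roots of z^3 = 1: the three cube roots of unity
roots = [cmath.exp(2j * cmath.pi * k / 3) for k in range(3)]

def basin(z, steps=50):
    """Return the index of the root Newton's method reaches from z,
    or -1 if it has not settled (e.g. starting near a basin boundary)."""
    for _ in range(steps):
        if abs(z) < 1e-12:                # derivative 3z^2 vanishes: give up
            return -1
        z = z - (z**3 - 1) / (3 * z**2)   # Newton step for f(z) = z^3 - 1
    for i, r in enumerate(roots):
        if abs(z - r) < 1e-6:
            return i
    return -1

# Classify a small grid of starting points; real pictures use fine grids
# and color each pixel by its basin index.
grid = [[basin(complex(x, y)) for x in (-1.0, 0.5, 1.0)]
        for y in (-1.0, 0.0, 1.0)]
print(grid)
```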

# Learn More

- Fixed point iteration
- Newton’s method

# To Think About

- How to compute $\sqrt{2}$ by solving $u^2 - 2 = 0$?

# Watch

Fractals from iterations by Newton’s method. Big basins show roots. Boundaries between basins show fractal complexity.

