Solve f(x)=0 by Time Stepping x = x+f(x)*dt

An equation f(x)=0, with f(x) a given function of a real variable x, can be solved by time stepping:

  • x = x + f(x)*dt                (1)

where the sign of f(x) is chosen so that \frac{df}{dx} < 0 and the time step dt is small enough. We start with x = x_0 as an initial approximation and compute successively:

  • x_{n+1} = x_n + f(x_n)*dt for n=0, 1, 2,...

If the sequence x_n converges to a value X, then x_{n+1}-x_n = f(x_n)*dt gets small as n increases, so f(x_n) also becomes small, and thus f(X)=0 (assuming f(x) is continuous): we have computed a root X of the equation f(x)=0.

As an example, consider f(x)=2-x^2 with x_0=1 and dt=0.1; note that here \frac{df}{dx}=-2x<0 for x>0.

See that X =\sqrt{2}.
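The example can be sketched in a few lines of Python (the function name `time_step_root` and the step count are choices made here, not part of the text):

```python
import math

def time_step_root(f, x0, dt, n_steps=200):
    """Approximate a root of f(x) = 0 by time stepping x = x + f(x)*dt."""
    x = x0
    for _ in range(n_steps):
        x = x + f(x) * dt
    return x

# Example from the text: f(x) = 2 - x^2 with x_0 = 1 and dt = 0.1.
root = time_step_root(lambda x: 2 - x**2, x0=1.0, dt=0.1)
print(root, math.sqrt(2))  # x_n approaches sqrt(2)
```

Try other starting values x_0 > 0 and smaller or larger dt to see how the speed of convergence changes.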

To understand the condition \frac{df}{dx} <0 and dt small enough, subtract two successive steps to get

  • x_{n+1}-x_{n} = x_n - x_{n-1} + (f(x_n)-f(x_{n-1}))*dt \approx (x_n-x_{n-1})(1+\frac{df}{dx}(x_n)*dt)

where convergence with

  • \vert x_{n+1}-x_{n}\vert \approx q\vert x_{n}-x_{n-1}\vert

is guaranteed if

  • q=\vert 1+\frac{df}{dx}(x_n)*dt\vert < 1

with quicker convergence the smaller q is. Explain.
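A short numerical check of this estimate, using the earlier example f(x) = 2 - x^2 (so \frac{df}{dx} = -2x) with x_0 = 1 and dt = 0.1 (the variable names below are choices made here):

```python
# Compare the observed ratio |x_{n+1}-x_n| / |x_n-x_{n-1}|
# with the predicted contraction factor q = |1 + df/dx(x_n)*dt|.
f = lambda x: 2 - x**2
df = lambda x: -2 * x
dt = 0.1

x_prev, x = 1.0, 1.0 + f(1.0) * dt  # x_0 and x_1
for n in range(5):
    x_next = x + f(x) * dt
    ratio = abs(x_next - x) / abs(x - x_prev)
    q = abs(1 + df(x) * dt)
    print(f"step {n}: ratio = {ratio:.4f}, predicted q = {q:.4f}")
    x_prev, x = x, x_next
```

The printed ratios stay below 1 and track the predicted q, illustrating why q < 1 guarantees convergence.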

Play with the above code and compute roots.

Compare with Solving Equations by Newton’s Method

Compare with Fixed Point Iteration

Generalisation to Newton’s Method

Consider time stepping with the following variant of (1):

  • x = x + alpha*f(x)*dt,         (2)

where alpha is a number to be chosen according to the following consideration:

Rewrite (2), taking dt = 1, as the fixed point iteration

  • x = g(x)  with g(x) = x + alpha*f(x)

with convergence if \vert Dg(x)\vert < 1 for all x, where Dg = dg/dx, and quicker convergence the smaller \vert Dg(x)\vert is. We compute

  • Dg(x) = 1 + alpha*Df(x)

to find that Dg(x) vanishes if

  • alpha = -\frac{1}{Df(x)}.

We are thus led, with dt = 1, to the time step

  • x = x - \frac{f(x)}{Df(x)}

which is Newton’s Method, with very quick convergence (assuming Df(x) is not zero).
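The resulting iteration x = x - f(x)/Df(x) can be sketched as follows (the tolerance and iteration cap are choices made here, not part of the text):

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: iterate x = x - f(x)/df(x) until |f(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        x = x - f(x) / df(x)
        if abs(f(x)) < tol:
            break
    return x

# Same example: f(x) = 2 - x^2 with Df(x) = -2x, starting from x_0 = 1.
root = newton(lambda x: 2 - x**2, lambda x: -2 * x, x0=1.0)
print(root)  # converges to sqrt(2) in a handful of steps
```

Compare the few steps needed here with the many steps of the plain time stepping above: q is now close to 0 near the root instead of close to 1.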

See Solving f(x) = 0 by Newton’s Method.