13.0 Solving an Equation in One Variable

Suppose we have an equation of the form f(x) = 0, where f is some standard, perhaps ugly, differentiable function on some interval. For example, f(x) might be sin(x) - 0.2, or exp(sin(x)) - csc(x), or anything else you might want to investigate.

We address the question: how can we find a solution, that is, a value of x for which the statement f(x) = 0 is true, to within the accuracy of our computations?

We will explore four methods, which we describe here in one sentence each; we then examine them in further detail one at a time.

The first, which is called Newton's method, or the Newton-Raphson method, involves guessing an answer, x0, then, assuming that that answer is incorrect, solving the equation which states that the linear approximation to f at x0 is 0. This is a linear equation and is easy to solve. If we call its solution x1, we can repeat this step, that is, solve the equation which states that the linear approximation to f at x1 is 0, to find x2, and so on.
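Concretely, setting the linear approximation f(x0) + f'(x0)(x - x0) to 0 and solving for x gives x1 = x0 - f(x0)/f'(x0). Below is a minimal sketch of this iteration; the function names, tolerance, and iteration cap are our own illustrative choices, not part of the text.

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Repeatedly solve the linear approximation f(xn) + f'(xn)(x - xn) = 0."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # close enough to a root
            return x
        x = x - fx / fprime(x)     # x_{n+1} = x_n - f(x_n) / f'(x_n)
    return x                       # may not have converged

# Example: solve sin(x) - 0.2 = 0 starting from x0 = 0
root = newton(lambda x: math.sin(x) - 0.2, math.cos, 0.0)
print(root, math.sin(root) - 0.2)
```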

We also consider a slightly different method, which we will call "Poor Man's Newton", in which, instead of evaluating the derivative needed to form the linear approximation by formal differentiation, we approximate it numerically. Its only virtue is that you need not differentiate f in order to apply it.
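A sketch of this variant, assuming the derivative is estimated by the difference quotient (f(x + h) - f(x))/h for some small step h; the particular h, tolerance, and iteration cap below are illustrative choices.

```python
def poor_mans_newton(f, x0, h=1e-6, tol=1e-12, max_iter=50):
    """Newton's iteration with the derivative replaced by a difference quotient."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        slope = (f(x + h) - fx) / h    # numerical estimate of f'(x)
        x = x - fx / slope
    return x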

The third method involves guessing two points x1 and x2, finding f(x1) and f(x2), and making a new guess at the point where the straight line through (x1, f(x1)) and (x2, f(x2)) crosses the x axis.
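A sketch of this two-point scheme, assuming each step keeps the two most recent guesses: the line through (x1, f(x1)) and (x2, f(x2)) crosses the x axis at x2 - f(x2)(x2 - x1)/(f(x2) - f(x1)). The tolerance and iteration cap are illustrative choices.

```python
def secant(f, x1, x2, tol=1e-12, max_iter=50):
    """Two-point iteration: replace the oldest guess by the x-axis crossing."""
    for _ in range(max_iter):
        f1, f2 = f(x1), f(x2)
        if abs(f2) < tol:
            return x2
        # Where the line through (x1, f1) and (x2, f2) crosses the x axis:
        x_new = x2 - f2 * (x2 - x1) / (f2 - f1)
        x1, x2 = x2, x_new
    return x2
```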

The final method, sometimes called "divide and conquer", involves starting with two points x1 and x2 at which f takes on values having opposite signs. Then we can evaluate f halfway between them and find an interval half the size of x2 - x1 on which again f takes on values having opposite signs. Repeating this step will home in on a solution, so long as f is continuous, that is, has no gaps.
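A sketch of this halving scheme, assuming the starting interval is supplied as a < b with f(a) and f(b) of opposite signs; the tolerance is an illustrative choice.

```python
def bisect(f, a, b, tol=1e-12):
    """Halve the bracketing interval until it is shorter than tol (assumes a < b)."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f must take values of opposite signs at the endpoints"
    while b - a > tol:
        mid = (a + b) / 2
        fmid = f(mid)
        if fmid == 0:
            return mid
        if fa * fmid < 0:       # sign change lies in the left half
            b, fb = mid, fmid
        else:                   # sign change lies in the right half
            a, fa = mid, fmid
    return (a + b) / 2
```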

Not all equations have solutions, so all of these methods must fail for some equations. The first three can produce sequences of guesses x1, x2, ... which flail about and do not converge.

It may be impossible to get started with the last method, since you may not be able to find points x1 and x2 at which f has opposite signs. It is a slow and steady method, like the tortoise's racing plan, but once started it must win, and it improves its accuracy by a factor of two on each iteration.
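As a quick illustration of that factor-of-two improvement, here is the halving step applied to sin(x) - 0.2 on the interval [0, 1]; the function and interval are our own example choices. After each step the width of the bracketing interval is exactly half of what it was.

```python
import math

f = lambda x: math.sin(x) - 0.2
a, b = 0.0, 1.0                    # f(0) < 0 and f(1) > 0, so a root lies between
for step in range(10):
    mid = (a + b) / 2
    if f(a) * f(mid) < 0:          # sign change lies in the left half
        b = mid
    else:                          # sign change lies in the right half
        a = mid
    print(step, a, b, b - a)       # b - a shrinks to 2**-(step + 1)
```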