The Newton-Raphson method is perhaps the simplest and most efficient root-finding method, with one compromise: it requires a reasonably accurate initial estimate of the root.

Although this method may seem similar to the secant method, since both use straight lines to approximate the function, the way it derives the root is quite different: it uses the tangent at the current estimate rather than a secant, so it relies on the current estimate and the derivative of the function rather than on an interval.

Named after Isaac Newton and Joseph Raphson, the method produces successively better approximations to the roots (or zeroes) of a real-valued function.

The method also extends to systems of equations, and it can be shown to converge quadratically once the iterates are close to the root. In this section you will learn how to find the roots of an equation using the Newton-Raphson method and its geometric interpretation.

The math

For an equation $$f(x) = 0$$, we choose $$x_i$$ as an initial approximation of the root (or take the estimate from the last iteration). We then draw the tangent line $$L$$ to the curve at the point $$(x_i, f(x_i))$$:

$$L: y = f(x_i) + f'(x_i)(x-x_i)$$

Setting $$y = 0$$ in $$L$$ and solving for $$x$$, the next iterate is found by:

$$x_{i+1}=x_i-\frac{f(x_i)}{f'(x_i)}$$
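As a quick sanity check of this formula, the minimal sketch below applies it to $$f(x)=x^2-2$$, an illustrative function (not from the text) whose positive root is $$\sqrt{2}$$; the class and function names are our own:

```java
public class NewtonStepDemo {
    static double f(double x)  { return x * x - 2; } // illustrative function
    static double df(double x) { return 2 * x; }     // its derivative

    // One Newton-Raphson step: x_{i+1} = x_i - f(x_i) / f'(x_i)
    static double step(double x) { return x - f(x) / df(x); }

    public static void main(String[] args) {
        double x = 1.5; // initial estimate near the root sqrt(2) = 1.41421356...
        for (int i = 0; i < 4; i++) {
            x = step(x);
            System.out.println(x);
        }
    }
}
```

After only four steps the iterate agrees with $$\sqrt{2}$$ to machine precision.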

Figure 6.1: Graph of $$f(x)=\sin(x)+\frac{x}{10}$$ (in red) and the tangent to the graph at the point $$(4, f(4))$$

The figure above shows the graph of the function $$f(x)=\sin(x)+\frac{x}{10}$$.

The derivative of the function can be calculated as follows:

$$f'(x)=\frac{d}{dx}f(x)$$

$$=\frac{d}{dx}\left(\sin(x)+\frac{x}{10}\right)$$

$$=\cos(x)+\frac{1}{10}$$

Thus, the tangent at the point $$(4, f(4))$$ is:

$$y=\sin(4)+\frac{4}{10}+\left(\cos(4)+\frac{1}{10}\right)(x-4)$$

import kotlin.math.sin
import kotlin.math.cos

fun testFunction(x: Double): Double {
    return sin(x) + x / 10
}

fun testFunctionDX(x: Double): Double {
    return cos(x) + 1.0 / 10 // 1.0/10 avoids integer division (1/10 == 0)
}

fun newtonRaphson(
    myFunction: (x: Double) -> Double,
    myFunctionDX: (x: Double) -> Double,
    x0: Double,
    maxIterations: Int
): Double {
    var x = x0
    var iterations = 0
    while (iterations < maxIterations) {
        val nextX = x - myFunction(x) / myFunctionDX(x)
        println(nextX)
        x = nextX
        iterations += 1
    }
    return x
}

fun main() {
    newtonRaphson(::testFunction, ::testFunctionDX, 4.0, 10)
}



Output:

3.454132980236982
3.4940007632328425
3.4985196261168254
3.499005684799097
3.499057613598004
3.49906315739022
3.499063749185194
3.499063812358257
3.4990638191018633
3.4990638198217305

Evidently, this method also converges quickly, agreeing with the root to four significant digits after just four iterations.

Implementation

In NM Dev, the class NewtonRoot implements the Newton-Raphson root-finding algorithm, together with a solve function.

Note that the second solve function allows you to supply the derivative function $$f'$$ of $$f$$. Otherwise, a derivative function computed by finite differencing is generated automatically, which is slower.
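One common way to approximate a derivative by finite differencing is a central difference. The sketch below illustrates the general idea only; the class and method names are ours, and NM Dev's internal scheme may differ:

```java
import java.util.function.DoubleUnaryOperator;

public class FiniteDifference {
    // Central difference: f'(x) ≈ (f(x + h) - f(x - h)) / (2h)
    static double derivative(DoubleUnaryOperator f, double x, double h) {
        return (f.applyAsDouble(x + h) - f.applyAsDouble(x - h)) / (2 * h);
    }

    public static void main(String[] args) {
        DoubleUnaryOperator f = x -> x * x + 4 * x - 5; // same test function as below
        // the exact derivative at x = 5 is 2*5 + 4 = 14
        System.out.println(derivative(f, 5.0, 1e-6));
    }
}
```

Each derivative estimate costs two extra function evaluations per iteration, which is one reason the derivative-free variant is slower.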

UnivariateRealFunction f = new AbstractUnivariateRealFunction() {
    @Override
    public double evaluate(double x) {
        return x * x + 4 * x - 5; // x^2 + 4x - 5 = 0
    }
};
UnivariateRealFunction df = new AbstractUnivariateRealFunction() {
    @Override
    public double evaluate(double x) {
        return 2 * x + 4; // f'(x) = 2x + 4
    }
};
NewtonRoot solver = new NewtonRoot(1e-8, 5);
double root = solver.solve(f, df, 5.);
double fx = f.evaluate(root);
System.out.println(String.format("f(%f) = %f", root, fx));



Output:

f(1.000000) = 0.000000

We can see from the iteration results below that the algorithm already reaches a very accurate result by the fourth iteration, hence its efficiency.

| Iteration $$i$$ | $$x_i$$ | Error |
|---|---|---|
| 0 | 5 | |
| 1 | 2.142857143 | 1.33E+00 |
| 2 | 1.157635468 | 8.51E-01 |
| 3 | 1.003934739 | 1.53E-01 |
| 4 | 1.000002577 | 3.93E-03 |
| 5 | 1.000000000001107 | 2.58E-06 |
| 6 | 0.9999999999999999 | 1.11E-12 |
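The iterates above can be reproduced with a short standalone sketch (no NM Dev required; the class and helper names are ours): Newton-Raphson on $$f(x)=x^2+4x-5$$ starting from $$x_0=5$$.

```java
public class NewtonTable {
    static double f(double x)  { return x * x + 4 * x - 5; }
    static double df(double x) { return 2 * x + 4; }

    // One Newton-Raphson step
    static double step(double x) { return x - f(x) / df(x); }

    public static void main(String[] args) {
        double x = 5.0;
        for (int i = 1; i <= 6; i++) {
            x = step(x);
            System.out.println(i + "  " + x);
        }
    }
}
```

The first step, for example, is $$5 - \frac{40}{14} = 2.142857\ldots$$, matching the table.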

One limitation of Newton's method is that it may fail if there are points of inflection, local maxima, or local minima around $$x_0$$ or the root.
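A classic textbook example of such a failure (illustrative, not from this text) is $$f(x)=x^3-2x+2$$ with $$x_0=0$$: the iterates cycle between 0 and 1 forever and never reach the real root near $$-1.769$$. A minimal sketch:

```java
public class NewtonCycle {
    static double f(double x)  { return x * x * x - 2 * x + 2; }
    static double df(double x) { return 3 * x * x - 2; }

    // One Newton-Raphson step
    static double step(double x) { return x - f(x) / df(x); }

    public static void main(String[] args) {
        double x = 0.0;
        for (int i = 0; i < 6; i++) {
            x = step(x);
            System.out.println(x); // alternates: 1.0, 0.0, 1.0, 0.0, ...
        }
    }
}
```

From 0 the step gives $$0 - \frac{2}{-2} = 1$$, and from 1 it gives $$1 - \frac{1}{1} = 0$$, so the method oscillates instead of converging.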

## Halley's Method

Halley's method, invented by Edmond Halley, is a root-finding algorithm for functions of one real variable with a continuous second derivative.

It is the second algorithm in the class of Householder's methods, after Newton's method. Like Newton's method, it approximates the root iteratively, but with a cubic rate of convergence. Multidimensional versions of the method also exist.

It can be viewed as exactly finding a root of a linear-over-linear Padé approximation to the function, in contrast to Newton's method and the secant method, which approximate the function linearly, and Muller's method, which approximates the function quadratically.

The method utilises both the first and second derivatives of the function to find the root.

The formula for each iteration under Halley’s Method is:

$$x_{i+1}=x_i-\frac{2f(x_i)f'(x_i)}{2[f'(x_i)]^2-f(x_i)f''(x_i)}$$
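As a standalone illustration of this iteration, the sketch below applies it to $$f(x)=x^2+4x-5$$, the same test function used with the Newton-Raphson solver earlier in this chapter (the class and helper names are ours):

```java
public class HalleySketch {
    static double f(double x)   { return x * x + 4 * x - 5; }
    static double df(double x)  { return 2 * x + 4; }
    static double d2f(double x) { return 2; }

    // One Halley step: x_{i+1} = x_i - 2 f f' / (2 f'^2 - f f'')
    static double step(double x) {
        double fx = f(x), dfx = df(x), d2fx = d2f(x);
        return x - (2 * fx * dfx) / (2 * dfx * dfx - fx * d2fx);
    }

    public static void main(String[] args) {
        double x = 5.0; // same starting point as the Newton-Raphson example
        for (int i = 0; i < 3; i++) {
            x = step(x);
            System.out.println(x);
        }
        // after 3 iterations x is already very close to the root at 1
    }
}
```

Starting from the same $$x_0=5$$, three Halley steps reach roughly the accuracy that took Newton-Raphson five to six steps, consistent with cubic versus quadratic convergence.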

Implementation

Similar to the Newton-Raphson method, Halley's method can optionally be supplied with the first and second derivatives of the function.

UnivariateRealFunction f = new AbstractUnivariateRealFunction() {
    @Override
    public double evaluate(double x) {
        return x * x + 4 * x - 5; // x^2 + 4x - 5 = 0
    }
};
UnivariateRealFunction df = new AbstractUnivariateRealFunction() {
    @Override
    public double evaluate(double x) {
        return 2 * x + 4; // f'(x) = 2x + 4
    }
};
UnivariateRealFunction d2f = new AbstractUnivariateRealFunction() {
    @Override
    public double evaluate(double x) {
        return 2; // f''(x) = 2
    }
};
HalleyRoot solver = new HalleyRoot(1e-8, 3);
double root = solver.solve(f, df, d2f, 5.);
double fx = f.evaluate(root);
System.out.println(String.format("f(%f) = %f", root, fx));



Output:

f(1.000000) = 0.000000

The method converges in just 3 iterations.