Newton's method

Newton's method (also known as the Newton–Raphson method) is a powerful numerical method for solving nonlinear equations.



Suppose we want to solve a nonlinear equation \(f(x) = 0\), where \(f(x)\) is a (twice) differentiable nonlinear function. Newton's method generates a sequence of numbers \(c_1, c_2, c_3, \ldots\) that converges to a solution of the equation. That is, if \(\alpha\) is a solution (i.e., \(f(\alpha) = 0\)), then

\[\lim_{n\to\infty}c_n = \alpha,\]

and this sequence \(\{c_n\}\) is generated by a series of linear approximations of the function \(f(x)\).

Theorem (Newton's method)

Let \(f(x)\) be a function that is twice differentiable on an open interval \(I\) containing the closed interval \([a, b]\) (i.e., \([a,b]\subset I\)) and satisfies the following conditions:

  1. \(f(a) < 0\) and \(f(b) > 0\);
  2. For all \(x\in [a, b]\), \(f'(x) > 0\) and \(f''(x) > 0\).
Let us define the sequence \(\{c_n\}\) by
\[ \begin{eqnarray} c_1 &=& b,\\ c_{n+1} &=& c_n - \frac{f(c_n)}{f'(c_n)}.\tag{eq:newton} \end{eqnarray} \]
Then,
\[\lim_{n\to \infty}c_n = \alpha\]
where \(\alpha\) is the unique solution of \(f(\alpha) = 0\).
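The iteration in the theorem is easy to implement directly. The following is a minimal sketch in Python; the function names `newton`, `f`, `fprime`, the tolerance, and the particular test equation \(x^3 - 2 = 0\) are illustrative choices, not part of the theorem.

```python
def newton(f, fprime, c, tol=1e-12, max_iter=100):
    """Iterate c_{n+1} = c_n - f(c_n)/f'(c_n), starting from c_1 = c.

    Stops when the update step is smaller than tol.
    """
    for _ in range(max_iter):
        step = f(c) / fprime(c)
        c -= step
        if abs(step) < tol:
            break
    return c

# Solve x^3 - 2 = 0 on [1, 2]: f(1) = -1 < 0, f(2) = 6 > 0,
# and f'(x) = 3x^2 > 0, f''(x) = 6x > 0 on [1, 2], so the
# hypotheses of the theorem hold with c_1 = b = 2.
root = newton(lambda x: x**3 - 2, lambda x: 3 * x**2, 2.0)
```

Starting from \(c_1 = b = 2\), the iterates decrease monotonically toward the cube root of 2, as the proof of the theorem predicts.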
Proof. Since \(f(x)\) is (twice) differentiable, it is continuous on \([a,b]\). Since \(f(a) < 0\) and \(f(b) > 0\), by the Intermediate Value Theorem there exists an \(\alpha \in [a,b]\) such that \(f(\alpha) = 0\). Moreover, since \(f'(x) > 0\) on \([a,b]\), \(f(x)\) is strictly monotone increasing on \([a,b]\), so this \(\alpha\) is unique (i.e., there is only one \(\alpha\) that satisfies \(f(\alpha) = 0\)).
[See also: Continuity of a function for the Intermediate Value Theorem.] 
Let us pick any \(c\in [a, b]\) and find the tangent line at \(x = c\), which is given by
\[y = f'(c)(x - c) + f(c).\]
The \(x\)-intercept \(x=c'\) of this line is obtained by
\[0 = f'(c)(c' - c) + f(c),\]
and hence
\[c' = c - \frac{f(c)}{f'(c)}.\]
Therefore if \(c = c_n\), we have \(c' = c_{n+1}\).
Moreover, since \(y=f(x)\) is convex (because \(f''(x) > 0\)), the tangent line always lies below \(y=f(x)\) on \([a,b]\), so we have \(c' \geq \alpha\), with \(c' > \alpha\) whenever \(c > \alpha\). Since \(f'(c_n) > 0\) and \(f(c_n) > 0\) (as \(c_n > \alpha\)), we have
\[c_{n+1} = c_n -\frac{f(c_n)}{f'(c_n)} < c_n.\]
Thus, \(\{c_n\}\) is a strictly monotone decreasing sequence bounded below. By the Monotone Convergence Theorem, \(\{c_n\}\) converges to a certain value. [See also: Monotone sequences and Cauchy sequences for the Monotone Convergence Theorem.] 
Let \(\beta = \lim_{n\to\infty}c_n\), then \(\alpha \leq \beta < b\). If we take the limit \(n \to \infty\) on both sides of Eq. (eq:newton), we have
\[\beta = \beta - \frac{f(\beta)}{f'(\beta)},\]
from which we conclude
\[f(\beta) = 0.\]
However, \(\alpha\) is the only solution to \(f(x) = 0\). Hence \(\beta = \alpha\). ■

Example. Let \(d\) be any positive real number. Using Newton's method, let us find an approximation of \(\sqrt{d}\). First, define \(f(x) = x^2 - d\). Then the solution to \(f(x) = 0\) is \(x=\sqrt{d}\) (provided \(x \geq 0\)). We have \(f(0) = -d < 0\), \(f'(x) = 2x > 0\) for \(x > 0\), and \(f''(x) = 2 > 0\) for all \(x\). By choosing a sufficiently small \(a > 0\) and a sufficiently large \(b\), \(f(x)\) satisfies the conditions required for Newton's method. If we define the sequence \(\{c_n\}\) by
\[ \begin{eqnarray} c_1 &=& b,\\ c_{n+1} &=& c_n - \frac{c_n^2 - d}{2c_n} = \frac{1}{2}\left(c_n + \frac{d}{c_n}\right), \end{eqnarray} \]
we have \(\lim_{n\to\infty}c_n = \sqrt{d}\). □
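This recurrence can be sketched directly in Python; the function name `newton_sqrt`, the tolerance, and the choices \(d = 2\), \(b = 2\) below are illustrative.

```python
def newton_sqrt(d, b, tol=1e-12, max_iter=100):
    """Approximate sqrt(d) via c_{n+1} = (c_n + d/c_n)/2, starting from c_1 = b."""
    c = b
    for _ in range(max_iter):
        c_next = 0.5 * (c + d / c)
        if abs(c_next - c) < tol:
            return c_next
        c = c_next
    return c

# Approximate sqrt(2), starting from c_1 = b = 2:
# the iterates are 2, 1.5, 1.41666..., 1.41421568..., ...
approx = newton_sqrt(2, 2.0)
```

Each step roughly doubles the number of correct digits, so only a handful of iterations are needed; this recurrence is the classical Babylonian method for square roots.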
