Contraction mapping principle

The contraction mapping principle, also known as Banach's fixed-point theorem, is a simple yet remarkably useful result. It says: given a "contraction" map \(F: S \to S\), where \(S\) is a nonempty closed subset of a Banach space, the equation \[x = F(x)\] has a unique solution \(x \in S\).
Let's prove it. But before that, we need some preparation.

Definition (Norm)

Let \(X\) be a vector space over the field \(K\). The function \(\| \cdot \|: X \to \mathbb{R}\) is said to be a norm if it satisfies the following axioms:

  1. For all \(u \in X\), \(\|u\| \geq 0\). Moreover, \[\|u\| = 0 \iff u = 0.\]
  2. For all \(\alpha \in K\) and \(u \in X\), \[\|\alpha u \| = |\alpha|\|u \|.\]
  3. (Triangle inequality) For all \(u, v\in X\), \[\|u + v\| \leq \|u\| + \|v\|.\]

Example. For \(x = (x_1, x_2, \cdots, x_n) \in \mathbb{R}^n\), 
\[\|x\| = \sqrt{x_1^2 + x_2^2 + \cdots + x_n^2}\]
is a norm. This norm is called the Euclidean norm or \(L^2\) norm. □

Example. For \(x = (x_1, x_2, \cdots, x_n) \in \mathbb{R}^n\), 
\[\|x\|_1 = |x_1| + |x_2 | + \cdots + |x_n|\]
is a norm (the \(L^1\) norm). □
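
As a quick numerical check of these two formulas, here is a minimal Python sketch (added for illustration; the helper names `l2_norm` and `l1_norm` are my own, and only the standard library is used):

```python
import math

def l2_norm(x):
    # Euclidean (L^2) norm: square root of the sum of squared components
    return math.sqrt(sum(xi ** 2 for xi in x))

def l1_norm(x):
    # L^1 norm: sum of the absolute values of the components
    return sum(abs(xi) for xi in x)

x = (3.0, -4.0)
print(l2_norm(x))  # 5.0
print(l1_norm(x))  # 7.0
```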

Definition (Normed space)

A vector space on which a norm is defined is called a normed space.

Example. \(\mathbb{R}^n\) with the Euclidean norm is a normed space called the (\(n\)-dimensional) Euclidean space. (Actually, we also need to define the Euclidean distance induced by the Euclidean norm: \(d(x,y) = \|x - y\|\).) □


Definition (Complete space)

A normed space \(X\) in which every Cauchy sequence converges (to a limit in \(X\)) is said to be complete. (Recall that a sequence \(\{x_n\}\) in \(X\) is a Cauchy sequence if \(\|x_m - x_n\| \to 0\) as \(m, n \to \infty\).)

Example. The Euclidean space \(\mathbb{R}^n\) is complete. □

Definition (Banach space)

A complete normed space is called a Banach space.

Lemma 1 (Absolutely convergent series and completeness)

Let \(X\) be a Banach space. The series

\[\sum_{n=1}^{\infty}x_n = x_1 + x_2 + \cdots + x_n + \cdots\]

converges in \(X\) if

\[\sum_{n=1}^{\infty}\|x_n\| < +\infty.\tag{Eq:AbsConv}\]

Proof. Let

\[S_n = x_1 + x_2 +\cdots + x_n.\]

Suppose \(m > n\). Then, by the triangle inequality and the assumption (Eq:AbsConv),

\[\|S_m - S_n\| = \left\|\sum_{k=n+1}^{m}x_k\right\| \leq \sum_{k=n+1}^{m}\|x_k\| \to 0\]

as \(m, n \to \infty\), since the partial sums of the convergent real series (Eq:AbsConv) form a Cauchy sequence. Thus, \(\{S_n\}\) is a Cauchy sequence in \(X\). Since \(X\) is complete, \(S_n\) converges in \(X\). ■
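
As a concrete illustration (my own example, not from the lemma), take \(x_n = (2^{-n}, (-1)^n/n^2) \in \mathbb{R}^2\). Then \(\sum \|x_n\| \leq \sum 2^{-n} + \sum n^{-2} < \infty\), so Lemma 1 guarantees that the series converges. A short Python sketch checking this numerically:

```python
import math

def norm(v):
    # Euclidean norm in R^2
    return math.hypot(v[0], v[1])

def term(n):
    # n-th term of the vector series in R^2
    return (1.0 / 2 ** n, (-1) ** n / n ** 2)

S = (0.0, 0.0)    # partial sum S_n of the vectors
norm_sum = 0.0    # partial sum of the norms, sum ||x_n||
for n in range(1, 2001):
    x_n = term(n)
    S = (S[0] + x_n[0], S[1] + x_n[1])
    norm_sum += norm(x_n)

print(norm_sum)   # finite (bounded by 1 + pi^2/6, about 2.64): absolute convergence
print(S)          # close to the limit (1, -pi^2/12)
print((1.0, -math.pi ** 2 / 12))
```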

Definition (Fixed point)

Let \(F\) be a mapping whose domain and codomain are subsets of \(X\) (i.e., \(\text{dom}F, \text{cod}F \subset X\)). A point \(x \in \text{dom}F\) is said to be a fixed point of \(F\) if

\[x = F(x).\]

Definition (Contraction)

Let \(X\) be a normed space. A mapping \(F: \text{dom} F \to \text{cod} F\) (\(\text{dom} F, \text{cod} F \subset X\)) is said to be a contraction mapping or contraction if there exists a constant \(r \in [0, 1)\) such that, for all \(x_1, x_2 \in \text{dom} F\),

\[\|F(x_1) - F(x_2)\| \leq r\|x_1 - x_2\|.\]
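
As a simple illustration (an example added here, not part of the original definition), \(F(x) = \tfrac{1}{2}x + 1\) on \(\mathbb{R}\) is a contraction with \(r = \tfrac{1}{2}\), since \(|F(x_1) - F(x_2)| = \tfrac{1}{2}|x_1 - x_2|\). A minimal Python sketch checking the inequality on a few random pairs:

```python
import random

def F(x):
    # F(x) = x/2 + 1 is a contraction on R with constant r = 1/2
    return 0.5 * x + 1.0

r = 0.5
random.seed(0)
for _ in range(5):
    x1 = random.uniform(-10, 10)
    x2 = random.uniform(-10, 10)
    # |F(x1) - F(x2)| <= r * |x1 - x2| should hold for every pair
    print(abs(F(x1) - F(x2)) <= r * abs(x1 - x2) + 1e-12)
```

Its unique fixed point is \(x = 2\) (solve \(x = x/2 + 1\)), which the theorem below recovers.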


Lemma 2

A contraction is continuous.

Proof. Let \(\varepsilon > 0\) be given and take \(\delta = \varepsilon\). If \(\|x - x'\| < \delta\), then

\[\|F(x) - F(x')\| \leq r\|x - x'\| \leq r\varepsilon < \varepsilon\]

since \(0 \leq r < 1\). Hence \(F\) is continuous (in fact, uniformly continuous). ■

Theorem (Contraction mapping principle)

Let \(X\) be a Banach space and \(S\) be a nonempty closed subset of \(X\). If a mapping \(F: S \to S\) is a contraction mapping, then there exists a unique fixed point of \(F\) in \(S\).

Proof. First, we prove the uniqueness of the fixed point. Let \(x, x' \in S\) be fixed points of \(F\). That is, \(x = F(x)\) and \(x' = F(x')\). Thus,

\[\|x - x'\| = \|F(x) - F(x')\|.\]

Since \(F\) is a contraction, there exists an \(r \in [0, 1)\) such that

\[\|F(x) - F(x')\| \leq r\|x - x'\|.\]

Therefore, we have

\[\|x - x'\| \leq r\|x - x'\|,\]

and hence,

\[0 \leq (1 - r)\|x - x'\| \leq 0.\]

Since \(1 - r > 0\), we have \(\|x - x'\| = 0\). Thus, \(x = x'\).


Next, we prove the existence of the fixed point.  We construct the solution iteratively. First, take an arbitrary \(x_0\in S\), and define \(x_n (n = 1, 2, \cdots)\) by the following recurrence equation:

\[x_n = F(x_{n-1})\tag{Eq:rec}\]

for \(n = 1, 2, \cdots\). Clearly, \(x_n \in S\) for all \(n \in \mathbb{N}\). Next, we have

\[\begin{eqnarray} \|x_{n+1} - x_{n}\| &=& \|F(x_n) - F(x_{n-1})\| \leq r\|x_{n} - x_{n-1}\|\\ &\leq& r^2\|x_{n-1} - x_{n-2}\|\\ &\vdots &\\ &\leq& r^n\|x_1 - x_0\|.\tag{Eq:xs} \end{eqnarray}\]

Note that 

\[\begin{eqnarray}x_{n+1} &=& (x_{n+1} - x_n) + (x_n - x_{n-1}) + \cdots + (x_1 - x_0) + x_0 \\ &=& x_0 + \sum_{i = 0}^{n}(x_{i+1} - x_{i}).\tag{Eq:XSeries}\end{eqnarray}\]

From (Eq:xs), we have

\[\begin{eqnarray} \|x_0\|+\sum_{n=0}^{\infty}\|x_{n+1}-x_{n}\| &\leq& \|x_0\| + \sum_{n=0}^{\infty}r^{n}\|x_1 - x_0\|\\ &=& \|x_0\| + \frac{\|x_1 - x_0\|}{1 - r} < \infty \end{eqnarray}\]

as \(0 \leq r < 1\). Thus, the series on the right-hand side of (Eq:XSeries) converges absolutely, and hence it converges by Lemma 1. By the definition of the series (Eq:XSeries), this means that the sequence \(\{x_n\}\) converges, that is, \(x^{*} = \lim_{n\to \infty}x_n\) exists. Since every \(x_n \in S\) and \(S\) is closed, \(x^{*} \in S\). Finally, a contraction is continuous (Lemma 2), so letting \(n \to \infty\) on both sides of (Eq:rec) gives

\[x^{*} = F(x^{*}),\]

that is, \(x^{*}\) is a fixed point of \(F\). ■
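
The existence proof is constructive: the recurrence (Eq:rec) is exactly the fixed-point iteration. Here is a minimal Python sketch of that iteration (my own example), applied to \(F(x) = \cos x\) on \(S = [0, 1]\); \(F\) maps \([0,1]\) into itself, and by the mean value theorem it is a contraction there with \(r = \sin 1 < 1\):

```python
import math

def fixed_point(F, x0, tol=1e-12, max_iter=1000):
    # Fixed-point iteration x_n = F(x_{n-1}) from the existence proof
    x = x0
    for _ in range(max_iter):
        x_next = F(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# F(x) = cos(x) is a contraction on [0, 1] (|cos'(x)| = |sin(x)| <= sin(1) < 1),
# so the theorem guarantees a unique fixed point there.
x_star = fixed_point(math.cos, 0.5)
print(x_star)                          # about 0.7390851332151607
print(abs(x_star - math.cos(x_star)))  # residual, essentially zero
```

The error shrinks geometrically, consistent with the estimate \(\|x_n - x^{*}\| \leq r^n \|x_0 - x^{*}\|\) that follows from the contraction property.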

