A continuous function on a closed interval is uniformly continuous

The notion of uniform continuity is a "stronger" version of (simple) continuity: every uniformly continuous function is continuous, but the converse does not hold in general (a continuous function need not be uniformly continuous). However, a continuous function on a closed interval is always uniformly continuous.



Definition (Uniform continuity)

The function \(f(x)\) on an interval \(I\) is said to be uniformly continuous on \(I\) if it satisfies the following condition.

  • For any \(\varepsilon > 0\), there exists \(\delta > 0\), such that, for all \(x, y\in I\), \(|x - y| < \delta\) implies \(|f(x) - f(y)|< \varepsilon\).

  In a logical form, this condition is expressed as 

\[ \forall \varepsilon > 0, \exists \delta > 0, \forall x,y\in I ~ (|x-y| < \delta \implies |f(x) - f(y)| < \varepsilon).\label{eq:unifcont} \]

Remark. Compare the above condition for uniform continuity with the condition for a function to be continuous on \(I\):

\[ \forall y\in I, \forall \varepsilon > 0, \exists \delta > 0, \forall x \in I ~ (|x - y| < \delta \implies |f(x) - f(y)| < \varepsilon). \label{eq:contI} \]

What's the difference? In uniform continuity, \(\delta\) does not depend on \(x\) or \(y\), whereas in (ordinary) continuity on an interval, \(\delta\) may depend on \(y\). The latter means that we may need to choose different values of \(\delta\) depending on where we are in the interval \(I\). In uniform continuity, on the other hand, we can choose a single \(\delta\) that works everywhere in \(I\). In this sense, uniform continuity imposes a more stringent condition on the function. □
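To see this dependence concretely, the following sketch (a numerical illustration, not part of the formal development) takes \(f(x) = x^2\) and computes, for a fixed \(\varepsilon\), the largest \(\delta\) that works at a base point \(y \geq 0\): the worst case is \(x\) close to \(y + \delta\), where \(2y\delta + \delta^2 = \varepsilon\), giving \(\delta = \sqrt{y^2 + \varepsilon} - y\). The admissible \(\delta\) shrinks as \(y\) grows, so no single \(\delta\) works on all of \(\mathbb{R}\).

```python
import math

def largest_delta(y, eps):
    """Largest delta such that |x - y| < delta implies |x^2 - y^2| < eps,
    for f(x) = x^2 at a base point y >= 0 (worst case: x close to y + delta,
    where 2*y*delta + delta**2 = eps)."""
    return math.sqrt(y * y + eps) - y

eps = 1.0
deltas = [largest_delta(y, eps) for y in (0, 1, 10, 100)]
# The admissible delta shrinks as y grows: no single delta > 0 works on all of R.
assert all(d1 > d2 for d1, d2 in zip(deltas, deltas[1:]))
```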

Example. The function \(f(x) = x^2\) is continuous on the entire \(\mathbb{R}\), but it is not uniformly continuous on \(\mathbb{R}\). To see this, let \(\varepsilon = 2\). If \(f(x)\) were uniformly continuous, we could choose some \(\delta\) such that for all \(x,y\in\mathbb{R}\), \(|x - y|< \delta\) implies \(|x^2 - y^2| < 2\). Now, let us pick a sufficiently large natural number \(n\) such that \(\frac{1}{n} < \delta\) (which is always possible), and let \(x = n + \frac{1}{n}\) and \(y = n\). Then \(|x - y| = \frac{1}{n} < \delta\), but \(|x^2 - y^2| = 2 + \frac{1}{n^2} > 2\), which is a contradiction. □
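The pair \(x = n + \frac{1}{n}\), \(y = n\) from the argument above can be checked directly. Exact rational arithmetic (Python's `fractions` module) avoids any floating-point rounding in the comparison; \(n = 1000\) is an arbitrary choice:

```python
from fractions import Fraction

n = 1000
x = Fraction(n) + Fraction(1, n)  # x = n + 1/n
y = Fraction(n)                   # y = n
assert x - y == Fraction(1, n)               # the gap 1/n is tiny...
assert x**2 - y**2 == 2 + Fraction(1, n**2)  # ...but f-values still differ by more than 2
```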

Example. Consider again the function \(f(x) = x^2\), but this time we restrict its domain to a bounded closed interval \([a,b]\). Then this function is uniformly continuous on \([a,b]\). Let \(c = \max\{|a|, |b|\}\) (we may assume \(c > 0\); otherwise \([a,b] = \{0\}\) and there is nothing to prove). For any \(\varepsilon > 0\), let \(\delta = \frac{\varepsilon}{2c}\). Then for all \(x,y\in [a,b]\), if \(|x - y|< \delta\), noting \(|x|, |y| \leq c\),
\[|x^2 - y^2| = |x - y||x + y| < \delta\cdot|x+y| \leq \delta\cdot(|x|+|y|)\leq \delta\cdot 2c = \varepsilon.\] □
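The choice \(\delta = \frac{\varepsilon}{2c}\) can be sanity-checked numerically. In the sketch below, the interval \([-3, 5]\) and \(\varepsilon = 0.1\) are arbitrary test values; we sample many pairs \(x, y\) with \(|x - y| < \delta\) and record the largest value of \(|f(x) - f(y)|\):

```python
import random

def f(x):
    return x * x

a, b = -3.0, 5.0      # arbitrary bounded closed interval
c = max(abs(a), abs(b))
eps = 0.1
delta = eps / (2 * c)  # the delta from the argument above

random.seed(0)
worst = 0.0
for _ in range(10_000):
    x = random.uniform(a, b)
    h = 0.99 * random.uniform(-delta, delta)  # step strictly inside (-delta, delta)
    y = min(b, max(a, x + h))                 # keep y in [a, b]
    worst = max(worst, abs(f(x) - f(y)))

assert worst < eps  # |f(x) - f(y)| stays below epsilon, as claimed
```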

More generally, we have the following theorem.

Theorem

A continuous function on a closed interval is uniformly continuous.
Proof. We prove this by contradiction. Suppose that a function \(f(x)\) continuous on the closed interval \([a,b]\) is not uniformly continuous. Then the following holds (the negation of the condition for uniform continuity):
  • There exists some \(\varepsilon > 0\) such that, for all \(\delta > 0\), there exist \(x,y \in [a,b]\) such that \(|x - y| < \delta\) and \(|f(x) - f(y)| \geq \varepsilon\).
Now, for each \(n\in \mathbb{N}\), taking \(\delta = \frac{1}{n}\), we can find \(x_n, y_n\in [a,b]\) such that
\[ |x_n - y_n|< \frac{1}{n} \text{ and } |f(x_n) - f(y_n)| \geq \varepsilon. \]

Thus we obtain sequences \(\{x_n\}\) and \(\{y_n\}\) in \([a,b]\); in particular, both sequences are bounded.

  By the Bolzano-Weierstrass theorem, \(\{x_n\}\) contains a subsequence \(\{x_{n_k}\}\) that converges to some \(\alpha \in [a,b]\). Using the same indices \(n_1, n_2,\cdots\), we can define a subsequence \(\{y_{n_k}\}\) of \(\{y_n\}\). In turn, \(\{y_{n_k}\}\) contains a further subsequence that converges to some \(\beta\in[a,b]\); by reindexing, let us denote this convergent subsequence by \(\{y_{n_k}\}\). After this reindexing, \(\{x_{n_k}\}\) still converges to \(\alpha\) (a subsequence of a convergent sequence converges to the same limit).

  Then \(x_{n_k} - y_{n_k}\) converges to \(\alpha - \beta\). By construction, \(-\frac{1}{n_k} < x_{n_k} - y_{n_k} < \frac{1}{n_k}\) for every \(k\), and \(n_k \to \infty\) as \(k \to \infty\), so \(\alpha - \beta = 0\), that is, \(\alpha = \beta\).

  On the one hand, since \(f(x)\) is continuous, \(\lim_{k\to\infty}f(x_{n_k}) = \lim_{k \to\infty}f(y_{n_k}) = f(\alpha)\). On the other hand, \(|f(x_{n_k}) - f(y_{n_k})| \geq \varepsilon\) for every \(k\), so \(\lim_{k\to \infty}|f(x_{n_k}) - f(y_{n_k})| \geq \varepsilon\). Hence we have

\[0 = |f(\alpha) - f(\alpha)| = \lim_{k\to \infty}|f(x_{n_k}) - f(y_{n_k})| \geq \varepsilon > 0\]

which is a contradiction. ■
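The contradiction hinges on \(|f(x_n) - f(y_n)|\) staying at least \(\varepsilon\) while \(|x_n - y_n| < \frac{1}{n} \to 0\). For a function that really is continuous on a closed interval, the analogous quantity must shrink to \(0\). The sketch below (with \(\sin\) on the arbitrary test interval \([0, 10]\)) approximates \(\sup\{|f(x) - f(y)| : x, y \in [a,b],\ |x - y| < \frac{1}{n}\}\) on a grid and shows it decreasing as \(n\) grows:

```python
import math

def max_gap(f, a, b, delta, m=2000):
    """Approximate sup |f(x) - f(y)| over x, y in [a, b] with |x - y| < delta,
    by scanning a grid and pairing each x with y = x + delta/2
    (a numerical sketch, not a proof)."""
    best = 0.0
    for i in range(m + 1):
        x = a + (b - a) * i / m
        y = min(b, x + 0.5 * delta)
        best = max(best, abs(f(x) - f(y)))
    return best

# As delta = 1/n shrinks, the largest function gap shrinks toward 0,
# consistent with uniform continuity of sin on the closed interval [0, 10].
gaps = [max_gap(math.sin, 0.0, 10.0, 1.0 / n) for n in (1, 10, 100)]
assert gaps[0] > gaps[1] > gaps[2]
```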
