More on the limit of functions

There are a few variations on the notion of the limit of a function.



Definition (Divergence of a function)

If the function \(f(x)\) does not converge to any value as \(x \to a\), we say that \(f(x)\) diverges as \(x \to a\). In particular, if the value of \(f(x)\) increases without bound as \(x \to a\), we say that \(f(x)\) diverges to positive infinity and write

\[\lim_{x\to a}f(x) = \infty.\]

If the value of \(f(x)\) is negative and its absolute value increases without bound, we say that \(f(x)\) diverges to negative infinity and write

\[\lim_{x\to a}f(x) = -\infty.\]

Example. Consider the function

\[f(x) = \frac{1}{(x - 1)^2}\]

defined on \(\mathbb{R}\setminus\{1\}\). We have

\[\lim_{x \to 1}f(x) = \infty.\]

You should verify this by drawing a graph. □
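As a quick numerical illustration (a minimal Python sketch, not part of the original argument), evaluating \(f\) at points approaching \(1\) shows the values growing without bound:

```python
# Minimal numerical check: f(x) = 1/(x - 1)^2 grows without bound near x = 1.
def f(x):
    return 1.0 / (x - 1.0) ** 2

for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    print(f"f(1 + {h}) = {f(1 + h):.3e},   f(1 - {h}) = {f(1 - h):.3e}")
# The values grow like 1e2, 1e4, 1e6, 1e8, consistent with lim_{x -> 1} f(x) = infinity.
```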

There are also a few variants of the notion of limit.

Definition (Left and right limits)

  1. We define the left limit of the function \(f(x)\) at \(x=a\), \[\lim_{x \to a -0}f(x) = \alpha,\] if the following is satisfied:
    • For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that for any \(x\in\text{dom}(f)\), if \(0 < a - x < \delta\) then \(|f(x) - \alpha| < \varepsilon\).
    This means \(x\) approaches \(a\) from the left (so "\(x < a\)" is kept).
  2. We define the right limit of the function \(f(x)\) at \(x=a\), \[\lim_{x \to a +0}f(x) = \alpha,\] if the following is satisfied:
    • For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that for any \(x\in\text{dom}(f)\), if \(0 < x - a < \delta\) then \(|f(x) - \alpha| < \varepsilon\).
    This means \(x\) approaches \(a\) from the right (so "\(a < x\)" is kept).
Remark. We write \(x \to +0\) instead of \(x \to 0+0\). Similarly, we write \(x\to -0\) rather than \(x \to 0-0\). □

Example. Consider the function 
\[f(x) = \frac{x(x + 1)}{|x|}\]
defined on \(\mathbb{R}\setminus\{0\}\). 
If \(x > 0\), we have
\[f(x) = \frac{x(x + 1)}{x} = x + 1\]
so that
\[\lim_{x \to +0}f(x) = 1.\]
If \(x < 0\), we have
\[f(x) = \frac{x(x + 1)}{-x} = -x - 1\]
so that
\[\lim_{x \to -0}f(x) = -1.\]
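A quick numerical check (a small Python sketch, only restating the computation above) makes the jump at \(0\) visible:

```python
# f(x) = x(x + 1)/|x| has right limit 1 and left limit -1 at x = 0.
def f(x):
    return x * (x + 1) / abs(x)

for h in [1e-1, 1e-3, 1e-6]:
    print(f"f(+{h}) = {f(h):+.6f},   f(-{h}) = {f(-h):+.6f}")
# f(+h) -> +1 and f(-h) -> -1, matching the one-sided limits computed above.
```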

Example. For the function \(f(x) = \frac{1}{x}\) defined on \(\mathbb{R}\setminus\{0\}\), we have
\[ \begin{eqnarray*} \lim_{x\to +0}f(x) &=& \infty,\\ \lim_{x\to -0}f(x) &=& -\infty. \end{eqnarray*} \]

Definition (Limits at \(\pm \infty\))

  1. We define the limit of \(f(x)\) as \(x \to \infty\), \[\lim_{x \to \infty}f(x) = \alpha,\] if the following is satisfied:
    • For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that for any \(x\in\text{dom}(f)\), if \(x > \delta\) then \(|f(x) - \alpha| < \varepsilon\).
  2. We define the limit of \(f(x)\) as \(x \to -\infty\), \[\lim_{x \to -\infty}f(x) = \alpha,\] if the following is satisfied:
    • For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that for any \(x\in\text{dom}(f)\), if \(x < -\delta\) then \(|f(x) - \alpha| < \varepsilon\).

Theorem

Let \(f(x)\) be a function defined on the open interval \(I = (a, b)\) \((a < b)\) such that \(\lim_{x\to a+0}f(x) = \alpha\) \((\alpha \in\mathbb{R})\).
  1. For a given \(c\in\mathbb{R}\), if \(f(x) \geq c\) for all \(x\in I\), then \(\alpha \geq c\).
  2. For a given \(d\in\mathbb{R}\), if \(f(x) \leq d\) for all \(x\in I\), then \(\alpha \leq d\).
Proof. We prove part 1 by contradiction; part 2 is similar.
Suppose \(\alpha < c\) and let \(\varepsilon = c - \alpha > 0\). Since \(\lim_{x\to a+0}f(x) = \alpha\), we can find \(\delta > 0\) such that, for all \(x\in I\), \(0 < x - a < \delta\) implies \(|f(x) - \alpha| < \varepsilon\). Pick any \(x\in I\) with \(0 < x - a < \delta\) (such an \(x\) exists because \(a < b\)). Then \(f(x) < \alpha + \varepsilon = c\), which contradicts the assumption \(f(x) \geq c\). ■
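As a side remark (not part of the original), the inequality in part 1 cannot be strengthened to a strict one: for \(f(x) = x - a\) on \(I = (a, b)\) we have \(f(x) > 0\) for every \(x \in I\), yet
\[\lim_{x\to a+0}f(x) = 0,\]
so a strict inequality \(f(x) > c\) on \(I\) only guarantees \(\alpha \geq c\).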

Theorem

Let \(f(x)\) be a function defined on an interval that includes \(x = a\).
  1. If \(\lim_{x\to a}f(x) = \alpha\) exists, then the corresponding left and right limits also exist and \(\lim_{x\to a+0}f(x) = \lim_{x\to a-0}f(x) = \alpha\).
  2. If both \(\lim_{x\to a+0}f(x)\) and \(\lim_{x\to a-0}f(x)\) exist and \(\lim_{x\to a+0}f(x) = \lim_{x\to a-0}f(x) = \alpha\), then \(\lim_{x\to a}f(x) = \alpha\).
Proof. Exercise. ■

Example. Let us prove the following:
\[e = \lim_{x\to 0}(1 + x)^{\frac{1}{x}} = \lim_{x\to\infty}\left(1 + \frac{1}{x}\right)^x  = \lim_{x\to -\infty}\left(1 + \frac{1}{x}\right)^x.\]
Before proving this, recall that we have
\[\lim_{n\to \infty}\left(1 + \frac{1}{n}\right)^n = e\]
where \(n = 1, 2, 3, \cdots\) (i.e., natural numbers).

In order to show \(e = \lim_{x\to 0}(1 + x)^{\frac{1}{x}}\), we need to show \(e = \lim_{x\to +0}(1 + x)^{\frac{1}{x}}\) and \(e = \lim_{x\to -0}(1 + x)^{\frac{1}{x}}\). By changing variables \(x = \frac{1}{y}\), \(\lim_{x\to +0}(1 + x)^{\frac{1}{x}} = \lim_{y\to\infty}\left(1 + \frac{1}{y}\right)^y\) and \(\lim_{x\to -0}(1 + x)^{\frac{1}{x}} = \lim_{y\to -\infty}\left(1 + \frac{1}{y}\right)^y\). Furthermore,  by changing variables \(y = -t\), assuming \(t  > 1\), we have
\[ \begin{eqnarray*} \lim_{y\to -\infty}\left(1 + \frac{1}{y}\right)^y &=& \lim_{t\to \infty}\left(1 - \frac{1}{t}\right)^{-t}\\ &=& \lim_{t\to \infty}\left(\frac{t}{t-1}\right)^{t}\\ &=& \lim_{t\to \infty}\left(\frac{t}{t-1}\right)^{t-1}\left(\frac{t}{t-1}\right). \end{eqnarray*} \]
Since \(\frac{t}{t-1} = 1 + \frac{1}{t-1}\) and \(\lim_{t\to\infty}\frac{t}{t-1} = 1\), it suffices to show that \(e = \lim_{x\to\infty}\left(1 + \frac{1}{x}\right)^{x}\).

For any \(x \geq 1\), we can find \(n\in\mathbb{N}\) (namely \(n = \lfloor x\rfloor\)) such that
\[n\leq x \leq n+1.\]
Then we have
\[1 < 1 + \frac{1}{n+1} \leq 1 + \frac{1}{x} \leq 1 + \frac{1}{n}.\]
And hence,
\[\left(1 + \frac{1}{n+1}\right)^{n} \leq \left(1 + \frac{1}{x}\right)^{x} \leq \left(1 + \frac{1}{n}\right)^{n+1}.\]
Now,
\[\lim_{n\to\infty}\left(1 + \frac{1}{n+1}\right)^{n} = \lim_{n\to\infty}\frac{\left(1 + \frac{1}{n+1}\right)^{n+1}}{\left(1 + \frac{1}{n+1}\right)} = e,\]
and
\[\lim_{n\to\infty}\left(1 + \frac{1}{n}\right)^{n+1} = \lim_{n\to\infty}\left(1 + \frac{1}{n}\right)^{n}\left(1 + \frac{1}{n}\right) = e.\]
Therefore, since \(n\to\infty\) as \(x\to\infty\), the Squeeze Theorem gives
\[\lim_{x\to\infty}\left(1 + \frac{1}{x}\right)^{x} = e.\]
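A quick numerical sanity check (a small Python sketch, not part of the proof) of the limits in this example:

```python
import math

# (1 + 1/x)^x approaches e both as x -> +infinity and as x -> -infinity.
for x in [1e2, 1e4, 1e6]:
    pos = (1 + 1 / x) ** x           # x -> +infinity
    neg = (1 - 1 / x) ** (-x)        # corresponds to x -> -infinity
    print(f"x = {x:.0e}:  {pos:.8f}  {neg:.8f}   (e = {math.e:.8f})")
# By the substitution x = 1/y used above, the same values also illustrate
# lim_{y -> 0} (1 + y)^(1/y) = e.
```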

In a previous post, we saw the Cauchy criterion for the convergence of sequences (a sequence converges if and only if it is a Cauchy sequence).
A similar result holds for the convergence of functions.

Theorem (Cauchy criterion for the right limit of a function)

Let \(f(x)\) be a function on \(I = (a,b]\). The right limit \(\lim_{x\to a+0}f(x)\) exists if and only if the following condition is satisfied:
  • For any \(\varepsilon > 0\), there exists a \(\delta > 0\) such that for all \(x, y\in I\), if \(a < x < a + \delta\) and \(a < y < a + \delta\) then \(|f(x) - f(y)| < \varepsilon\).     (\(\star\))
In a logical form:
\[\forall \varepsilon > 0, \exists \delta > 0, \forall x,y\in I ~ (x,y \in (a,a+\delta) \implies |f(x) - f(y)|< \varepsilon).\]
Remark. Similarly, the left limit \(\lim_{x\to b-0}f(x)\) exists if and only if
  • For any \(\varepsilon > 0\), there exists a \(\delta > 0\) such that for all \(x, y\in I\), if \(b - \delta < x < b\) and \(b -\delta < y < b\) then \(|f(x) - f(y)| < \varepsilon\).
The limit \(\lim_{x\to\infty}f(x)\) exists (for \(f(x)\) defined on an interval unbounded above) if and only if
  • For any \(\varepsilon > 0\), there exists a \(\delta > 0\) such that for all \(x, y\in\text{dom}(f)\), if \(x > \delta\) and \(y > \delta\) then \(|f(x) - f(y)| < \varepsilon\).
The limit \(\lim_{x\to-\infty}f(x)\) exists (for \(f(x)\) defined on an interval unbounded below) if and only if
  • For any \(\varepsilon > 0\), there exists a \(\delta > 0\) such that for all \(x, y\in\text{dom}(f)\), if \(x < -\delta\) and \(y < -\delta\) then \(|f(x) - f(y)| < \varepsilon\).
Proof. (\(\Rightarrow\)) Suppose \(\lim_{x\to a+0}f(x)\) exists. Then there exists some real number \(\alpha\) such that \(\lim_{x\to a+0}f(x) = \alpha\). For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that, for all \(x\in I\), if \(a < x < a + \delta\) then \(|f(x) - \alpha| < \frac{\varepsilon}{2}\). Let \(x, y \in I\) be arbitrary points such that \(a < x < a + \delta\) and \(a < y < a + \delta\). Then
\[ \begin{eqnarray*} |f(x) - f(y)| & = & |f(x) - \alpha + \alpha - f(y)|\\ &\leq &|f(x) - \alpha| + |f(y) - \alpha|\\ &<& \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon. \end{eqnarray*} \]
Therefore the condition (\(\star\)) is satisfied.

(\(\Leftarrow\)) Suppose (\(\star\)) is satisfied. Let us define a sequence \(\{a_n\}\) by \(a_n = a + \frac{b-a}{n}\), \(n\in\mathbb{N}\). For any \(\varepsilon > 0\), choose a \(\delta > 0\) satisfying (\(\star\)). By Archimedes' principle, we can find an \(N\in \mathbb{N}\) such that \(N\delta > b - a\). Then for any \(n \geq N\), \(a_n - a = \frac{b-a}{n} \leq \frac{b-a}{N} < \delta\), so that \(a < a_n < a + \delta\). Therefore, by (\(\star\)), for all \(n, m \geq N\) we have \(|f(a_n) - f(a_m)| < \varepsilon\), so \(\{f(a_n)\}\) is a Cauchy sequence, and hence it converges: \(\lim_{n\to\infty}f(a_n) = \alpha\) for some \(\alpha \in \mathbb{R}\).

Now, let us show that \(\lim_{x\to a+0}f(x) = \alpha\). By (\(\star\)), for any \(\varepsilon > 0\), we can find \(\delta > 0\) such that for all \(x,y \in (a, a+\delta)\), \(|f(x) - f(y)|< \frac{\varepsilon}{2}\). Also, since \(\lim_{n\to\infty}f(a_n) = \alpha\), we can find an \(N\in\mathbb{N}\) such that for all \(n \geq N\), \(|f(a_n) - \alpha| < \frac{\varepsilon}{2}\). If \(a < x < a + \delta\), choose \(n \geq N\) such that \(a < a_n < a+\delta\) (possible since \(a_n \to a\)). Then
\[ \begin{eqnarray*} |f(x) - \alpha| &=& |f(x) - f(a_n) + f(a_n) - \alpha|\\ &\leq&|f(x) - f(a_n)| + |f(a_n) - \alpha|\\ &<& \frac{\varepsilon}{2} + \frac{\varepsilon}{2} = \varepsilon. \end{eqnarray*} \]
Therefore,
\[\lim_{x\to a+0}f(x) = \alpha.\]
■
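To get a feel for condition (\(\star\)), here is a rough numerical sketch in Python; the two test functions are illustrative choices, not from the original. For \(f(x) = x\sin(1/x)\), which has right limit \(0\) at \(x = 0\), the oscillation \(\sup_{x,y\in(0,\delta)}|f(x) - f(y)|\) shrinks as \(\delta \to 0\); for \(g(x) = \sin(1/x)\), which has no right limit at \(0\), it does not.

```python
import math

# Estimate the oscillation sup |f(x) - f(y)| over (0, delta) by sampling.
def oscillation(func, delta, samples=10000):
    values = [func(delta * (k + 1) / samples) for k in range(samples)]
    return max(values) - min(values)

def f(x):
    return x * math.sin(1 / x)   # has right limit 0 at x = 0

def g(x):
    return math.sin(1 / x)       # has no right limit at x = 0

for delta in [1e-1, 1e-2, 1e-3]:
    print(f"delta = {delta:.0e}:  osc(f) = {oscillation(f, delta):.4f},"
          f"  osc(g) = {oscillation(g, delta):.4f}")
# osc(f) tends to 0 (condition (*) holds), while osc(g) stays close to 2
# (condition (*) fails), matching the Cauchy criterion.
```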

