Calculus of power series

Functions defined by power series

If the power series \(\sum_{n=0}^{\infty}a_nx^n\) has radius of convergence \(r > 0\), then it defines a function \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) on the open interval \((-r, r)\).
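For example, the geometric series \(\sum_{n=0}^{\infty}x^n\) has radius of convergence \(1\) and defines \(f(x) = \frac{1}{1-x}\) on \((-1, 1)\). The following Python snippet is a minimal numerical sketch (the helper `partial_sum` is defined only for this illustration) comparing partial sums of this series with the closed form.

```python
def partial_sum(x, n):
    """Partial sum f_n(x) = sum_{k=0}^{n} x^k of the geometric series."""
    return sum(x**k for k in range(n + 1))

# Inside the interval of convergence (-1, 1), the partial sums approach 1/(1-x).
for x in [-0.9, -0.5, 0.0, 0.5, 0.9]:
    print(f"x = {x:+.1f}: f_200(x) = {partial_sum(x, 200):.10f}, 1/(1-x) = {1/(1 - x):.10f}")
```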



Theorem (Continuous power series)

Let \(\sum_{n=0}^{\infty}a_nx^n\) be a power series with radius of convergence \(r > 0\). Then, the function \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) is continuous on \((-r, r)\).

Proof. It suffices to show that \(f(x)\) is continuous on the open interval \(I = (-s, s)\), where \(s\) is an arbitrary real number such that \(0 < s < r\). Let \(t = \frac{r + s}{2}\). Then \(s < t < r\), and the power series \(\sum_{n=0}^{\infty}a_nx^n\) converges absolutely at \(x = t\).

Step 1. For the partial sum \(f_n(x) = \sum_{k=0}^{n}a_kx^k\), we show the following:

  • (*) For any \(\varepsilon > 0\), there exists an \(N\in\mathbb{N}_0\) such that, for all \(n\in\mathbb{N}_0\) and all \(x\in I\), if \(n \geq N\), then \(|f(x) - f_n(x)| < \varepsilon\).
(In other words, \(\{f_n(x)\}\) converges uniformly to \(f(x)\) on \(I\).) Take an arbitrary \(\varepsilon > 0\). Since the series \(\sum_{n=0}^{\infty}|a_n|t^n\) has a sum, \(\lim_{n\to\infty}|a_n|t^n = 0\). In particular, the sequence \(\{|a_n|t^n\}\) is bounded, so there exists \(M>0\) such that \(|a_n|t^n < M\) for all \(n\). Also, since \(0 < \frac{s}{t} < 1\), the geometric series \(\sum_{n=0}^{\infty}\left(\frac{s}{t}\right)^n\) converges. Therefore, we can find a sufficiently large \(N\) such that \(\sum_{k=n+1}^{\infty}\left(\frac{s}{t}\right)^k < \frac{\varepsilon}{M}\) for all \(n \geq N\).
Noting that \(|x| < s\) for all \(x \in I = (-s, s)\), we have, for all \(n\geq N\) and \(x \in I\),
\[\begin{eqnarray*} |f(x) - f_n(x)| & = & \left|\sum_{k=n+1}^{\infty}a_kx^k\right| \leq \sum_{k=n+1}^{\infty}|a_k||x|^k\\ & = & \sum_{k=n+1}^{\infty}|a_k|t^k\left(\frac{|x|}{t}\right)^k \leq \sum_{k=n+1}^{\infty}|a_k|t^k\left(\frac{s}{t}\right)^k\\ & < & \sum_{k=n+1}^{\infty}M\left(\frac{s}{t}\right)^k < \varepsilon. \end{eqnarray*}\]
Thus, (*) holds. This means that, on \(I\), the function \(f(x)\) can be approximated by the polynomial functions \(f_n(x)\) to arbitrary precision, uniformly in \(x\); in particular, \(f(x) = \lim_{n\to\infty}f_n(x)\).
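Property (*) can also be observed numerically. The sketch below (merely an illustration, not part of the proof) uses the exponential series \(a_n = \frac{1}{n!}\), whose sum is \(e^x\) and whose radius of convergence is infinite, and estimates \(\sup_{|x|\leq s}|f(x) - f_n(x)|\) over a grid of points in \([-s, s]\); the worst-case error shrinks as \(n\) grows, uniformly over the interval.

```python
import numpy as np
from math import factorial

def f_n(x, n):
    """Partial sum of the exponential series: sum_{k=0}^{n} x^k / k!."""
    return sum(x**k / factorial(k) for k in range(n + 1))

s = 2.0                        # a closed sub-interval [-s, s] of the domain
xs = np.linspace(-s, s, 401)   # grid of sample points in [-s, s]

for n in [2, 5, 10, 15, 20]:
    # Grid approximation of sup_{|x| <= s} |f(x) - f_n(x)|.
    sup_error = max(abs(np.exp(x) - f_n(x, n)) for x in xs)
    print(f"n = {n:2d}: max error on [-{s}, {s}] is about {sup_error:.2e}")
```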

Step 2. Next, consider a sequence of functions \(\{f_n(x)\}\) that are continuous on the interval \(I\) and satisfy property (*) of Step 1. We show that the limit \(f(x) = \lim_{n\to\infty}f_n(x)\) is also continuous on \(I\).
Take an arbitrary \(a \in I\) and let \(\varepsilon > 0\). By (*), there exists some \(n\) such that \(|f(x) - f_n(x)| < \frac{\varepsilon}{3}\) for all \(x \in I\); in particular, \(|f(a) - f_n(a)| < \frac{\varepsilon}{3}\). Furthermore, since \(f_n(x)\) is continuous at \(a\), there exists some \(\delta > 0\) such that, if \(|x - a| < \delta\), then \(|f_n(x) - f_n(a)| < \frac{\varepsilon}{3}\). Combining these, for all \(x \in I\) with \(|x - a| < \delta\), we have
\[\begin{eqnarray*} |f(x) - f(a)| & \leq & |f(x) - f_n(x)| + |f_n(x) - f_n(a)| + |f_n(a) - f(a)|\\ & < & \frac{\varepsilon}{3} +\frac{\varepsilon}{3} +\frac{\varepsilon}{3} = \varepsilon. \end{eqnarray*}\]
Hence \(f(x)\) is continuous at \(a\), and since \(a \in I\) was arbitrary, \(f(x)\) is continuous on \(I\).
Now,
\[\sum_{n=0}^{\infty}a_nx^n = \lim_{n\to\infty}\sum_{k=0}^{n}a_kx^k,\]
the partial sums are polynomial functions, which are continuous everywhere, and by Step 1 the convergence satisfies (*). Therefore, by Step 2, \(f(x)\) is continuous on \(I\). ■

Term-wise integration and term-wise differentiation

Given the power series \(f(x) = \sum_{n=0}^{\infty}a_nx^n\), consider the following power series:
\[\begin{eqnarray*} F(x) &=& \sum_{n=0}^{\infty}\frac{a_n}{n+1}x^{n+1},\\ g(x) &=& \sum_{n=1}^{\infty}na_nx^{n-1}. \end{eqnarray*}\]
If \(f(x)\) is a finite sum (i.e., a polynomial function; \(\exists N\in\mathbb{N} ~ \forall n \geq N ~ (a_n = 0)\)), then \(F(x)\) and \(g(x)\) are a primitive function (the one with \(F(0) = 0\)) and the derivative of \(f(x)\), respectively. As we show below, this remains true even when \(f(x)\) is not a polynomial. That is, if \(f(x)\) has radius of convergence \(r > 0\), then, on the open interval \((-r, r)\), \(F(x)\) is a primitive function of \(f(x)\) and \(g(x)\) is the derivative of \(f(x)\). This means that we can obtain a primitive function and the derivative of a function defined by a power series by term-wise (term-by-term) integration and term-wise (term-by-term) differentiation of \(f(x)\), respectively.
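As a concrete illustration (a numerical sketch, not part of the argument below), take the geometric series \(f(x) = \sum_{n=0}^{\infty}x^n = \frac{1}{1-x}\) on \((-1, 1)\). Term-wise integration gives \(F(x) = \sum_{n=0}^{\infty}\frac{x^{n+1}}{n+1} = -\log(1-x)\), and term-wise differentiation gives \(g(x) = \sum_{n=1}^{\infty}nx^{n-1} = \frac{1}{(1-x)^2}\); the script below checks both identities at a sample point.

```python
import math

x = 0.5        # a point inside the interval of convergence (-1, 1)
N = 200        # truncation order for the partial sums

# Term-wise integration of the geometric series: F(x) = sum x^(n+1)/(n+1).
F = sum(x**(n + 1) / (n + 1) for n in range(N + 1))
print(F, -math.log(1 - x))        # both are approximately 0.693147...

# Term-wise differentiation of the geometric series: g(x) = sum n x^(n-1).
g = sum(n * x**(n - 1) for n in range(1, N + 1))
print(g, 1 / (1 - x)**2)          # both are approximately 4.0
```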

Theorem (Term-wise integration of power series)

Suppose the power series \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) has a radius of convergence \(r>0\). Then, the following holds on \((-r, r)\):
\[\int_0^xf(t)dt = \sum_{n=0}^{\infty}\frac{a_n}{n+1}x^{n+1}.\]
Proof. Consider the sequence \(\{f_n(x)\}\) of polynomial functions given by the partial sums \(f_n(x) = \sum_{k=0}^{n}a_kx^k\). For \(x\in (-r, r)\), take an \(s\in\mathbb{R}\) such that \(|x| < s < r\). Then, on the interval \(I = (-s, s)\), the partial sums converge uniformly to \(f\) (cf. property (*) in the proof of the theorem (Continuous power series) above): for any \(\varepsilon > 0\), there exists an \(N\in \mathbb{N}_0\) such that, for all \(n\geq N\) and all \(t \in I\), \(|f(t) - f_n(t)| < \varepsilon\). Thus, if \(n \geq N\), then
\[\left|\int_0^xf(t)dt - \int_0^xf_n(t)dt\right| \leq \left|\int_0^x|f(t) - f_n(t)|dt\right| \leq \varepsilon|x|.\]
Therefore,
\[\lim_{n\to\infty}\int_0^xf_n(t)dt =\int_0^xf(t)dt.\]
Since
\[\int_0^x\left(\sum_{k=0}^{n}a_kt^k\right) dt = \sum_{k=0}^{n}\left(\int_0^xa_kt^k dt\right) = \sum_{k=0}^{n}\frac{a_k}{k+1}x^{k+1},\]
we have
\[\int_0^xf(t)dt = \sum_{n=0}^{\infty}\frac{a_n}{n+1}x^{n+1}\]
as claimed. ■
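For example, applying the theorem to \(\frac{1}{1+t^2} = \sum_{n=0}^{\infty}(-1)^nt^{2n}\), a power series with radius of convergence \(1\), gives, for \(|x| < 1\),
\[\arctan x = \int_0^x\frac{dt}{1+t^2} = \sum_{n=0}^{\infty}\frac{(-1)^n}{2n+1}x^{2n+1}.\]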

Theorem (Term-wise differentiation of power series)

Given the power series \(f(x) = \sum_{n=0}^{\infty}a_nx^n\), consider the power series \(g(x) = \sum_{n=1}^{\infty}na_nx^{n-1}\). The following hold:
  1. The radius of convergence of \(g(x)\) is equal to that of \(f(x)\).
  2. If \(f(x)\) has a radius of convergence \(r > 0\), then \(f(x)\) is differentiable on the open interval \((-r, r)\) and \(f'(x) = g(x)\).
Proof
  1. Let \(r\) and \(r'\) be the radii of convergence of \(f(x)\) and \(g(x)\), respectively. First, we show that \(r \leq r'\). This is trivial if \(r = 0\). Suppose \(r > 0\). It suffices to show that \(g(x)\) converges absolutely for all \(x\) such that \(|x| < r\); we may assume \(x \neq 0\). Let \(s\) be such that \(|x| < s < r\). Since the series \(\sum_{n=0}^{\infty}|a_n|s^n\) has a sum, its terms are bounded, so there exists \(M > 0\) such that \(|a_n|s^n < M\) for all \(n\). Let \(t = \frac{|x|}{s}\); then \(0 < t < 1\) and \[\sum_{n=1}^{\infty}|na_nx^{n-1}| = \sum_{n=1}^{\infty}\frac{n}{s}|a_n|s^nt^{n-1} < M\sum_{n=1}^{\infty}\frac{n}{s}t^{n-1}.\tag{Eq:gbound}\] But \[\lim_{n\to\infty}\frac{\frac{n+1}{s}t^n}{\frac{n}{s}t^{n-1}} = \lim_{n\to\infty}\frac{(n+1)t}{n} = t < 1,\] so by D'Alembert's criterion, the right-hand side of (Eq:gbound) has a sum, and hence \(g(x)\) converges absolutely. Next, we show that \(r' \leq r\). This is trivial if \(r' = 0\). Suppose \(r' > 0\). It suffices to show that \(f(x)\) converges absolutely for all \(x \in (-r', r')\). But this is true because, for all \(n\geq 1\), \[|a_nx^n| \leq |x||na_nx^{n-1}| \leq r'|na_nx^{n-1}|;\] that is, the convergent series \(r'\sum_{n=1}^{\infty}|na_nx^{n-1}|\) dominates \(\sum_{n=1}^{\infty}|a_nx^n|\). Therefore, we have shown that \(r = r'\).
  2. By Part 1, \(g(x)\) is a power series with radius of convergence \(r > 0\). By term-wise integration, for all \(x \in (-r, r)\), we have \[\int_0^xg(t)dt = \sum_{n=1}^{\infty}a_nx^n = f(x) - a_0.\] Since \(g\) is continuous on \((-r, r)\) (by the theorem above), the left-hand side is differentiable with respect to \(x\), with derivative \(g(x)\) by the Fundamental Theorem of Calculus. Thus, \(f(x)\) is also differentiable, and differentiating both sides gives \(f'(x) = g(x)\). ■
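For example, term-wise differentiation of \(\sum_{n=1}^{\infty}\frac{x^n}{n}\), which has radius of convergence \(1\), gives \(\sum_{n=1}^{\infty}x^{n-1} = \frac{1}{1-x}\) on \((-1, 1)\), consistent with \(\left(-\log(1-x)\right)' = \frac{1}{1-x}\).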

Corollary 1

Suppose the power series \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) has a radius of convergence \(r > 0\). Then, the function \(f(x)\) is of class \(C^\infty\) on the open interval \((-r, r)\), and its derivative \(f'(x)\) is the power series \(\sum_{n=1}^{\infty}na_nx^{n-1}\) with the radius of convergence \(r\).
Proof. Exercise. ■

Corollary 2

Suppose the power series \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) has a radius of convergence \(r > 0\). For each \(k = 0, 1, 2, \cdots\), we have
\[a_k = \frac{f^{(k)}(0)}{k!}\tag{Eq:maccof}\]
where \(f^{(k)}(x)\) is the \(k\)-th derivative of \(f(x)\).
Proof. For \(k = 0\), (Eq:maccof) is trivial. By repeated term-wise differentiation of \(f(x)\) (by Corollary 1, each derivative is again a power series with radius of convergence \(r\)), we can show by mathematical induction (exercise!) that
\[f^{(k)}(x) = \sum_{n=k}^{\infty}n(n-1)\cdots(n-k+1)a_nx^{n-k}.\]
Substituting \(x = 0\) into this, we obtain
\[f^{(k)}(0) = k!a_k,\]
that is, \(a_k = \frac{f^{(k)}(0)}{k!}\). ■
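As a quick symbolic check (a sketch using SymPy; \(f(x) = \frac{1}{1-x}\) is chosen because its Maclaurin coefficients are all equal to \(1\)):

```python
import sympy as sp

x = sp.symbols('x')
f = 1 / (1 - x)   # = sum_{n>=0} x^n on (-1, 1), so a_k = 1 for every k

# Compare f^(k)(0) / k! with the known coefficients a_k = 1.
for k in range(6):
    coeff = sp.diff(f, x, k).subs(x, 0) / sp.factorial(k)
    print(k, coeff)   # prints 1 for every k
```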


