Power series

In this post, we deal with a class of series, called power series, that contain a variable.



Definition (Power series)

Let \(\{a_n\}\) be a sequence of real numbers, \(b\) a real number, and \(x\) a variable. The series given by

\[\sum_{n=0}^{\infty}a_n(x-b)^n = a_0 + a_1(x-b) + a_2(x-b)^2 + \cdots\tag{Eq:PS}\]

is called a power series centered at \(x=b\).

If we set \(t = x - b\) in (Eq:PS), we obtain a power series centered at \(t = 0\). In most practical cases, it suffices to deal with power series centered at \(x = 0\).
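Explicitly, substituting \(t = x - b\) into (Eq:PS) gives

\[\sum_{n=0}^{\infty}a_n(x-b)^n = \sum_{n=0}^{\infty}a_nt^n = a_0 + a_1t + a_2t^2 + \cdots,\]

which is a power series in \(t\) centered at \(t = 0\).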

Example. A polynomial in \(x\), \(f(x) = a_0 + a_1x + a_2x^2 + \cdots + a_nx^n\), can be regarded as a power series by setting \(a_{n+1} = a_{n+2} = \cdots = 0\).

In general, the power series \(\sum_{n=0}^{\infty}a_nx^n\) is a polynomial in \(x\) if \(a_n = 0\) for all but finitely many \(n\). □

If the power series \(\sum_{n=0}^{\infty}a_nx^n\) is a polynomial, we can substitute an arbitrary real number for \(x\) to calculate the sum. If it is not a polynomial (i.e., \(a_n \neq 0\) for infinitely many \(n\)), then substituting a real number other than 0 for \(x\) may result in the divergence of the power series. As long as the power series converges (i.e., has a sum), we may regard it as a function of \(x\).
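For example, substituting \(x = \frac{1}{2}\) into the power series \(\sum_{n=0}^{\infty}x^n\) gives a convergent geometric series, whereas substituting \(x = 2\) gives a divergent one:

\[\sum_{n=0}^{\infty}\left(\frac{1}{2}\right)^n = 1 + \frac{1}{2} + \frac{1}{4} + \cdots = 2, \qquad \sum_{n=0}^{\infty}2^n = 1 + 2 + 4 + \cdots = +\infty.\]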

Radius of convergence

Theorem (Convergence of power series)

If the power series \(\sum_{n=0}^{\infty}a_nx^n\) converges at \(x = u (\neq 0)\), then it converges absolutely for all \(x\) such that \(|x| < |u|\).

Proof. Since \(\sum_{n=0}^{\infty}a_nu^n\) has a sum, the sequence \(\{a_nu^n\}\) converges to 0. In particular, \(\{a_nu^n\}\) is bounded. Therefore, there exists an \(M > 0\) such that \(|a_nu^n| < M\) for all \(n \geq 0\). For any \(x\in \mathbb{R}\), let us define \(r = \left|\frac{x}{u}\right|\). We have

\[|a_nx^n| = \left|a_nu^n\cdot\frac{x^n}{u^n}\right| \leq Mr^n.\]

If \(|x| < |u|\), then \(r < 1\), so the geometric series \(\sum_{n=0}^{\infty}Mr^n\) converges (its sum is \(\frac{M}{1-r}\)). Therefore, the series \(\sum_{n=0}^{\infty}|a_nx^n|\) has a convergent dominating series, and it follows that \(\sum_{n=0}^{\infty}a_nx^n\) converges absolutely if \(|x| < |u|\). ■

Definition (Radius of convergence)

Given a power series \(\sum_{n=0}^{\infty}a_nx^n\), the quantity defined by

\[r = \sup\left\{|u| ~ \middle| ~ \text{$\sum_{n=0}^{\infty}a_nu^n$ converges}\right\}\]

is called the radius of convergence of the power series \(\sum_{n=0}^{\infty}a_nx^n\).

Clearly, the power series \(\sum_{n=0}^{\infty}a_nx^n\) converges if \(x = 0\). Therefore \(r \geq 0\). It is possible that \(r = +\infty\).

From the above theorem (convergence of power series), we can see the following (\(r\) is the radius of convergence):

  • If \(0 < r < +\infty\), the power series \(\sum_{n=0}^{\infty}a_nx^n\) converges absolutely at all \(x\) such that \(|x| < r\), and diverges at all \(x\) such that \(|x| > r\).
  • If \(r = +\infty\), the power series \(\sum_{n=0}^{\infty}a_nx^n\) converges at all \(x \in \mathbb{R}\).
  • If \(r = 0\), the power series \(\sum_{n=0}^{\infty}a_nx^n\) diverges at all \(x \neq 0\).
In particular, if \(r > 0\), we can regard \(f(x) = \sum_{n=0}^{\infty}a_nx^n\) as a function of \(x\) on the open interval \((-r, r)\).
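On the other hand, the case \(r = 0\) does occur. For example, the power series \(\sum_{n=0}^{\infty}n!\,x^n\) has radius of convergence 0: for any fixed \(x \neq 0\),

\[|n!\,x^n| = n!\,|x|^n \to +\infty \quad (n \to \infty),\]

so the terms do not converge to 0, and the series diverges at every \(x \neq 0\).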

Example. The power series \(\sum_{n=0}^{\infty}x^n\) converges for \(|x| < 1\) and diverges for \(|x| > 1\). Therefore, its radius of convergence is 1. □
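In fact, by the geometric series formula, for \(|x| < 1\) we have

\[\sum_{n=0}^{\infty}x^n = \frac{1}{1-x},\]

so on the open interval \((-1, 1)\) this power series defines the function \(f(x) = \frac{1}{1-x}\).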

Remark. When the radius of convergence of the power series \(\sum_{n=0}^{\infty}a_nx^n\) is \(r\), we cannot generally decide whether the series converges or not at \(|x| = r\). □
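For example, the power series \(\sum_{n=1}^{\infty}x^n\), \(\sum_{n=1}^{\infty}\frac{x^n}{n}\), and \(\sum_{n=1}^{\infty}\frac{x^n}{n^2}\) all have radius of convergence 1, yet they behave differently at \(|x| = 1\): the first diverges at both \(x = 1\) and \(x = -1\), the second diverges at \(x = 1\) but converges at \(x = -1\), and the third converges at both \(x = 1\) and \(x = -1\).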

Calculating the radius of convergence

From the Cauchy and D'Alembert criteria, we have the following theorem:

Theorem (Radius of convergence)

For the power series \(\sum_{n=0}^{\infty}a_nx^n\) with its radius of convergence \(r\), the following hold:
  1. If \(l = \lim_{n\to\infty}\sqrt[n]{|a_n|}\), then \(r = \frac{1}{l}\).
  2. If \(l = \lim_{n\to\infty}\left|\frac{a_{n+1}}{a_n}\right|\), then \(r = \frac{1}{l}\).
Remark. In this theorem, we also allow \(l = +\infty\), i.e., the case where the sequence diverges to \(+\infty\), and we formally define \(\frac{1}{0} = +\infty\) and \(\frac{1}{+\infty} = 0\). □
Proof. Exercise. ■

See also: Convergence of series for Cauchy and D'Alembert's criteria.

Example. Let us find the radius of convergence of \(\sum_{n=0}^{\infty}\frac{x^n}{n!}\).
For \(a_n = \frac{1}{n!}\), we have
\[\lim_{n\to\infty}\left|\frac{a_{n+1}}{a_{n}}\right| = \lim_{n\to\infty}\frac{1}{n+1} = 0.\]
Thus, the radius of convergence is \(+\infty\). □

Example. Consider the following series:
\[\sum_{n=0}^{\infty}x^{2n} = 1 + 0\cdot x + x^2 + 0\cdot x^3 + x^4 + \cdots.\tag{Eq:x2n}\]
The coefficient \(a_n\) is 1 if \(n\) is even and 0 if \(n\) is odd. Thus, \(\sqrt[n]{|a_n|}\) alternates between 1 and 0, so \(\lim_{n\to\infty}\sqrt[n]{|a_n|}\) does not exist, and the sequence \(\left\{\frac{a_{n+1}}{a_n}\right\}\) is not even defined (since \(a_n = 0\) for all odd \(n\)). Therefore, we cannot use the above theorem to find the radius of convergence of this series. □

However, if we use the limit superior, which always exists in \([0, +\infty]\), we can write the radius of convergence as
\[r = \frac{1}{\limsup\limits_{n\to\infty}\sqrt[n]{|a_n|}}.\]


Example. For the series in (Eq:x2n), we have \(\sqrt[n]{|a_n|} = 1\) for even \(n\) and \(0\) for odd \(n\), so
\[\limsup\limits_{n\to\infty}\sqrt[n]{|a_n|} = 1.\]
Therefore, the radius of convergence is 1. □
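Alternatively, note that (Eq:x2n) is the geometric series \(\sum_{n=0}^{\infty}t^n\) with \(t = x^2\). By the earlier example, it converges for \(|x^2| < 1\), i.e., \(|x| < 1\), and diverges for \(|x| > 1\), which agrees with the radius of convergence 1.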




