Introductory university-level calculus, linear algebra, abstract algebra, probability, statistics, and stochastic processes.
Monotone sequences and Cauchy sequences
Deciding whether a given sequence \(\{a_n\}\) converges or diverges is often difficult. However, under certain conditions, we can tell that a sequence converges without knowing its limit explicitly.
This sequence is monotone decreasing, but its terms cannot decrease indefinitely because the sequence is also bounded below. Therefore it must converge to some real number (namely \(\sqrt{2}\) in this case). That such a real number exists is guaranteed by the completeness (continuity) of the real numbers. □
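As a numerical illustration (our own sketch, not part of the original example), consider the Babylonian iteration \(a_{n+1} = \frac{1}{2}\left(a_n + \frac{2}{a_n}\right)\) with \(a_1 = 2\). This sequence is monotone decreasing, bounded below by \(\sqrt{2}\), and converges to \(\sqrt{2}\):

```python
import math

def babylonian_sqrt2(n_terms):
    """First n_terms of a_{k+1} = (a_k + 2/a_k)/2 with a_1 = 2.

    For this starting value the sequence is monotone decreasing,
    bounded below by sqrt(2), and converges to sqrt(2).
    """
    a = 2.0
    terms = [a]
    for _ in range(n_terms - 1):
        a = (a + 2.0 / a) / 2.0
        terms.append(a)
    return terms

terms = babylonian_sqrt2(6)
# Monotone (non-increasing) and bounded below by sqrt(2):
assert all(x >= y for x, y in zip(terms, terms[1:]))
assert all(t >= math.sqrt(2) - 1e-12 for t in terms)
print(terms[-1])  # very close to sqrt(2) = 1.41421356...
```

Even without knowing the limit in advance, monotonicity plus boundedness already guarantees convergence, which is the point of the next theorem.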
The observation in the above example can be generalized.
Theorem (Monotone convergence theorem)
Any bounded monotone sequence converges.
Proof. Suppose \(\{a_n\}\) is a bounded monotone increasing sequence. Then the set \(S = \{a_n\mid n\in\mathbb{N}\}\) is bounded above so that its supremum \(\alpha\) exists. For any \(\varepsilon > 0\), \(\alpha - \varepsilon\) is not an upper bound of \(S\) so there exists \(N\in\mathbb{N}\) such that \(\alpha - \varepsilon < a_N\). Since \(\{a_n\}\) is monotone increasing, for all \(n\geq N\), \(a_n \geq a_N > \alpha - \varepsilon\). Since \(\alpha\) is the supremum of \(\{a_n\}\), we have \(a_n \leq \alpha < \alpha + \varepsilon\). Thus we have \(\alpha -\varepsilon < a_n < \alpha + \varepsilon\) or \(|a_n -\alpha| < \varepsilon\) for all \(n\geq N\). Hence \(\lim_{n\to\infty}a_n = \alpha\).
The case of a bounded monotone decreasing sequence is proved similarly, using the infimum instead of the supremum. ■
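To see the theorem in action, here is a small numeric check (an illustration of our own, not part of the original text) using the increasing bounded sequence \(a_n = 1 - 1/n\), whose supremum, and hence limit, is \(1\). Following the proof, any \(N > 1/\varepsilon\) satisfies \(\alpha - \varepsilon < a_N\):

```python
import math

# a_n = 1 - 1/n is monotone increasing and bounded above by 1,
# so by the monotone convergence theorem it converges to sup{a_n} = 1.
def a(n):
    return 1.0 - 1.0 / n

eps = 1e-3
# As in the proof: alpha = 1, and alpha - eps < a_N holds for N > 1/eps.
N = math.ceil(1.0 / eps) + 1
tail = [a(n) for n in range(N, N + 1000)]
assert all(1.0 - eps < x < 1.0 for x in tail)
print(f"all sampled terms from N = {N} lie within {eps} of the limit 1")
```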
Definition (Cauchy sequence)
The sequence \(\{a_n\}\) is said to be a Cauchy sequence if and only if the following condition is met.
For any \(\varepsilon > 0\), there exists \(N\in\mathbb{N}\) such that for any \(k, l\geq N\), \(|a_k - a_l| < \varepsilon\).
Or, in a logical form,
\[\forall \varepsilon > 0, \exists N\in\mathbb{N}, \forall k,l\in\mathbb{N} ~ (k, l \geq N \implies |a_k - a_l| < \varepsilon).\]
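As a concrete sketch of the definition (our own illustration), take \(a_n = 1/n\). For \(k, l \geq N\) we have \(|a_k - a_l| \leq 1/N\), so choosing any \(N > 1/\varepsilon\) makes the condition hold:

```python
def a(n):
    return 1.0 / n

eps = 1e-2
N = int(1.0 / eps) + 1  # N > 1/eps, so |a_k - a_l| <= 1/N < eps
# Check the Cauchy condition on a sample of indices k, l >= N.
max_gap = max(abs(a(k) - a(l))
              for k in range(N, N + 200)
              for l in range(N, N + 200))
assert max_gap < eps
print(f"max |a_k - a_l| over sampled k, l >= {N}: {max_gap:.6f} < {eps}")
```

Note that the definition compares terms of the sequence with each other, not with a limit: this is exactly what makes the criterion useful when the limit is unknown.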
Theorem
Any convergent sequence is a Cauchy sequence.
Proof. Suppose \(\lim_{n\to\infty}a_n = \alpha\). For any \(\varepsilon > 0\), there exists \(N\in\mathbb{N}\) such that for all \(n \geq N\), \(|a_n - \alpha| < \varepsilon\). Then, for all \(k, l \geq N\), by the triangle inequality,
\[|a_k - a_l| = |(a_k - \alpha) - (a_l - \alpha)| \leq |a_k - \alpha| + |a_l - \alpha| < 2\varepsilon.\]
Since \(\varepsilon\) is arbitrary, \(2\varepsilon\) is an arbitrary positive real number. Therefore, \(\{a_n\}\) is a Cauchy sequence. ■
The converse is also true, but the proof is beyond the scope of this lecture.
Theorem
Any Cauchy sequence converges.
Corollary
A sequence converges if and only if it is a Cauchy sequence.
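This corollary gives a practical way to prove divergence without guessing a limit. A standard example (added here as an illustration): the partial sums \(H_n = \sum_{k=1}^n 1/k\) of the harmonic series satisfy \(H_{2n} - H_n = \sum_{k=n+1}^{2n} \frac{1}{k} \geq n\cdot\frac{1}{2n} = \frac{1}{2}\) for every \(n\), so the Cauchy condition fails for \(\varepsilon = 1/2\), and \(\{H_n\}\) diverges:

```python
def H(n):
    """Partial sum H_n = 1 + 1/2 + ... + 1/n of the harmonic series."""
    return sum(1.0 / k for k in range(1, n + 1))

# For every n, H_{2n} - H_n >= 1/2: the gap between terms with indices
# 2n and n never falls below 1/2, however large we take N.
# Hence {H_n} is not a Cauchy sequence and therefore diverges.
for n in (10, 100, 1000):
    gap = H(2 * n) - H(n)
    print(f"H_{2*n} - H_{n} = {gap:.4f}")
    assert gap >= 0.5
```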
For later convenience, we state the following theorem, leaving its proof as an exercise.
Theorem (Bolzano-Weierstrass Theorem)
Let \(\{a_n\}\) be a sequence such that \(a_n \in [c, d]\) for all \(n\in\mathbb{N}\). Then there exists a subsequence \(\{a_{n_k}\}\) of \(\{a_n\}\) that converges to a value in the closed interval \([c,d]\).
Proof. Exercise. (Hint: repeatedly bisect \([c,d]\), keeping at each step a half that contains infinitely many terms, and use the nested interval property.) ■
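The bisection idea behind the proof can be sketched computationally. The following is our own illustrative sketch (the function name and the example sequence are assumptions, not from the text): repeatedly halve \([c, d]\), keep a half that still contains many of the remaining terms (a finite stand-in for "infinitely many"), and pick one term from each stage. The picked terms form a subsequence trapped in nested intervals whose lengths shrink to zero.

```python
def subsequence_by_bisection(terms, c, d, stages):
    """Pick indices n_1 < n_2 < ... whose terms lie in nested halves of [c, d]."""
    lo, hi = c, d
    picked = []
    last_index = -1
    for _ in range(stages):
        mid = (lo + hi) / 2.0
        left = [i for i, x in enumerate(terms) if i > last_index and lo <= x <= mid]
        right = [i for i, x in enumerate(terms) if i > last_index and mid < x <= hi]
        # Keep the half with more remaining terms (a finite proxy for
        # "the half containing infinitely many terms" in the real proof).
        if len(left) >= len(right):
            hi, pool = mid, left
        else:
            lo, pool = mid, right
        last_index = pool[0]  # pick the earliest unused term in that half
        picked.append(last_index)
    return picked

# Example: a_n = (-1)^n (1 + 1/n) lies in [-2, 2] and has the two
# accumulation points -1 and 1; the construction homes in on one of them.
terms = [(-1) ** n * (1 + 1 / n) for n in range(1, 5001)]
idx = subsequence_by_bisection(terms, -2.0, 2.0, 10)
sub = [terms[i] for i in idx]
assert all(i < j for i, j in zip(idx, idx[1:]))  # indices strictly increase
print(sub[-1])  # close to the accumulation point -1
```

After \(k\) stages the surviving interval has length \((d-c)/2^k\), so the picked terms form a Cauchy (hence convergent) subsequence, which is exactly the content of the theorem.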