Branching processes: Probability of extinction

See also: Branching processes: Mean and variance.

We continue our study of branching processes. Recall that \(X_n\) is the random variable representing the population size of the \(n\)-th generation, and that its generating function \(G_n(s)\) satisfies the following equation:

\[G_n(s) = G_{n-1}(G(s)) = G(G(\cdots(G(s))\cdots))\]

where 

\[G(s) = G_1(s) = \sum_{j=0}^{\infty}p_js^j.\]

The initial condition \(X_0 = 1\) and the probabilities \(p_j, j = 0, 1, \cdots,\) of each individual producing \(j\) descendants are assumed to be known.


A population size of zero, \(X_n = 0\), means extinction. The probability that the population is extinct at generation \(n\) can be computed as \(G_n(0)\) (why?).

Let \(g_n = G_n(0)\). Then \(g_1\) is the probability that the population is extinct at the first generation. Recall the definition

\[G_1(s) = G(s) = \sum_{j=0}^{\infty}p_js^j.\]

Thus, \(g_1 = G_1(0) = p_0\): the probability that \(X_1 = 0\).
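
As a quick numerical illustration, here is a minimal Python sketch that computes \(g_n = G_n(0)\) by iterating \(G\). The offspring distribution \(p_0 = 1/4\), \(p_1 = 1/4\), \(p_2 = 1/2\) is a hypothetical choice made purely for illustration.

    # Minimal sketch: compute g_n = G_n(0) = P(X_n = 0) by composing G with itself.
    p = [0.25, 0.25, 0.5]          # p[j] = p_j (assumed offspring distribution)

    def G(s):
        """Probability generating function G(s) = sum_j p_j s^j."""
        return sum(pj * s**j for j, pj in enumerate(p))

    def g(n):
        """Extinction probability at generation n: g_n = G_n(0)."""
        value = 0.0                # G_0(0) = 0 because X_0 = 1
        for _ in range(n):
            value = G(value)       # G_k(0) = G(G_{k-1}(0))
        return value

    print([round(g(n), 4) for n in (1, 2, 5, 20)])   # starts at g_1 = p_0 = 0.25 and increases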

Assume \(0 \leq s \leq 1\). Of course, \(0 \leq p_j \leq 1\) for all \(j\). The function \(G(s)\) has the following properties.

  1. \(G(1) = 1\), i.e., the graph of \(s\)-vs.-\(G(s)\) always passes through the point \((1,1)\);
  2. \(G(s) > 0\) if \(p_0 > 0\), i.e., \(G(s)\) is positive;
  3. \(G'(s) = \sum_{j=1}^{\infty}jp_js^{j-1} \geq 0\), i.e., \(G(s)\) is non-decreasing;
  4. \(G''(s) = \sum_{j=2}^{\infty}j(j-1)p_js^{j-2} \geq 0\), i.e., \(G(s)\) is convex.
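
For instance, take the hypothetical offspring distribution \(p_0 = 1/4\), \(p_1 = 1/4\), \(p_2 = 1/2\) from the sketch above. Then
\[G(s) = \frac{1}{4} + \frac{s}{4} + \frac{s^2}{2}, \qquad G'(s) = \frac{1}{4} + s, \qquad G''(s) = 1,\]
so \(G(1) = 1\), \(G(s) \geq 1/4 > 0\), \(G'(s) \geq 0\), and \(G''(s) \geq 0\) on \([0,1]\), in agreement with properties 1 through 4.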

Lemma (convex function)

Let \(f(x)\) be a twice differentiable, convex function defined on \([0,1]\). That is,
\[f''(x) \geq 0\]
for all \(x\in [0,1]\). Then, for \(y > x\),
\[\frac{f(y) - f(x)}{y - x} \geq f'(x).\]
Proof. Exercise. ■


The sequence of extinction probabilities \(\{g_n\}\) satisfies
\[g_n = G_n(0) = G(G_{n-1}(0)) = G(g_{n-1}),~~ n = 2, 3, \cdots.\]
In particular,
\[g_2 = G_2(0) = G(G(0)) = G(g_1) = \sum_{j=0}^{\infty}p_jg_1^j \geq p_0 = G(0) = g_1,\]
with strict inequality whenever \(0 < p_0 < 1\).
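
For instance, for the hypothetical distribution \(p_0 = 1/4\), \(p_1 = 1/4\), \(p_2 = 1/2\) used above, \(g_1 = 1/4\) and
\[g_2 = G\!\left(\tfrac{1}{4}\right) = \frac{1}{4} + \frac{1}{4}\cdot\frac{1}{4} + \frac{1}{2}\cdot\frac{1}{16} = \frac{11}{32} > \frac{1}{4} = g_1.\]
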
From the above lemma and property (3) of \(G(s)\), we have
\[\frac{G(g_n) - G(g_{n-1})}{g_n - g_{n-1}} = \frac{g_{n+1} - g_{n}}{g_n - g_{n-1}} \geq G'(g_{n-1}) \geq 0\]
whenever \(g_{n} > g_{n-1}\), so that \(g_{n+1} \geq g_n\) (and trivially \(g_{n+1} = g_n\) if \(g_n = g_{n-1}\)). By induction, the sequence \(\{g_n\}\) is non-decreasing. It is also bounded from above, that is, \(g_n \leq 1\) for all \(n\). There is a famous theorem:

Theorem

A bounded monotone increasing sequence converges.
Proof. See any calculus textbook. ■

Therefore, the limit
\[g = \lim_{n\to\infty}g_n\]
exists. Since \(g_n = G(g_{n-1})\) and \(G\) is continuous on \([0,1]\), letting \(n\to\infty\) on both sides gives
\[g = G(g). \tag{Eq:FixedPoint}\]
In other words, the function \(G(s)\) has a fixed point, and that fixed point is the limit \(g = \lim_{n\to\infty}g_n\). 

There is a way to graphically "solve" Eq. (Eq:FixedPoint) (see Figure 1 below). The solution \(g\) is an intersection point of \(y = G(x)\) and \(y = x\). Note that \(G(1) = 1\), so \(g = 1\) is always a solution. There may or may not be another solution; when there is, that solution (\(g < 1\)) is the "stable" one.

Figure 1: Graphical solution of the fixed point equation \(g = G(g)\). Starting from \((g_0,0)\), a sequence of points is generated by the following rule: \((g_0,0) \to (g_0,G(g_0)) = (g_0,g_1)\) \(\to (g_1,g_1) \to (g_1,G(g_1)) = (g_1,g_2)\) \(\to \cdots \to\) \((g_n,g_n) \to (g_n,G(g_n)) = (g_n,g_{n+1})\to \cdots\) Eventually, we will arrive at the fixed point \((g,G(g)) = (g,g)\) which is at the intersection between the lines \(y = G(x)\) and \(y = x\).
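
As a concrete illustration with the hypothetical distribution \(p_0 = 1/4\), \(p_1 = 1/4\), \(p_2 = 1/2\), the fixed-point equation can be solved exactly:
\[g = \frac{1}{4} + \frac{g}{4} + \frac{g^2}{2} \iff 2g^2 - 3g + 1 = 0 \iff (2g - 1)(g - 1) = 0,\]
so the two solutions are \(g = 1/2\) and \(g = 1\). Since \(g_1 = 1/4 \leq 1/2\) and \(G\) is non-decreasing, every \(g_n \leq 1/2\), and the limit is the smaller root, \(g = 1/2\).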


Exercise. Consider the case where the equation \(g = G(g)\) has a solution \(g < 1\). Suppose the iteration in Figure 1 is started at a point \(g_0\) with \(g < g_0 < 1\). (1) Does the graphical solution technique still converge to the point \((g,g)\)? (2) In particular, could it converge to the point \((1,1)\) instead?

(Answer: (1) Yes. (2) No. (but why?)) □
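
A quick numerical check of this claim, reusing the hypothetical distribution \(p_0 = 1/4\), \(p_1 = 1/4\), \(p_2 = 1/2\) (for which \(g = 1/2\)):

    # Sketch: run the iteration g <- G(g) from two starting points, one below
    # and one above the fixed point g = 1/2; both converge to 1/2, not to 1.
    p = [0.25, 0.25, 0.5]          # hypothetical offspring distribution

    def G(s):
        return sum(pj * s**j for j, pj in enumerate(p))

    for g0 in (0.0, 0.9):          # 0.9 lies above the fixed point 1/2
        g = g0
        for _ in range(200):
            g = G(g)
        print(g0, "->", round(g, 6))    # both end at (approximately) 0.5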

Note the following:
  1. If \(G'(1) > 1\) then \(g_n \to g < 1\).
  2. If \(G'(1) \leq 1\) then \(g_n \to 1\) (excluding the trivial case \(p_1 = 1\)).
You should think about why this is the case (hint: \(G'(1)\) is the slope of the curve \(y = G(x)\) at \(x = 1\)). By the way, \(G'(1) = \mu_1\) is the mean population size of the first generation, which equals the expected number of offspring per individual. Thus, if the expected number of offspring per individual is at most 1 (and not every individual has exactly one offspring), the population becomes extinct almost surely.
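
The criterion can also be checked numerically. The sketch below compares a subcritical distribution (\(\mu_1 \leq 1\)) with the supercritical one used above (\(\mu_1 > 1\)); both offspring distributions are hypothetical choices for illustration.

    # Sketch: compare the mean number of offspring mu = G'(1) with the limit of
    # the extinction probabilities g_n, obtained by iterating g <- G(g) from 0.
    def extinction_limit(p, iterations=10_000):
        g = 0.0
        for _ in range(iterations):
            g = sum(pj * g**j for j, pj in enumerate(p))
        return g

    for p in ([0.5, 0.25, 0.25],   # mu = 0.75 <= 1: expect the limit to be 1
              [0.25, 0.25, 0.5]):  # mu = 1.25 > 1:  expect the limit to be 1/2
        mu = sum(j * pj for j, pj in enumerate(p))
        print("mu =", mu, "  limit of g_n =", round(extinction_limit(p), 6))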




