Birth process

 

Defining the birth process 

Consider a colony of bacteria in which cells never die. We study the following stochastic process, known as the birth process (also called the Yule process).

  1. The colony starts with \(n_0\) cells at time \(t = 0\).
  2. Assume that the probability that any individual cell divides in the time interval \((t, t + \delta t)\) is proportional to \(\delta t\) for small \(\delta t\).
  3. Further assume that each cell division is independent of others.
  4. Let \(\lambda\) be the per-cell birth rate, i.e., the proportionality constant in assumption 2. Then the probability that a cell division takes place in a population of \(n\) cells during \(\delta t\) is \(\lambda n \delta t\) (to leading order in \(\delta t\)).
  5. We assume that the probability that two or more births take place in the time interval \(\delta t\) is \(o(\delta t)\). That is, it can be ignored.
  6. Consequently, the probability that no cell divides during \(\delta t\) is \(1 - \lambda n \delta t + o(\delta t)\).

Note that this process is an example of a (continuous-time) Markov chain with states \(n_0, n_0 + 1, n_0 + 2, \cdots\), where each integer \(n\) represents the state with \(n\) individuals.

The population size (i.e., the number of individuals) at time \(t\) is a random variable, \(N(t)\). We denote by \(p_n(t)\) the probability that \(N(t)= n\). That is, \[p_n(t) = \Pr(N(t) = n).\] Our goal here is to find this probability distribution \(p_n(t)\).

A sample path of the birth process is a step function of \(t\) that starts at \(n_0\) and jumps up by one at each (random) division time. A minimal simulation sketch is given below.
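The following sketch simulates one sample path (assuming NumPy is available; the function name and parameters are illustrative). While the population size is \(n\), the waiting time until the next division is exponential with rate \(\lambda n\), since it is the minimum of \(n\) independent exponential clocks of rate \(\lambda\).

```python
import numpy as np

def simulate_birth_process(n0, lam, t_max, rng=None):
    """Simulate one sample path of the birth (Yule) process up to time t_max."""
    rng = np.random.default_rng() if rng is None else rng
    t, n = 0.0, n0
    times, sizes = [t], [n]
    while True:
        # Waiting time to the next division: exponential with rate lam * n.
        t += rng.exponential(1.0 / (lam * n))
        if t > t_max:
            break
        n += 1
        times.append(t)
        sizes.append(n)
    return np.array(times), np.array(sizes)

# Example: one path starting from n0 = 5 cells with birth rate lam = 1.0.
times, sizes = simulate_birth_process(n0=5, lam=1.0, t_max=3.0)
print(sizes[-1])  # population size just before t_max
```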





Differential-difference equations

Based on the above assumptions, let us derive a set of differential-difference equations for \(p_n(t)\). 

First of all, the initial condition is given by \[p_n(0) = \delta_{n,n_0} \tag{Eq:init}\] where \(\delta_{n,n_0}\) is Kronecker's delta.

Next, let \(\delta t > 0\) be a sufficiently small time interval. For \(N(t + \delta t) = n\) to hold, one of the following must have occurred (any other possibility involves two or more divisions and has probability \(o(\delta t)\)):

  • \(N(t) = n - 1\) and exactly one cell division has taken place during \(\delta t\), or 
  • \(N(t) = n\) and no cell division has occurred during \(\delta t\).
Thus, \[p_{n}(t + \delta t) = p_{n-1}(t)[\lambda(n-1)\delta t + o(\delta t)] + p_n(t)[1 - \lambda n \delta t + o(\delta t)]\] for \(n \geq n_0 + 1\), and
\[p_{n_0}(t + \delta t) = p_{n_0}(t)[1 - \lambda n_0 \delta t + o(\delta t)]\] for \(n = n_0\). Subtracting \(p_n(t)\) (resp. \(p_{n_0}(t)\)) from both sides, dividing by \(\delta t\), and taking the limit \(\delta t \to 0\), we obtain the following differential-difference equations:
\[\begin{eqnarray} \frac{{d}p_{n_0}(t)}{{d}t} &=& -\lambda n_0 p_{n_0}(t),\tag{Eq:Birth0}\\ \frac{{d}p_{n}(t)}{{d}t} &=& \lambda(n-1)p_{n-1}(t)-\lambda n p_{n}(t), ~~ (n \geq n_0 + 1)\tag{Eq:Birthn} \end{eqnarray}\]

Below, we will solve these equations and show that \(\{p_n(t)\}\) is given by
\[\begin{equation} p_n(t) = \begin{cases} 0, & (n < n_0),\\ \binom{n-1}{n_0-1}e^{-\lambda n_0 t}(1 - e^{-\lambda t})^{n-n_0}, & (n \geq n_0). \end{cases} \end{equation}\]
That is, \(N(t)\) follows the Pascal or negative binomial distribution, \(\mathrm{NB}(n_0, e^{-\lambda t})\).
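Before deriving this result, here is a quick numerical sanity check (a sketch assuming NumPy and SciPy; the truncation level `n_max` is an arbitrary illustrative choice): truncate the infinite system (Eq:Birth0) and (Eq:Birthn) at a large state, integrate it numerically, and compare with the claimed formula.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.special import comb

n0, lam, t_end, n_max = 3, 1.0, 1.5, 200    # truncate the infinite system at n0 + n_max

def rhs(t, p):
    # p[k] = p_{n0 + k}(t);  dp_n/dt = lam*(n-1)*p_{n-1} - lam*n*p_n
    n = n0 + np.arange(n_max + 1)
    dp = -lam * n * p
    dp[1:] += lam * n[:-1] * p[:-1]
    return dp

p0 = np.zeros(n_max + 1)
p0[0] = 1.0                                 # initial condition p_n(0) = delta_{n, n0}
sol = solve_ivp(rhs, (0.0, t_end), p0, rtol=1e-8, atol=1e-10)

n = n0 + np.arange(n_max + 1)
p_exact = comb(n - 1, n0 - 1) * np.exp(-lam * n0 * t_end) * (1 - np.exp(-lam * t_end)) ** (n - n0)
print(np.max(np.abs(sol.y[:, -1] - p_exact)))  # should be close to zero
```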

Generating function equation

Let's solve these equations. We use the method of the probability generating function (PGF). In the present case, the PGF with the dummy variable \(s\) is defined as

\[G(s,t) = \sum_{n=n_0}^{\infty}p_n(t)s^n.\tag{Eq:PGF}\]

The initial condition (Eq:init) reads as \[G(s,0) = s^{n_0}.\tag{Eq:InitG}\]

Now, multiply both sides of (Eq:Birth0) and (Eq:Birthn) by \(s^n\) and sum over \(n = n_0, n_0 + 1, \cdots\) (the \(n = n_0\) equation fits the same pattern if we set \(p_{n_0 - 1}(t) = 0\)). We have

\[\sum_{n=n_0}^{\infty}\frac{{d}p_n(t)}{{d}t}s^n = \lambda\sum_{n=n_0+1}^{\infty}(n-1)p_{n-1}(t)s^n - \lambda\sum_{n=n_0}^{\infty}np_{n}(t)s^n.\]

The left-hand side: \[\sum_{n=n_0}^{\infty}\frac{{d}p_n(t)}{{d}t}s^n = \frac{\partial G(s,t)}{\partial t}.\]

The first term on the right-hand side (substituting \(m = n - 1\)):\[\lambda\sum_{n=n_0+1}^{\infty}(n-1)p_{n-1}(t)s^n = \lambda s^2\sum_{m=n_0}^{\infty}m\,p_{m}(t)s^{m-1} = \lambda s^2\frac{\partial G(s,t)}{\partial s}.\]

The second term on the right-hand side:\[\lambda\sum_{n=n_0}^{\infty}np_{n}(t)s^n = \lambda s \frac{\partial G(s,t)}{\partial s}.\]

Combining these together, we obtain the following PDE: \[\frac{\partial G(s,t)}{\partial t} =\lambda s(s - 1) \frac{\partial G(s,t)}{\partial s}.\tag{Eq:PDE}\]

Solving the PDE for \(G(s,t)\)

The PDE (Eq:PDE) can be greatly simplified if we can get rid of the factor \(s(s-1)\) on the right-hand side. This can be achieved by the change of variables \(s \to z\) where the new variable \(z\) is defined by the ODE \[\frac{ds}{{d}z} = \lambda s (s-1).\]

Here, we are implicitly assuming that \(0 < s < 1\). Solving this ODE (exercise!), we have

\[s = \frac{1}{1 + e^{\lambda z}}.\]
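For reference, one way to carry out this integration (using partial fractions, taking \(0 < s < 1\), and choosing the integration constant to be zero, since any particular solution suffices for the change of variables):

\[\frac{ds}{s(s-1)} = \left(\frac{1}{s-1} - \frac{1}{s}\right)ds = \lambda\,dz \quad\Longrightarrow\quad \ln\frac{1-s}{s} = \lambda z \quad\Longrightarrow\quad s = \frac{1}{1+e^{\lambda z}}.\]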

Now, change the variable \(s\) in \(G(s,t)\) to \(z\) and define

\[Q(z,t) = G(1/(1+e^{\lambda z}), t).\]

In terms of \(Q(z,t)\), the chain rule gives \(\frac{\partial Q}{\partial z} = \frac{ds}{dz}\frac{\partial G}{\partial s} = \lambda s(s-1)\frac{\partial G}{\partial s}\), which equals \(\frac{\partial G}{\partial t} = \frac{\partial Q}{\partial t}\) by (Eq:PDE). Hence (Eq:PDE) becomes \[\frac{\partial Q(z,t)}{\partial t}=\frac{\partial Q(z,t)}{\partial z}.\tag{Eq:PDEQ}\]

The general solution of (Eq:PDEQ) is known to be any differentiable function of the form \(w(z + t)\). In fact, we have 

\[\frac{\partial w(z + t)}{\partial z} = \frac{\partial w(z+t)}{\partial t}\]

for any differentiable function \(w(z+t)\).

Let \(Q(z, t) = w(z+t)\). The functional form of \(w\) is determined by the initial condition (Eq:InitG):

\[G(s,0) = s^{n_0} = \frac{1}{(1+e^{\lambda z})^{n_0}} = Q(z,0) = w(z).\]

Thus,

\[\begin{eqnarray}G(s,t) &=& Q(z,t) = w(z+t) = \frac{1}{(1+e^{\lambda (z+t)})^{n_0}}\nonumber\\    &=& \frac{1}{(1 + \frac{1-s}{s}e^{\lambda t})^{n_0}}  = \frac{s^{n_0}e^{-\lambda n_0 t}}{[1 - (1 -e^{-\lambda t})s]^{n_0}},\end{eqnarray}\]
where in the second line we used \(e^{\lambda z} = \frac{1-s}{s}\) (which follows from \(s = \frac{1}{1+e^{\lambda z}}\)) and then multiplied the numerator and denominator by \((se^{-\lambda t})^{n_0}\).
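One can also verify symbolically that this \(G(s,t)\) satisfies both the PDE (Eq:PDE) and the initial condition (Eq:InitG); here is a minimal sketch assuming SymPy is available:

```python
import sympy as sp

s, t, lam, n0 = sp.symbols('s t lam n0', positive=True)
G = s**n0 * sp.exp(-lam*n0*t) / (1 - (1 - sp.exp(-lam*t))*s)**n0

# G_t - lam*s*(s-1)*G_s should simplify to 0 (Eq:PDE)
print(sp.simplify(sp.diff(G, t) - lam*s*(s - 1)*sp.diff(G, s)))
# G(s, 0) should simplify to s**n0 (Eq:InitG)
print(sp.simplify(G.subs(t, 0)))
```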

By applying the negative binomial theorem, \((1-x)^{-n_0} = \sum_{m=0}^{\infty}\binom{m+n_0-1}{n_0-1}x^m\) for \(|x| < 1\), to the denominator of \(G(s,t)\), we have

\[\begin{eqnarray}G(s,t) &=& \frac{s^{n_0}e^{-\lambda n_0 t}}{[1 - (1 -e^{-\lambda t})s]^{n_0}}\\&=&s^{n_0}e^{-\lambda n_0 t}\sum_{m=0}^{\infty}\binom{m+n_0-1}{n_0 - 1}(1-e^{-\lambda t})^ms^m\\&=&e^{-\lambda n_0 t}\sum_{n=n_0}^{\infty}\binom{n-1}{n_0 - 1}(1-e^{-\lambda t})^{n-n_0}s^n.\end{eqnarray}\]

Comparing the coefficients of \(s^n\) with (Eq:PGF), we have \[\begin{equation} p_n(t) = \begin{cases} 0, & (n < n_0),\\ \binom{n-1}{n_0-1}e^{-\lambda n_0 t}(1 - e^{-\lambda t})^{n-n_0}, & (n \geq n_0). \end{cases} \end{equation}\]

Thus, \(N(t)\) follows the Pascal or negative binomial distribution, \(\mathrm{NB}(n_0, e^{-\lambda t})\). In particular, the mean population size is given by 

\[\mathbb{E}[N(t)] = \frac{\partial G}{\partial s}(1,t) = n_0e^{\lambda t}.\]

We can see that the population size increases exponentially with time.
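As a final sanity check, the formula above can be compared with SciPy's negative binomial distribution (a sketch assuming SciPy; note that `scipy.stats.nbinom` counts the number of failures before the \(n_0\)-th success, so it models \(N(t) - n_0\) with success probability \(e^{-\lambda t}\)):

```python
import numpy as np
from scipy.stats import nbinom
from scipy.special import comb

n0, lam, t = 5, 1.0, 2.0
p = np.exp(-lam * t)                       # success probability e^{-lambda*t}

# Closed-form p_n(t) vs. scipy's nbinom pmf for the first few states
for n in range(n0, n0 + 5):
    p_formula = comb(n - 1, n0 - 1) * p**n0 * (1 - p)**(n - n0)
    p_scipy = nbinom.pmf(n - n0, n0, p)    # N(t) - n0 ~ NB(n0, e^{-lambda*t})
    print(n, p_formula, p_scipy)

# Mean population size: n0 * e^{lambda*t}
print(n0 * np.exp(lam * t), nbinom.mean(n0, p) + n0)
```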
