Let \(f(x)\) and \(g(x)\) be functions that are continuous on \([a,b]\) and differentiable on \((a,b)\). Suppose that \(g'(x) \neq 0\) for all \(x \in (a,b)\). Then, there exists a \(c\in (a,b)\) such that
\[\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(c)}{g'(c)}.\]
(Note that \(g(b) \neq g(a)\): otherwise Rolle's theorem would give a point in \((a,b)\) where \(g' = 0\).)
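For instance, taking \(f(x) = x^2\) and \(g(x) = x^3\) on \([0,1]\) (an illustrative choice of functions), the left-hand side is \(\frac{1-0}{1-0} = 1\), while \(\frac{f'(c)}{g'(c)} = \frac{2c}{3c^2} = \frac{2}{3c}\), so the theorem is satisfied with \(c = \frac{2}{3} \in (0,1)\).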
Proof. For convenience we set \(f(a) = g(a) = 0\) so that \(f(x)\) and \(g(x)\) are defined on \([a, b)\). Assume conditions 1, 2, and 3 hold.
By condition 1, \(f(x)\) and \(g(x)\) are continuous on \([a,b)\). For any \(x\in(a,b)\), \(f(x)\) and \(g(x)\) are continuous on \([a,x]\) and differentiable on \((a,x)\). By condition 2, for all \(t \in (a,x)\), \(g'(t) \neq 0\). Thus, by Cauchy's mean value theorem, there exists a \(c_x \in (a,x)\) such that
\[\frac{f(x)}{g(x)} = \frac{f(x) - f(a)}{g(x) - g(a)} = \frac{f'(c_x)}{g'(c_x)},\]
where the first equality uses \(f(a) = g(a) = 0\).
Since \(a < c_x < x\), we have \(c_x \to a+0\) as \(x \to a+0\), and, by condition 3, \(\lim_{x\to a+0}\frac{f'(c_x)}{g'(c_x)}\) exists. Therefore \(\lim_{x\to a+0}\frac{f(x)}{g(x)}\) also exists and
\[\lim_{x\to a+0}\frac{f(x)}{g(x)} = \lim_{x\to a+0}\frac{f'(c_x)}{g'(c_x)} = \lim_{x\to a+0}\frac{f'(x)}{g'(x)}.\]
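As a quick illustration of this case, one may take, for example, \(f(x) = e^x - 1\) and \(g(x) = x\) with \(a = 0\): both tend to \(0\) as \(x \to +0\), \(g'(x) = 1 \neq 0\), and \(\lim_{x\to +0}\frac{f'(x)}{g'(x)} = \lim_{x\to +0}e^x = 1\), so the rule gives
\[\lim_{x\to +0}\frac{e^x - 1}{x} = 1.\]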
Next, suppose that conditions 1', 2, and 3 hold, and let \(L = \lim_{x\to a+0}\frac{f'(x)}{g'(x)}\). By the definition of the right limit, for any \(\varepsilon > 0\), there exists a \(\delta_1 > 0\) such that \(a < x < a + \delta_1\) implies \(\left|\frac{f'(x)}{g'(x)} - L\right| < \varepsilon\).
Since \(\lim_{x\to a+0}g(x) = \pm\infty\), there exists \(\delta_2 > 0\) such that \(a < x < a + \delta_2\) implies \(|g(x)| > 1\).
Let \(\delta' = \min\{\delta_1, \delta_2\}\) and \(d = a + \delta'\). By Cauchy's mean value theorem, for all \(x \in (a, d)\), there exists a \(c_x \in (x, d)\) such that
\[\frac{f(d) - f(x)}{g(d) - g(x)} = \frac{f'(c_x)}{g'(c_x)}.\]
Since \(|g(x)| > 1\), in particular \(g(x) \neq 0\), this can be rearranged as
\[\frac{f(x)}{g(x)} = \frac{f'(c_x)}{g'(c_x)} + r(x), \quad \text{where}\quad r(x) = \frac{1}{g(x)}\left(f(d) - g(d)\frac{f'(c_x)}{g'(c_x)}\right).\]
Here, \(f(d)\) and \(g(d)\) are finite constants, and \(\frac{f'(c_x)}{g'(c_x)}\) is bounded because \(c_x \in (a, a+\delta_1)\) implies \(\left|\frac{f'(c_x)}{g'(c_x)} - L\right| < \varepsilon\), with \(L\) finite by condition 3. Hence, by condition 1' (\(\lim_{x\to a + 0}g(x) = \pm\infty\)), \(\lim_{x\to a+0}r(x) = 0\). In other words, for any \(\varepsilon > 0\), there exists a \(\delta_3 > 0\) such that \(a < x < a + \delta_3\) implies \(|r(x)| < \varepsilon\).
Let \(\delta = \min\{\delta', \delta_3\}\). By the expression for \(\frac{f(x)}{g(x)}\) above, we have
\[\frac{f(x)}{g(x)} - L = \frac{f'(c_x)}{g'(c_x)} - L + r(x).\]
Hence, for any \(x\) with \(a < x < a + \delta\),
\[\left|\frac{f(x)}{g(x)} - L\right| \leq \left|\frac{f'(c_x)}{g'(c_x)} - L\right| + |r(x)| < 2\varepsilon.\]
Since \(\varepsilon > 0\) is arbitrary, \(\lim_{x\to a+0}\frac{f(x)}{g(x)} = L\).
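To see this case in action, one may take, for example, \(f(x) = \ln x\) and \(g(x) = \frac{1}{x}\) with \(a = 0\): \(\lim_{x\to +0}g(x) = +\infty\), \(g'(x) = -\frac{1}{x^2} \neq 0\), and
\[\lim_{x\to +0}\frac{f'(x)}{g'(x)} = \lim_{x\to +0}\frac{\frac{1}{x}}{-\frac{1}{x^2}} = \lim_{x\to +0}(-x) = 0,\]
so the rule gives \(\lim_{x\to +0} x\ln x = \lim_{x\to +0}\frac{\ln x}{1/x} = 0\).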
Proof. We prove the case when \(x \to \infty\) and condition 1 holds.
The endpoint \(b\) of the open interval \((b, \infty)\) can be replaced with any larger real number without affecting the conditions. Therefore, without loss of generality, we may assume \(b > 0\).
Let \(x = \frac{1}{t}\). As \(x \to \infty\), \(t\to +0\). By conditions 1 and 2,
\[\lim_{t\to +0} f\!\left(\frac{1}{t}\right) = \lim_{t\to +0} g\!\left(\frac{1}{t}\right) = 0, \qquad \frac{d}{dt}g\!\left(\frac{1}{t}\right) = -\frac{1}{t^2}g'\!\left(\frac{1}{t}\right) \neq 0 \quad \text{for } t \in \left(0, \frac{1}{b}\right),\]
and, by the chain rule,
\[\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)} = \frac{-\frac{1}{t^2}f'\left(\frac{1}{t}\right)}{-\frac{1}{t^2}g'\left(\frac{1}{t}\right)} = \frac{f'\left(\frac{1}{t}\right)}{g'\left(\frac{1}{t}\right)},\]
so that, by condition 3, the right limit \(\lim_{t\to +0}\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)}\) exists. Therefore, by L'Hôpital's rule (1), we have the limit
\[\lim_{x\to \infty}\frac{f(x)}{g(x)} = \lim_{t\to +0}\frac{f\left(\frac{1}{t}\right)}{g\left(\frac{1}{t}\right)} = \lim_{t\to +0}\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)} = \lim_{t\to +0}\frac{f'\left(\frac{1}{t}\right)}{g'\left(\frac{1}{t}\right)} = \lim_{x\to \infty}\frac{f'(x)}{g'(x)}.\]
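As a brief illustration, one may take, for example, \(f(x) = \sin\frac{1}{x}\) and \(g(x) = \frac{1}{x}\) on \((1, \infty)\): both tend to \(0\) as \(x \to \infty\), \(g'(x) = -\frac{1}{x^2} \neq 0\), and
\[\lim_{x\to\infty}\frac{f'(x)}{g'(x)} = \lim_{x\to\infty}\frac{-\frac{1}{x^2}\cos\frac{1}{x}}{-\frac{1}{x^2}} = \lim_{x\to\infty}\cos\frac{1}{x} = 1,\]
so that \(\lim_{x\to\infty} x\sin\frac{1}{x} = 1\).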