Let \(f(x)\) and \(g(x)\) be functions that are continuous on \([a,b]\) and differentiable on \((a,b)\). Suppose that \(g'(x) \neq 0\) for all \(x \in (a,b)\). Then, there exists a \(c\in (a,b)\) such that
\[\frac{f(b) - f(a)}{g(b) - g(a)} = \frac{f'(c)}{g'(c)}.\]
Proof. For convenience, we extend the functions by setting \(f(a) = g(a) = 0\), so that \(f(x)\) and \(g(x)\) are defined on \([a, b)\). Assume conditions 1, 2, and 3 hold.
By condition 1, \(f(x)\) and \(g(x)\) are continuous on \([a,b)\). For any \(x\in(a,b)\), \(f(x)\) and \(g(x)\) are continuous on \([a,x]\) and differentiable on \((a,x)\). By condition 2, for all \(t \in (a,x)\), \(g'(t) \neq 0\). Thus, by Cauchy's mean value theorem, there exists a \(c_x \in (a,x)\) such that
\[\frac{f(x)}{g(x)} = \frac{f(x) - f(a)}{g(x) - g(a)} = \frac{f'(c_x)}{g'(c_x)}.\]
Since \(a < c_x < x\), we have \(c_x \to a + 0\) as \(x \to a + 0\) and, by condition 3, \(\lim_{x\to a+0}\frac{f'(c_x)}{g'(c_x)}\) exists. Therefore \(\lim_{x\to a+0}\frac{f(x)}{g(x)}\) also exists and
\[\lim_{x\to a+0}\frac{f(x)}{g(x)} = \lim_{x\to a+0}\frac{f'(c_x)}{g'(c_x)} = \lim_{x\to a+0}\frac{f'(x)}{g'(x)}.\] ■
Let \(L = \lim_{x\to a+0}\frac{f'(x)}{g'(x)}\). By the definition of the right limit, for any \(\varepsilon > 0\), there exists a \(\delta_1 > 0\) such that \(a < x < a + \delta_1\) implies \(\left|\frac{f'(x)}{g'(x)} - L\right| < \varepsilon\).
Since \(\lim_{x\to a+0}g(x) = \pm\infty\), there exists \(\delta_2 > 0\) such that \(a < x < a + \delta_2\) implies \(|g(x)| > 1\).
Let \(\delta' = \min\{\delta_1, \delta_2\}\) and \(d = a + \delta'\). By Cauchy's mean value theorem, for all \(x \in (a, d)\), there exists a \(c_x \in (x, d)\) such that
\[\frac{f(x) - f(d)}{g(x) - g(d)} = \frac{f'(c_x)}{g'(c_x)}.\]
Solving for \(\frac{f(x)}{g(x)}\) gives
\[\frac{f(x)}{g(x)} = \frac{f'(c_x)}{g'(c_x)} + r(x), \quad \text{where } r(x) = \frac{f(d) - g(d)\frac{f'(c_x)}{g'(c_x)}}{g(x)}. \tag{eq:rx}\]
Here, \(f(d)\) and \(g(d)\) are finite constants, and \(\lim_{x\to a+0}\frac{f'(x)}{g'(x)}\) converges to a finite value (by condition 3). Hence, by condition 1' (\(\lim_{x\to a + 0}g(x) = \pm\infty\)), \(\lim_{x\to a+0}r(x) = 0\). In other words, for any \(\varepsilon > 0\), there exists a \(\delta_3 > 0\) such that \(a < x < a + \delta_3\) implies \(|r(x)| < \varepsilon\).
Let \(\delta = \min\{\delta', \delta_3\}\). By Eq. (eq:rx), we have
\[\frac{f(x)}{g(x)} - L = \frac{f'(c_x)}{g'(c_x)} - L + r(x).\]
For any \(x \in (a, a + \delta)\), we have \(c_x \in (x, d) \subset (a, a + \delta_1)\), so that
\[\left|\frac{f(x)}{g(x)} - L\right| \leq \left|\frac{f'(c_x)}{g'(c_x)} - L\right| + |r(x)| < 2\varepsilon.\]
Therefore \(\lim_{x\to a+0}\frac{f(x)}{g(x)} = L\). ■
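As a numerical sanity check (an illustration, not part of the proof), we can compare \(\frac{f(x)}{g(x)}\) with \(\frac{f'(x)}{g'(x)}\) near the point for a standard \(0/0\) example, \(f(x) = \sin x\) and \(g(x) = x\) at \(a = 0\); both ratios should approach the same limit, here 1.

```python
import math

# 0/0 example: f(x) = sin x, g(x) = x, both -> 0 as x -> 0+.
# L'Hopital's rule predicts lim f/g = lim f'/g' = cos(x)/1 -> 1.
def f(x): return math.sin(x)
def g(x): return x
def df(x): return math.cos(x)
def dg(x): return 1.0

for x in [1e-1, 1e-3, 1e-6]:
    print(x, f(x) / g(x), df(x) / dg(x))
# Both columns approach 1 as x -> 0+.
```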
Proof. We prove the case when \(x \to \infty\) and condition 1 holds.
In the open interval \((b, \infty)\), \(b\) can be replaced with any larger real number. Therefore, without loss of generality, we may assume \(b > 0\).
Let \(x = \frac{1}{t}\). As \(x \to \infty\), \(t\to +0\). By conditions 1 and 2, \(f\left(\frac{1}{t}\right)\) and \(g\left(\frac{1}{t}\right)\) are differentiable on \(\left(0, \frac{1}{b}\right)\), and by the chain rule,
\[\frac{d}{dt}g\left(\frac{1}{t}\right) = -\frac{1}{t^2}g'\left(\frac{1}{t}\right) \neq 0,\]
so that, by condition 3, the right limit \(\lim_{t\to +0}\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)}\) exists and
\[\lim_{t\to +0}\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)} = \lim_{t\to +0}\frac{-\frac{1}{t^2}f'\left(\frac{1}{t}\right)}{-\frac{1}{t^2}g'\left(\frac{1}{t}\right)} = \lim_{x\to\infty}\frac{f'(x)}{g'(x)}.\]
Therefore, by L'Hôpital's rule (1), we have the limit
\[\lim_{x\to\infty}\frac{f(x)}{g(x)} = \lim_{t\to+0}\frac{f\left(\frac{1}{t}\right)}{g\left(\frac{1}{t}\right)} = \lim_{t\to+0}\frac{\frac{d}{dt}f\left(\frac{1}{t}\right)}{\frac{d}{dt}g\left(\frac{1}{t}\right)} = \lim_{x\to\infty}\frac{f'(x)}{g'(x)}.\] ■
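For a concrete check of the \(x \to \infty\) case (again an illustration, not part of the proof), take \(f(x) = \ln x\) and \(g(x) = x\): both \(\frac{f(x)}{g(x)}\) and \(\frac{f'(x)}{g'(x)} = \frac{1}{x}\) tend to 0 as \(x\) grows.

```python
import math

# x -> infinity example: f(x) = ln x, g(x) = x.
# f'(x)/g'(x) = (1/x)/1 = 1/x -> 0, so L'Hopital gives lim f/g = 0.
def ratio(x): return math.log(x) / x
def deriv_ratio(x): return (1.0 / x) / 1.0

for x in [1e2, 1e4, 1e8]:
    print(x, ratio(x), deriv_ratio(x))
# Both ratios shrink toward 0 as x grows.
```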
Defining the birth process

Consider a colony of bacteria that never dies. We study the following process, known as the birth process or the Yule process. The colony starts with \(n_0\) cells at time \(t = 0\). Assume that the probability that any individual cell divides in the time interval \((t, t + \delta t)\) is proportional to \(\delta t\) for small \(\delta t\), and that each cell division is independent of the others. Let \(\lambda\) be the birth rate. The probability of a cell division in a population of \(n\) cells during \(\delta t\) is \(\lambda n \delta t\). We assume that the probability that two or more births take place in the time interval \(\delta t\) is \(o(\delta t)\); that is, it can be ignored. Consequently, the probability that no cell divides during \(\delta t\) is \(1 - \lambda n \delta t - o(\delta t)\). Note that this process is an example of a Markov chain with states \(n_0, n_0 + 1, n_0 + 2, \dots\)
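The assumptions above translate directly into a simulation: in a population of \(n\) cells, the waiting time to the next division is exponential with rate \(\lambda n\). Here is a minimal sketch (the function name and parameter choices are illustrative, not from the original text); its sample mean can be compared with the known expectation \(E[n(t)] = n_0 e^{\lambda t}\).

```python
import math
import random

def simulate_yule(n0, lam, t_max, rng):
    """Simulate one path of the Yule (pure birth) process up to time t_max.

    In a population of n cells, the waiting time to the next division
    is exponential with rate lam * n. Returns the population at t_max.
    """
    n, t = n0, 0.0
    while True:
        t += rng.expovariate(lam * n)  # time until the next birth
        if t > t_max:
            return n
        n += 1  # exactly one cell divides at a time

rng = random.Random(42)
n0, lam, t_max = 10, 1.0, 2.0
mean = sum(simulate_yule(n0, lam, t_max, rng) for _ in range(2000)) / 2000
print(mean, n0 * math.exp(lam * t_max))  # sample mean vs E[n(t)] = n0 * e^(lam*t)
```

The sample mean over many paths should land close to \(n_0 e^{\lambda t} \approx 73.9\) for these parameters, though individual paths vary widely.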
In mathematics, we must prove (almost) everything, and the proofs must be done logically and rigorously. Therefore, we need some understanding of basic logic. Here, I will informally explain some rudimentary formal logic.

Definition (Proposition). A proposition is a statement that is either true or false. "True" and "false" are called the truth values and are often denoted \(\top\) and \(\bot\).

Here is an example. "Dr. Akira teaches at UBD." is a statement that is either true or false (we understand the existence of Dr. Akira and UBD), hence a proposition. The following statement is also a proposition, although we don't know whether it is true or false (yet): "Any even number greater than or equal to 4 is equal to a sum of two primes." (See also: Goldbach's conjecture.)

Next, we define several operations on propositions. Note that propositions combined with these operations are again propositions.

Definition (Conjunction, logical "and"). Let \(P\)...
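Truth values and connectives map directly onto booleans in code. As an aside (a sketch, not from the original notes), the conjunction can be tabulated by enumerating all truth-value combinations:

```python
# Truth table for conjunction (logical "and"):
# P and Q is true exactly when both P and Q are true.
rows = [(P, Q, P and Q) for P in (True, False) for Q in (True, False)]
for P, Q, conj in rows:
    print(P, Q, conj)
```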
Sometimes, we may simplify integration by using the product rule of differentiation. This technique is called integration by parts.

Theorem (Integration by parts). Let \(f(x)\) and \(g(x)\) be differentiable functions on an open interval \(I\). Then,

1. \(\int f(x)g'(x)dx = f(x)g(x) - \int f'(x)g(x)dx\);
2. For any \(a, b \in I\), \[\int_a^b f(x)g'(x)dx = \left[f(x)g(x)\right]_a^b - \int_a^b f'(x)g(x)dx.\]

Proof. By the product rule,
\[[f(x)g(x)]' = f'(x)g(x) + f(x)g'(x),\]
so
\[f(x)g'(x) = [f(x)g(x)]' - f'(x)g(x).\]
By integrating both sides, we have the desired results. ■

Example. Let us find \(\int x\cosh x\, dx\).
\[\begin{eqnarray*} \int x\cosh x\, dx &=& \int x(\sinh x)'\,dx \\ &=& x \sinh x - \int 1 \cdot \sinh x\, dx\\ &=& x \sinh x - \cosh x + C. \end{eqnarray*}\]

Example (eg:recur). Let us study how we can compute \[I_n = \int \frac{dx}{(x^2 + 1)^n}\] for \(n\in \mathbb{N}\). Note \[I_{n} = \int \fr...
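We can spot-check the first example numerically (an illustrative check, not from the original notes): by the fundamental theorem of calculus, a numerical integral of \(x\cosh x\) over \([0, 1]\) should match \(\left[x\sinh x - \cosh x\right]_0^1\).

```python
import math

# Antiderivative found by integration by parts: F(x) = x*sinh(x) - cosh(x).
def F(x): return x * math.sinh(x) - math.cosh(x)
def f(x): return x * math.cosh(x)

# Midpoint-rule approximation of the integral of f over [0, 1].
N = 100_000
numeric = sum(f((k + 0.5) / N) for k in range(N)) / N

exact = F(1.0) - F(0.0)  # fundamental theorem of calculus
print(numeric, exact)  # the two values agree to several decimal places
```

Here \(F(1) - F(0) = \sinh 1 - \cosh 1 + 1 = 1 - e^{-1} \approx 0.632\), which the midpoint sum reproduces.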