Posts

Showing posts from September, 2023

Asymptotic expansion (Taylor approximation)

In many situations, the remainder term in the finite Taylor (Maclaurin) expansion is unimportant. To indicate that some terms are less important than others, we introduce a very convenient notation called the "little o." Using this little-o notation, we define the asymptotic expansion, which is (almost) the same as the finite Taylor expansion except that the remainder term is replaced by a little-o term.

See also: Taylor's theorem

Definition (Landau's asymptotic ("little \(o\)") notation). Let \(f(x)\) and \(g(x)\) be functions defined in a neighborhood of \(x=a\). If \[\lim_{x\to a}\frac{f(x)}{g(x)} = 0,\] then we write \[f(x) = o(g(x)) ~~ (x \to a).\] This "\(o\)" is called Landau's symbol (or "little o"), and this notation is called Landau's notation (or little-o notation).

Remark. When an equation involves Landau's symbol, it does not represent exact equality. □

Example. \(f(x) = o(1) ~ (x \to a)\) means \(\lim_{x\to a}f(x) = 0\). …
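For a quick illustration of this notation, note that \[\lim_{x\to 0}\frac{\sin x - x}{x} = \lim_{x\to 0}\left(\frac{\sin x}{x} - 1\right) = 0,\] so we may write \(\sin x = x + o(x)\) as \(x \to 0\).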

Taylor's theorem

The essence of differentiation is to approximate an arbitrary function by a linear function. We can extend this idea by using higher-order derivatives to obtain better approximations.

Theorem (Taylor's theorem). Let \(f(x)\) be a function of class \(C^{n}\) on an open interval \(I\). Let \(a \in I\). Then for all \(x \in I\), there exists \(c_x\) between \(x\) and \(a\) such that \[ \begin{eqnarray} f(x) &=& f(a) + f'(a)(x-a) + \frac{1}{2}f''(a)(x-a)^2 + \cdots\\ && + \frac{1}{(n-1)!}f^{(n-1)}(a)(x-a)^{n-1} + \frac{1}{n!}f^{(n)}(c_x)(x-a)^{n}. \end{eqnarray} \]

Proof. If \(x=a\), then we can set \(c_x = a\) and the above equation clearly holds. Suppose \(b \in I\), \(b\neq a\). We need to show that \[ \begin{eqnarray} f(b) &=& f(a) + f'(a)(b-a) + \frac{1}{2}f''(a)(b-a)^2 + \cdots\\ && + \frac{1}{(n-1)!}f^{(n-1)}(a)(b-a)^{n-1} + \frac{1}{n!}f^{(n)}(c)(b-a)^{n} \end{eqnarray} \] for some \(c\) between \(a\) and \(b\). …
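To see the statement in a familiar case, take \(f(x) = e^x\) and \(a = 0\). Since every derivative of \(e^x\) is \(e^x\) and \(e^0 = 1\), the theorem guarantees, for each \(x\), some \(c_x\) between \(0\) and \(x\) such that \[e^x = 1 + x + \frac{x^2}{2!} + \cdots + \frac{x^{n-1}}{(n-1)!} + \frac{e^{c_x}}{n!}x^{n}.\]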

Proof of L'Hôpital's rule

To prove L'Hôpital's rule, we first prove Cauchy's mean value theorem, which generalizes the mean value theorem given earlier.

See also: Mean Value Theorem

Theorem (Cauchy's Mean Value Theorem). Let \(f(x)\) and \(g(x)\) be functions that are continuous on \([a,b]\) and differentiable on \((a,b)\). Suppose that \(g'(x) \neq 0\) for all \(x \in (a,b)\). Then, there exists a \(c\in (a,b)\) such that \[\frac{f'(c)}{g'(c)} = \frac{f(b) - f(a)}{g(b) - g(a)}.\]

Proof. Note that the function \(g(x)\) satisfies the conditions of the mean value theorem. Thus, there exists a \(d\in(a,b)\) such that \[g'(d) = \frac{g(b) - g(a)}{b - a}.\] Since \(g'(d) \neq 0\) by assumption, it follows that \(g(b) - g(a) \neq 0\). Now let us define \[h(x) = f(x) - \frac{f(b) - f(a)}{g(b) - g(a)}g(x).\] Then \(h(x)\) is continuous on \([a,b]\) and differentiable on \((a,b)\). Moreover, \[h(a) = \frac{f(a)g(b) - f(b)g(a)}{g(b) - g(a)} = h(b).\] Therefore, by Rolle's theorem, there exists a \(c \in (a,b)\) such that \(h'(c) = 0\). …
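As a quick sanity check of Cauchy's mean value theorem, take \(f(x) = x^3\) and \(g(x) = x^2\) on \([0,1]\), so that \(g'(x) = 2x \neq 0\) on \((0,1)\). The right-hand side is \(\frac{f(1)-f(0)}{g(1)-g(0)} = 1\), while \(\frac{f'(c)}{g'(c)} = \frac{3c^2}{2c} = \frac{3c}{2}\), so the conclusion holds with \(c = \frac{2}{3} \in (0,1)\).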

L'Hôpital's rule

We have been using the following formula without proof so far: \[\lim_{x\to 0}\frac{\sin x}{x} = 1.\] In this example, both \(\sin x\) and \(x\) converge to 0 as \(x \to 0\), so we have something like \(\frac{0}{0}\). In general, if \(\lim_{x\to a}f(x) = \lim_{x\to a}g(x) = 0\) or \(\lim_{x\to a}f(x) = \lim_{x\to a}g(x) = \pm\infty\), the limit of the form \(\lim_{x\to a}\frac{f(x)}{g(x)}\) is called an indeterminate form. L'Hôpital's rule provides a convenient way to calculate such limits (the proof is given in another post).

See also: Proof of L'Hôpital's rule

Theorem (L'Hôpital's rule (1)). Let \(f(x)\) and \(g(x)\) be differentiable functions on the open interval \((a,b)\) that satisfy the following conditions.

1. \(\lim_{x\to a+0}f(x) = \lim_{x\to a+0}g(x) = 0\).
2. For all \(x \in (a,b)\), \(g'(x) \neq 0\).
3. The right limit \(\lim_{x\to a+0}\frac{f'(x)}{g'(x)}\) exists.

Then, the right limit \(\lim_{x\to a+0}\frac{f(x)}{g(x)}\) exists and \[\lim_{x\to a+0}\frac{f(x)}{g(x)} = \lim_{x\to a+0}\frac{f'(x)}{g'(x)}.\] …
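For example, both \(1-\cos x\) and \(x^2\) tend to \(0\) as \(x \to 0+0\), and applying the rule gives \[\lim_{x\to 0+0}\frac{1-\cos x}{x^{2}} = \lim_{x\to 0+0}\frac{\sin x}{2x} = \frac{1}{2},\] where the last step uses the limit \(\frac{\sin x}{x} \to 1\) quoted above.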

Newton's method

Newton's method (the Newton–Raphson method) is a very powerful numerical method for solving nonlinear equations. Suppose we'd like to solve a nonlinear equation \(f(x) = 0\), where \(f(x)\) is a (twice) differentiable nonlinear function. Newton's method generates a sequence of numbers \(c_1, c_2, c_3, \cdots\) that converges to a solution of the equation. That is, if \(\alpha\) is a solution (i.e., \(f(\alpha) = 0\)), then \[\lim_{n\to\infty}c_n = \alpha,\] and this sequence \(\{c_n\}\) is generated by a series of linear approximations of the function \(f(x)\).

Theorem (Newton's method). Let \(f(x)\) be a function that is twice differentiable on an open interval \(I\) that contains the closed interval \([a, b]\) (i.e., \([a,b]\subset I\)) and satisfies the following conditions:

1. \(f(a) < 0\) and \(f(b) > 0\);
2. For all \(x\in [a, b]\), \(f'(x) > 0\) and \(f''(x) > 0\).

Let us define the sequence \(\{c_n\}\) by …
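To get a feel for how such a sequence behaves, consider the familiar Newton update \(c_{n+1} = c_n - \frac{f(c_n)}{f'(c_n)}\), which is the usual way the linear approximations are turned into a sequence, applied to \(f(x) = x^2 - 2\) on \([1,2]\). Here \(f(1) = -1 < 0\), \(f(2) = 2 > 0\), and \(f'(x) = 2x > 0\), \(f''(x) = 2 > 0\) on \([1,2]\). Starting from \(c_1 = 2\), \[c_2 = 1.5, \quad c_3 \approx 1.4167, \quad c_4 \approx 1.41422,\] which rapidly approaches the solution \(\sqrt{2} \approx 1.41421\).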

Mean Value Theorem

The mean value theorem (MVT) says that, for an arc connecting two points on the graph of a function, there is at least one point at which the slope of the tangent line is equal to the slope of the arc.

Theorem (Rolle's theorem). Let \(f(x)\) be a continuous function defined on a closed interval \([a, b]\). Suppose \(f(x)\) is differentiable on the open interval \((a, b)\) and \(f(a) = f(b)\). Then there exists a \(c\in (a,b)\) such that \(f'(c) = 0\).

Proof. If the (global) maximum and minimum are both attained at \(x=a\) and \(x=b\), then \(f(x)\) is constant since \(f(a) = f(b)\) (= maximum = minimum). In this case, \(f'(x)=0\) for all \(x \in (a,b)\). Otherwise, \(f(x)\) attains a maximum or minimum value at some \(c \in (a,b)\), so that \(f'(c) = 0\). ■

Theorem (Mean value theorem). Let \(f(x)\) be a continuous function on \([a,b]\) that is differentiable on \((a,b)\). Then there exists a \(c \in (a,b)\) such that \[f'(c) = \frac{f(b) - f(a)}{b - a}.\]

Proof. Let us define …
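For a concrete instance of the mean value theorem, take \(f(x) = x^2\) on \([0, 2]\). The slope of the arc is \(\frac{f(2) - f(0)}{2 - 0} = 2\), and \(f'(c) = 2c = 2\) gives \(c = 1 \in (0,2)\), where the tangent line is indeed parallel to the arc.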

Local maximum and local minimum

Derivatives can be used to find a function's local extremal values (i.e., local maximum or minimum values). If \(f(x)\) is a continuous function defined on a closed interval \([a,b]\), then it always has a maximum value and a minimum value (cf. the Extreme Value Theorem). Finding a maximum or minimum of a function on its entire domain is a global problem. On the other hand, examining continuity or differentiability around \(x=a\) is a local problem in the sense that it only concerns a neighborhood of the point \(x = a\). We may also define the notions of local maxima and local minima.

Definition (Local maximum and local minimum). Let \(f(x)\) be a function defined on an interval \(I\) and let \(a \in I\). \(f(a)\) is said to be a local maximum value of the function \(f(x)\) if there exists \(\delta > 0\) such that, for all \(x\in (a - \delta, a + \delta) \cap I\), \(x \neq a\) implies \(f(x) < f(a)\). \(x= a\) is said to be a local maximum point if \(f(a)\) is a local maximum value. …
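As a simple example that fits this definition, consider \(f(x) = x^2\) on \(\mathbb{R}\): for any \(\delta > 0\), \(0 < |x - 0| < \delta\) implies \(f(x) = x^2 > 0 = f(0)\), so \(f(0) = 0\) is a local minimum value and \(x = 0\) is a local minimum point (with the inequality reversed relative to the local maximum case).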

Properties of differentiation

We show some basic properties of differentiation, such as linearity, the product rule, the quotient rule, the chain rule, etc. We also introduce higher-order derivatives and differentiability classes.

Theorem (Properties of differentiation). Let \(f(x)\) and \(g(x)\) be differentiable functions on an open interval \(I\).

1. (Linearity) \[\frac{d}{dx}[kf(x) + lg(x)] = kf'(x) + lg'(x),\] where \(k, l\) are constants.
2. (Product rule) \[\frac{d}{dx}[f(x)g(x)] = f'(x)g(x) + f(x)g'(x).\]
3. (Quotient rule) \(\frac{f(x)}{g(x)}\) is differentiable on \(\{x \mid g(x) \neq 0, x\in I\}\) and \[\frac{d}{dx}\left[\frac{f(x)}{g(x)}\right] = \frac{f'(x)g(x) - f(x)g'(x)}{[g(x)]^2}.\]

Proof. 1. Exercise.

2. Since \(f(x)\) is differentiable at \(x=a\), it is continuous at \(x=a\), so \(f(x) \to f(a)\) as \(x \to a\). For any \(a\in I\), \[\begin{eqnarray*} \frac{f(x)g(x) - f(a)g(a)}{x - a} &=& \frac{f(x)g(x) - f(x)g(a) + f(x)g(a) - f(a)g(a)}{x - a}\\ &=& f(x)\cdot\frac{g(x) - g(a)}{x - a} + g(a)\cdot\frac{f(x) - f(a)}{x - a}. \end{eqnarray*}\] …
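For instance, combining the product rule with the derivatives of \(x^2\) and \(\sin x\) gives \[\frac{d}{dx}\left[x^2 \sin x\right] = 2x\sin x + x^2\cos x.\]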

Differentiation

The essence of "differentiation" is approximating arbitrary functions by linear functions.

Definition (Differentiability and derivative). Let \(f(x)\) be a function defined around \(x=a\). We say \(f(x)\) is differentiable at \(x=a\) if the following limit exists: \[\lim_{x\to a}\frac{f(x) - f(a)}{x - a} = \lim_{h\to 0}\frac{f(a+h) - f(a)}{h}.\] This limit value is called the differential coefficient of \(f(x)\) at \(x=a\) and is denoted by \[f'(a) \text{ or } \frac{df}{dx}(a) \text{ or } \frac{d}{dx}f(a).\] If \(f(x)\) is defined on an open interval \(I\) and is differentiable at all points in \(I\), then \(f(x)\) is said to be differentiable on \(I\). In this case, we can associate to each \(c\in I\) the value \(f'(c)\), which defines a function on \(I\). Such a function is called the derivative of \(f(x)\) and is denoted by \[f'(x) \text{ or } \frac{df}{dx}(x) \text{ or } \frac{d}{dx}f(x).\]

Remark. We also use the verb "differentiate" to mean …
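As a worked example of this definition, take \(f(x) = x^2\). For any \(a\), \[\lim_{h\to 0}\frac{(a+h)^2 - a^2}{h} = \lim_{h\to 0}\frac{2ah + h^2}{h} = \lim_{h\to 0}(2a + h) = 2a,\] so \(f(x) = x^2\) is differentiable everywhere and its derivative is \(f'(x) = 2x\).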

More on the limit of functions

There are a few variations of limits.

Definition (Divergence of a function). If the function \(f(x)\) does not converge to any value as \(x \to a\), we say that \(f(x)\) diverges as \(x \to a\). In particular, if the value of \(f(x)\) increases arbitrarily as \(x \to a\), we say that \(f(x)\) diverges to positive infinity and write \[\lim_{x\to a}f(x) = \infty.\] If the value of \(f(x)\) is negative and its absolute value increases arbitrarily, we say that \(f(x)\) diverges to negative infinity and write \[\lim_{x\to a}f(x) = -\infty.\]

Example. Consider the function \[f(x) = \frac{1}{(x - 1)^2}\] defined on \(\mathbb{R}\setminus\{1\}\). We have \[\lim_{x \to 1}f(x) = \infty.\] You should verify this by drawing a graph. □

There are a few variants of the notion of limit.

Definition (Left and right limits). We define the left limit of the function \(f(x)\) at \(x=a\), \[\lim_{x \to a-0}f(x) = \alpha,\] if the following is satisfied: For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that, for all \(x\in\text{dom}(f)\), if \(a - \delta < x < a\) then \(|f(x) - \alpha| < \varepsilon\). …
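A standard example where the left and right limits differ is \(f(x) = \frac{|x|}{x}\), defined on \(\mathbb{R}\setminus\{0\}\): \[\lim_{x\to 0-0}\frac{|x|}{x} = -1, \qquad \lim_{x\to 0+0}\frac{|x|}{x} = 1,\] so the two one-sided limits exist but the (two-sided) limit \(\lim_{x\to 0}\frac{|x|}{x}\) does not.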

Properties of the limit of functions

The limit of functions has properties that are similar to those of the limit of sequences.

Theorem (Properties of the limit of functions). Let \(f(x)\) and \(g(x)\) be functions such that \(\lim_{x\to a}f(x) = \alpha\) and \(\lim_{x\to a}g(x) = \beta\). The following hold.

1. For any constants \(k, l\in\mathbb{R}\), \[\lim_{x\to a}(kf(x) + lg(x)) = k\alpha + l\beta.\]
2. \[\lim_{x\to a}f(x)g(x) = \alpha\beta.\]
3. If \(\beta \neq 0\), \[\lim_{x\to a}\frac{f(x)}{g(x)} = \frac{\alpha}{\beta}.\]

Proof. In the following, \(\varepsilon\) is always an arbitrary positive real number, and \(D_f\) and \(D_g\) are the domains of the functions \(f\) and \(g\), respectively.

1. Let \(M = \max\{|k|, |l|\} + 1\). Then \(\frac{\varepsilon}{2M}\) is also positive and real. Since \(\lim_{x\to a}f(x) = \alpha\), there exists \(\delta_1 > 0\) such that for any \(x\in D_f\), if \(0 < |x - a| < \delta_1\) then \(|f(x) - \alpha| < \frac{\varepsilon}{2M}\). Similarly, there exists \(\delta_2 > 0\) such that for any \(x\in D_g\), if \(0 < |x - a| < \delta_2\) then \(|g(x) - \beta| < \frac{\varepsilon}{2M}\). …
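These properties let us evaluate limits of algebraic combinations from simpler ones. For example, starting from \(\lim_{x\to 2}x = 2\), the product rule gives \(\lim_{x\to 2}x^2 = 4\), and then linearity gives \[\lim_{x\to 2}\left(3x^2 + x\right) = 3\cdot 4 + 2 = 14.\]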

Limit of a univariate function

Let \(f(x)\) be a function. Suppose we move \(x\in\mathbb{R}\) towards \(a\) while keeping \(x \neq a\). If in this case \(f(x)\) approaches a constant value \(\alpha\) irrespective of how \(x\) approaches \(a\), we say that \(f(x)\) converges to \(\alpha\) as \(x \to a\) and write \[\lim_{x\to a}f(x) = \alpha\] or \[f(x) \to \alpha \text{ as \(x \to a\)}.\]

Remark. \(a\) need not belong to \(\text{dom}(f)\) (the domain of \(f\)) as long as \(x\) can approach \(a\) arbitrarily closely. □

But what does this mean exactly? Here is a rigorous definition in terms of what is called the \(\varepsilon\)-\(\delta\) argument.

Definition (Limit of a function). We say that the function \(f(x)\) converges to \(\alpha\) as \(x \to a\) and write \[\lim_{x\to a}f(x) = \alpha\] if the following condition is satisfied: For any \(\varepsilon > 0\), there exists \(\delta > 0\) such that, for all \(x\in \text{dom}(f)\), if \(0 < |x - a| < \delta\) then \(|f(x) - \alpha| < \varepsilon\). …
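As a worked \(\varepsilon\)-\(\delta\) example, let us verify that \(\lim_{x\to 1}(2x + 1) = 3\). Given any \(\varepsilon > 0\), choose \(\delta = \frac{\varepsilon}{2}\). Then \(0 < |x - 1| < \delta\) implies \[|(2x + 1) - 3| = 2|x - 1| < 2\delta = \varepsilon,\] so the condition in the definition is satisfied.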