Posts

A continuous function on a closed interval is uniformly continuous

The notion of uniform continuity is a "stronger" version of (simple) continuity. If a function is uniformly continuous, it is continuous, but the converse does not hold in general (a continuous function need not be uniformly continuous). However, a function that is continuous on a closed interval is always uniformly continuous on that interval.

Definition (Uniform continuity). A function \(f(x)\) on an interval \(I\) is said to be uniformly continuous on \(I\) if it satisfies the following condition: for any \(\varepsilon > 0\), there exists \(\delta > 0\) such that, for all \(x, y\in I\), \(|x - y| < \delta\) implies \(|f(x) - f(y)| < \varepsilon\).

In logical form, this condition is expressed as
\[ \forall \varepsilon > 0, \exists \delta > 0, \forall x,y\in I ~ (|x-y| < \delta \implies |f(x) - f(y)| < \varepsilon). \]

Remark. Compare the above condition for uniform continuity with the condition for a continuous function…
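The ε–δ condition above can be illustrated numerically (this is a sketch, not a proof). For \(f(x) = x^2\) on \([0,1]\), \(|f(x)-f(y)| = |x+y|\,|x-y| \le 2|x-y|\), so \(\delta = \varepsilon/2\) works uniformly over the whole interval; the helper `works_uniformly` below is a hypothetical name checking the implication on a grid:

```python
# Numerical illustration (not a proof): for f(x) = x^2 on [0, 1],
# |f(x) - f(y)| = |x + y| * |x - y| <= 2 |x - y|, so delta = eps / 2
# satisfies the definition of uniform continuity.
def f(x):
    return x * x

def works_uniformly(eps, delta, n=200):
    """Check that |x - y| < delta implies |f(x) - f(y)| < eps
    for all pairs on a uniform grid of [0, 1]."""
    pts = [i / n for i in range(n + 1)]
    return all(
        abs(f(x) - f(y)) < eps
        for x in pts for y in pts
        if abs(x - y) < delta
    )

# delta = eps / 2 passes; a delta that is too large fails.
assert works_uniformly(0.1, 0.05)
assert not works_uniformly(0.1, 0.2)
```

The same grid check with a fixed \(\delta\) would fail for, e.g., \(f(x) = 1/x\) near \(0\), which is continuous but not uniformly continuous on \((0,1)\).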

Marginal distributions of the multivariate normal distribution

Marginal distributions of a multivariate normal distribution are also normal distributions. Let's prove this.

See also: Multivariate normal distribution [Wikipedia]

The density function of a multivariate normal distribution is given as
\[ f(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^n|\Sigma|}}\exp\left[-\frac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\top}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right], \]
where \(\mathbf{x}\in\mathbb{R}^n\) is the random vector, \(\boldsymbol{\mu}\) is the mean vector, and \(\Sigma\) is the covariance matrix. By the change of variables \(\mathbf{x} - \boldsymbol{\mu} \mapsto \mathbf{x}\), we may assume the mean is zero without loss of generality. So, in the following, we only consider
\[ f(\mathbf{x}) = \frac{1}{\sqrt{(2\pi)^n|\Sigma|}}\exp\left[-\frac{1}{2}\mathbf{x}^{\top}\Sigma^{-1}\mathbf{x}\right]. \]
We need the following theorems from linear algebra.

Theorem 1. Let \(A\) be an \(n\times n\) regular matrix, \(D\) an \(m\times m\) regular matrix, and \(B\)…
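The claim can be checked by simulation (a sketch under assumed parameters, not part of the proof). For a bivariate normal with covariance \(\begin{pmatrix}1 & \rho\\ \rho & 1\end{pmatrix}\), the second coordinate \(Y = \rho Z_1 + \sqrt{1-\rho^2}\,Z_2\) should be marginally \(N(0,1)\):

```python
# Monte Carlo illustration: the second coordinate of a bivariate normal
# with covariance [[1, rho], [rho, 1]] is marginally N(0, 1).
import random

random.seed(0)
rho = 0.8
n = 100_000
ys = []
for _ in range(n):
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    # Apply the Cholesky factor of [[1, rho], [rho, 1]] to (z1, z2):
    # the pair (z1, y) then has the desired joint distribution.
    y = rho * z1 + (1.0 - rho ** 2) ** 0.5 * z2
    ys.append(y)

mean = sum(ys) / n
var = sum((v - mean) ** 2 for v in ys) / n
# The sample marginal mean and variance should be close to 0 and 1.
assert abs(mean) < 0.05 and abs(var - 1.0) < 0.05
```
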

Any continuous function is Riemann-integrable on a closed interval

The goal of this post is to prove one of the practical foundations of Riemann integrals.

Theorem. A function that is continuous on \([a,b]\) is integrable on \([a,b]\).

See also: Riemann integral
See also: Uniformly continuous functions

Proof. Let \(f(x)\) be continuous on \([a,b]\). Since a continuous function on a closed interval is uniformly continuous, \(f(x)\) is uniformly continuous on \([a,b]\). Therefore, for any \(\varepsilon > 0\), there exists a \(\delta > 0\) such that, for all \(x, y\in [a,b]\), \(|x-y| < \delta\) implies \(|f(x) - f(y)| < \frac{\varepsilon}{b - a}\). Let \(\Delta\) be a partition of \([a,b]\) such that \(a = x_0 < x_1 < \cdots < x_{n-1} < x_n = b\) and its mesh is less than \(\delta\) (i.e., \(x_{i+1} - x_{i} < \delta\) for all \(i = 0, 1, \cdots, n-1\)). Then, for each \(i = 0, 1, \cdots, n-1\), if \(x, y \in [x_{i}, x_{i+1}]\), then \(|f(x) - f(y)| < \frac{\varepsilon}{b - a}\). Hence, if we define…
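The theorem guarantees that Riemann sums of a continuous function converge as the mesh shrinks. A minimal numerical sketch, using \(f(x) = \sin x\) on \([0, \pi]\) (whose exact integral is \(2\)):

```python
# Numerical check: Riemann sums of the continuous function sin(x) on
# [0, pi] approach the exact integral 2 as the mesh (b - a) / n shrinks.
import math

def riemann_sum(f, a, b, n):
    """Left Riemann sum of f over [a, b] with n equal subintervals."""
    h = (b - a) / n
    return sum(f(a + i * h) for i in range(n)) * h

approx = riemann_sum(math.sin, 0.0, math.pi, 10_000)
assert abs(approx - 2.0) < 1e-3
```
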

Applications of integrals (2): Gamma and Beta functions

The use of integrals is not limited to computing areas and lengths. Integrals are also helpful for defining new functions. Here, we study two special functions: the Gamma and Beta functions. These functions are widely used in various fields of science and engineering, as well as in statistics.

Gamma function

Lemma. For any \(s > 0\), the improper integral \(\int_0^{\infty}e^{-x}x^{s-1}dx\) converges.

Proof. Let \(f(x) = e^{-x}x^{s-1}\). We decompose the given integral into \(\int_{0}^{1}f(x)dx\) and \(\int_1^{\infty}f(x)dx\) and show that both converge. First, consider \(f(x)e^{\frac{x}{2}} = \frac{x^{s-1}}{e^{\frac{x}{2}}}\) on \([1,\infty)\). If we take \(n\in\mathbb{N}\) such that \(n \geq s-1\), then \[f(x)e^{\frac{x}{2}} = \frac{x^{s-1}}{e^{\frac{x}{2}}} \leq \frac{x^{n}}{e^{\frac{x}{2}}}.\] Applying L'Hôpital's rule \(n\) times, we see that \(\lim_{x\to\infty}\frac{x^n}{e^{\frac{x}{2}}} = 0\). Hence \(\lim_{x\to\infty}f(x)e^{\frac{x}{2}} = 0\). In particular,…
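The convergence claimed by the lemma can be seen numerically. For integer \(s\) the integral has a known value, \(\Gamma(s) = (s-1)!\), which a simple quadrature recovers; the tail truncation at \(x = 50\) and the midpoint rule are illustrative choices, not part of the post:

```python
# Numerical check of the lemma for s = 5: the improper integral
# of e^{-x} x^{s-1} converges, and Gamma(5) = 4! = 24.
# We truncate the tail at x = 50 (the integrand there is negligible).
import math

def gamma_integral(s, upper=50.0, n=200_000):
    """Midpoint-rule approximation of the integral of e^{-x} x^{s-1}
    over (0, upper]; the midpoint rule avoids evaluating at x = 0."""
    h = upper / n
    return sum(
        math.exp(-x) * x ** (s - 1)
        for x in (h * (i + 0.5) for i in range(n))
    ) * h

assert abs(gamma_integral(5) - math.factorial(4)) < 1e-3
assert abs(gamma_integral(1) - 1.0) < 1e-3
```
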

Applications of integrals (1): Length of a curve

As an application of integrals, we consider the length of a curve. Specifically, we consider curves in 2-dimensional space defined parametrically. Let \(x(t)\) and \(y(t)\) be \(C^1\) functions defined on an interval containing the closed interval \([a,b]\). As \(t\) moves in \([a,b]\), the point \((x(t), y(t))\) in \(\mathbb{R}^2\) moves smoothly, drawing a curve. Let us denote this curve by \(C\). Let \(P = (x(a), y(a))\) and \(Q = (x(b), y(b))\) be the end points of the curve \(C\). We want to measure the "length" of the curve \(C\). But what is the length of a curve, anyway? We do know how to calculate the length of a line segment (Pythagorean theorem). So, let us approximate the curve by line segments. Consider a partition of the closed interval \([a,b]\): \[\Delta: a = t_0 < t_1 < t_2 < \cdots < t_{n-1} < t_n = b.\] Then \(P_0 = (x(t_0), y(t_0)) = P\), \(P_1 = (x(t_1), y(t_1))\), \(P_2 = (x(t_2), y(t_2))\), \(\cdots\), \(P_{n-1} = (x(t_{n-1}), y(t_{n-1}))\)…
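The polyline approximation described above can be sketched numerically. Taking the unit circle \(x(t) = \cos t\), \(y(t) = \sin t\) on \([0, 2\pi]\) as an example curve, the sum of inscribed segment lengths approaches the known circumference \(2\pi\):

```python
# Approximate the length of the parametric curve (x(t), y(t)) by the
# total length of inscribed line segments over a partition of [a, b].
import math

def polyline_length(x, y, a, b, n):
    """Sum of segment lengths P_i P_{i+1} over a uniform partition."""
    ts = [a + (b - a) * i / n for i in range(n + 1)]
    return sum(
        math.hypot(x(t1) - x(t0), y(t1) - y(t0))
        for t0, t1 in zip(ts, ts[1:])
    )

# Unit circle: the polyline length should approach 2 * pi from below.
length = polyline_length(math.cos, math.sin, 0.0, 2 * math.pi, 10_000)
assert abs(length - 2 * math.pi) < 1e-4
```

Note that the inscribed polygon always underestimates the true length, consistent with the supremum definition the post builds toward.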

Improper integrals

As we have seen so far, the definite integral \(\int_{a}^{b}f(x)dx\) is defined for a (continuous) function \(f(x)\) on a bounded closed interval \([a,b]\). It is not defined on semi-open intervals such as \((a, b]\) or \([a,b)\), or on unbounded intervals such as \([a,\infty)\) or \((-\infty, \infty)\). Nevertheless, we may extend the definition of the definite integral to deal with such cases. For example, a function \(f(x)\) on \([a, b)\) is not defined at \(x = b\), but if the left limit \(\lim_{t\to b-0}\int_a^tf(x)dx\) exists, we may define it to be \(\int_a^bf(x)dx\). Such an extended notion of the integral is called the improper integral.

Integration on semi-open intervals

For a continuous function \(f(x)\) on the semi-open interval \([a, b)\), if the limit \[\lim_{t\to b - 0}\int_a^tf(x)dx = \lim_{\varepsilon\to +0}\int_a^{b - \varepsilon}\!f(x)dx\] exists, we say that the improper integral \(\int_a^bf(x)dx\) converges. Similarly, for a continuous function \(f(x)\) on \((a, b]\)…
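As a concrete sketch of this limit process (my example, not the post's): \(f(x) = 1/\sqrt{x}\) on \((0, 1]\) is unbounded near \(0\), yet \(\int_{\varepsilon}^{1} x^{-1/2}dx = 2(1 - \sqrt{\varepsilon}) \to 2\) as \(\varepsilon \to +0\), so the improper integral converges to \(2\):

```python
# The improper integral of 1/sqrt(x) on (0, 1] as the limit, for
# eps -> 0+, of the proper integral on [eps, 1]; the exact value is 2.
import math

def integral_from_eps(eps):
    """Closed form of the proper integral of x^{-1/2} over [eps, 1]:
    the antiderivative is 2 * sqrt(x)."""
    return 2.0 * (1.0 - math.sqrt(eps))

# Shrinking eps drives the proper integrals up toward the limit 2.
values = [integral_from_eps(10.0 ** -k) for k in range(1, 8)]
assert all(b > a for a, b in zip(values, values[1:]))
assert abs(values[-1] - 2.0) < 1e-3
```
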

Computing integrals (4): Rational functions

Recall that a rational function is a function of the form \(p(x)/q(x)\), where \(p(x)\) and \(q(x)\) are polynomial functions with real coefficients. We now consider the integration of such functions in general. A polynomial function \[f(x) = a_nx^n + a_{n-1}x^{n-1} + \cdots + a_1x + a_0 ~~ (a_0, a_1, \cdots, a_n\in\mathbb{R})\] has an anti-derivative \[\int f(x)dx = \frac{a_n}{n+1}x^{n+1} + \frac{a_{n-1}}{n}x^{n} + \cdots + \frac{a_1}{2}x^2 + a_0x.\] Thus, the anti-derivative of a polynomial function is again a polynomial function. What about rational functions? In general, the anti-derivative of a rational function may not be a rational function, but a sum of rational functions, logarithms, and inverse trigonometric functions. We use the following lemma (proof omitted) to show this.

Lemma (Partial fraction decomposition). Any rational function can be decomposed into a finite sum of rational functions of the following three forms: polynomials; \[\frac{k}{(x+a)^n},\] where \(a, k\in\mathbb{R}\),…
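A small worked example of the lemma in action (my choice of integrand, not the post's): \(\frac{1}{x^2-1} = \frac{1}{2}\left(\frac{1}{x-1} - \frac{1}{x+1}\right)\), so an anti-derivative on \((1,\infty)\) is \(\frac{1}{2}\ln\frac{x-1}{x+1}\), which we verify against direct numerical integration:

```python
# Partial-fraction check: 1/(x^2 - 1) = (1/2) * (1/(x - 1) - 1/(x + 1)),
# so an antiderivative on (1, inf) is F(x) = (1/2) * ln((x - 1)/(x + 1)).
import math

def F(x):
    """Antiderivative obtained from the partial fraction decomposition."""
    return 0.5 * math.log((x - 1.0) / (x + 1.0))

def midpoint_integral(f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# Compare F(3) - F(2) with direct quadrature of the original integrand.
exact = F(3.0) - F(2.0)
approx = midpoint_integral(lambda x: 1.0 / (x * x - 1.0), 2.0, 3.0)
assert abs(approx - exact) < 1e-8
```
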