Euclidean spaces

We would like to study multivariate functions (i.e., functions of many variables), continuous multivariate functions in particular. To define continuity, we need a measure of "closeness" between points. One measure of closeness is the Euclidean distance. The set \(\mathbb{R}^n\) (with \(n \in \mathbb{N}\)) with the Euclidean distance function is called a Euclidean space. This is the space where our functions of interest live.

The real line is a geometric representation of \(\mathbb{R}\), the set of all real numbers. That is, each \(a \in \mathbb{R}\) is represented as the point \(a\) on the real line.

The coordinate plane, or the \(x\)-\(y\) plane, is a geometric representation of \(\mathbb{R}^2\), the set of all pairs of real numbers. Each pair of real numbers \((a, b)\) is visualized as the point \((a, b)\) in the plane.

Remark. Recall that \(\mathbb{R}^2 = \mathbb{R}\times\mathbb{R} = \{(x, y) \mid x, y \in \mathbb{R}\}\) is the Cartesian product of \(\mathbb{R}\) with itself; its elements, such as \((x, y)\), are ordered pairs of real numbers.

The coordinate space, or the \(x\)-\(y\)-\(z\) space, is a geometric representation of \(\mathbb{R}^3\), the set of all triples of real numbers.

Remark. Recall that \(\mathbb{R}^3 = \mathbb{R}\times\mathbb{R}\times\mathbb{R} = \{(x, y, z) \mid x, y, z \in \mathbb{R}\}\) is also a Cartesian product; its elements, such as \((x, y, z)\), are ordered triples of real numbers.

We can naturally extend this idea. For any \(n\in\mathbb{N}\), we can consider an \(n\)-tuple of real numbers \((a_1, a_2, \cdots, a_n)\) and the set \(\mathbb{R}^n\) of all such \(n\)-tuples. We can "visualize" each element of \(\mathbb{R}^n\) as a "point" in the \(n\)-dimensional space. For example, \((a_1, a_2, \cdots, a_n) \in \mathbb{R}^n\) is the point whose \(x_1\)-coordinate is \(a_1\), whose \(x_2\)-coordinate is \(a_2\), and so on.

Univariate functions (i.e., functions with one variable) are often defined on an interval. We would like to extend the notion of an interval to \(\mathbb{R}^n\). But first, we need the notion of distance.

Definition (Euclidean distance)

  Let \(x = (x_1, x_2, \cdots, x_n)\) and \(y = (y_1, y_2, \cdots, y_n)\) be points in \(\mathbb{R}^n\). The (Euclidean) distance \(d(x,y)\) between \(x\) and \(y\) is defined as \[d(x,y) = \sqrt{\sum_{i=1}^{n}(x_i - y_i)^2}.\tag{Eq:Distance}\]

Remark. The distance \(d(x,y)\) is also denoted as \(\|x - y\|\) or \(\|x - y \|_2\). Recall the definition of the length of an \(n\)-dimensional vector.
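For concreteness, here is a minimal Python sketch of the distance (Eq:Distance); the function name euclidean_distance is chosen for illustration and is not part of the text.

```python
import math

def euclidean_distance(x, y):
    """Euclidean (L2) distance between two points of R^n given as coordinate sequences."""
    if len(x) != len(y):
        raise ValueError("points must have the same dimension")
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# d((1, 2, 3), (4, 6, 3)) = sqrt(3^2 + 4^2 + 0^2) = 5
print(euclidean_distance((1, 2, 3), (4, 6, 3)))  # 5.0
```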

Definition (Euclidean space)

The set \(\mathbb{R}^n\) equipped with the distance defined in (Eq:Distance) is called the \(n\)-dimensional Euclidean space.
Remark. Sometimes, we say the pair \((\mathbb{R}^n, d)\), where \(d\) is the distance function, is the Euclidean space.
Remark. In mathematics, we generally use the term space to mean a set with some "structure." In the case of Euclidean space, the "structure" is specified by the distance. Other examples of spaces include vector spaces, probability spaces, topological spaces, Hilbert spaces, etc.

You might have learned the following lemma in Linear Algebra:

Lemma (Cauchy-Schwarz inequality)

For \(a = (a_1, a_2, \cdots, a_n), b = (b_1, b_2, \cdots, b_n) \in \mathbb{R}^n\), we have
\[\left|\sum_{i=1}^{n}a_ib_i\right| \leq \sqrt{\sum_{i=1}^{n}a_i^2}\sqrt{\sum_{i=1}^{n}b_i^2}.\tag{Eq:CauchySchwarzIneq}\]
Proof. The result is trivial if \(a = (0, 0, \cdots, 0)\), as both sides are then zero. Suppose \(a \neq (0, 0, \cdots, 0)\). For any \(t \in \mathbb{R}\), 
\[\begin{eqnarray} 0 &\leq & \sum_{i=1}^{n}(a_it+b_i)^2\\ &=& \left(\sum_{i=1}^{n}a_i^2\right)t^2 + 2\left(\sum_{i=1}^{n}a_ib_i\right)t + \left(\sum_{i=1}^{n}b_i^2\right).   \end{eqnarray}\]
Since \(a \neq (0, 0, \cdots, 0)\), the leading coefficient \(\sum_{i=1}^{n}a_i^2\) is positive, so the last expression is a quadratic polynomial in \(t\). Being non-negative for all \(t\), it has at most one real root; hence its discriminant is non-positive:
\[\left(\sum_{i=1}^{n}a_ib_i\right)^2 - \left(\sum_{i=1}^{n}a_i^2\right)\left(\sum_{i=1}^{n}b_i^2\right) \leq 0,\]
from which we conclude that
\[\left(\sum_{i=1}^{n}a_ib_i\right)^2 \leq \left(\sum_{i=1}^{n}a_i^2\right)\left(\sum_{i=1}^{n}b_i^2\right).\]
Taking the square root of both sides, we have (Eq:CauchySchwarzIneq). ■
Remark. If we regard \(a, b\in\mathbb{R}^n\) as vectors in a vector space with the scalar product (i.e., dot product) \(\langle a, b\rangle\) and induced norm \(\|\cdot\|\), the Cauchy-Schwarz inequality reads:
\[|\langle a, b\rangle| \leq \|a\|\|b\|.\]
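The inequality is also easy to check numerically. The following Python sketch (the helper names dot and norm are mine, not from the text) tests (Eq:CauchySchwarzIneq) on random vectors, with a small tolerance for floating-point rounding.

```python
import math
import random

def dot(a, b):
    """Scalar (dot) product of two vectors in R^n."""
    return sum(ai * bi for ai, bi in zip(a, b))

def norm(a):
    """Euclidean norm ||a||."""
    return math.sqrt(dot(a, a))

# Check |<a, b>| <= ||a|| * ||b|| on random vectors in R^5.
for _ in range(1000):
    a = [random.uniform(-10, 10) for _ in range(5)]
    b = [random.uniform(-10, 10) for _ in range(5)]
    assert abs(dot(a, b)) <= norm(a) * norm(b) + 1e-9  # tolerance for rounding error
print("Cauchy-Schwarz inequality held in all trials.")
```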

Theorem (Distance axioms)

  1. (Non-negativity) For any \(x, y \in \mathbb{R}^n\), \(d(x,y) \geq 0\). Moreover, \(d(x,y) = 0\) if and only if \(x = y\).
  2. (Symmetry) For any \(x, y\in\mathbb{R}^n\), \(d(x,y) = d(y,x)\).
  3. (Triangle inequality) For \(x,y,z\in \mathbb{R}^n\), \[d(x,z) \leq d(x,y) + d(y,z).\]
Proof. (1) and (2) are trivial. We show (3) only.
Let \(x = (x_1, x_2, \cdots, x_n), y = (y_1, y_2, \cdots, y_n), z=(z_1, z_2,\cdots,z_n)\), and \(a_i = x_i - y_i, b_i = y_i - z_i\). Note that \(x_i - z_i = (x_i - y_i) + (y_i - z_i) = a_i + b_i\). Then, we need to prove
\[\sqrt{\sum_{i=1}^{n}(a_i + b_i)^2} \leq \sqrt{\sum_{i=1}^{n}a_i^2} + \sqrt{\sum_{i=1}^{n}b_i^2}.\] 
Since both sides are non-negative, this is equivalent to the inequality obtained by squaring both sides:
\[\sum_{i=1}^{n}(a_i + b_i)^2 \leq \sum_{i=1}^{n}a_i^2 + \sum_{i=1}^{n}b_i^2 + 2\sqrt{\sum_{i=1}^{n}a_i^2}\sqrt{\sum_{i=1}^{n}b_i^2}.\tag{Eq:Ineq1}\]
The left-hand side is
\[\sum_{i=1}^{n}(a_i + b_i)^2 = \sum_{i=1}^{n}a_i^2 + \sum_{i=1}^{n}b_i^2 + 2\sum_{i=1}^{n}a_ib_i.\] Canceling common terms, the above inequality (Eq:Ineq1) to be proved becomes
\[\sum_{i=1}^{n}a_ib_i \leq \sqrt{\sum_{i=1}^{n}a_i^2}\sqrt{\sum_{i=1}^{n}b_i^2}.\]
Since \(\sum_{i=1}^{n}a_ib_i \leq \left|\sum_{i=1}^{n}a_ib_i\right|\), this follows immediately from the Cauchy-Schwarz inequality (Eq:CauchySchwarzIneq) in the above Lemma. Tracing this argument backward, the triangle inequality follows. ■
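As a sanity check rather than a proof, the following Python sketch spot-checks the triangle inequality on randomly generated points (the function name d mirrors the notation above).

```python
import math
import random

def d(x, y):
    """Euclidean distance on R^n."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# Spot-check d(x, z) <= d(x, y) + d(y, z) on random triples of points in R^4.
for _ in range(1000):
    x, y, z = ([random.uniform(-5, 5) for _ in range(4)] for _ in range(3))
    assert d(x, z) <= d(x, y) + d(y, z) + 1e-9  # tolerance for rounding error
print("Triangle inequality held in all trials.")
```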

Remark. For any set \(S\), a function \(d: S\times S \to \mathbb{R}\) satisfying the above three properties may be regarded as a distance function on \(S\). In other words, these properties serve as axioms for defining a distance on an arbitrary set (when such a function exists). In general, a set \(S\) equipped with a distance function \(d\) is called a metric space.

Example. For \(x = (x_1, x_2, \cdots, x_n), y = (y_1, y_2, \cdots, y_n) \in \mathbb{R}^n\), let us define the following function:
\[d_1(x,y) = \sum_{i=1}^{n}|x_i - y_i|.\] This function satisfies all the distance axioms. Thus, \((\mathbb{R}^n, d_1)\) is a metric space. The function \(d_1\) is sometimes called the L1 distance. In comparison, the Euclidean distance is also called the L2 distance.
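A small Python comparison of the two distances on the same pair of points may help (the function names are illustrative): for \(x = (0,0)\) and \(y = (3,4)\), the L1 distance is \(7\) while the L2 distance is \(5\).

```python
import math

def l1_distance(x, y):
    """L1 (taxicab) distance on R^n."""
    return sum(abs(xi - yi) for xi, yi in zip(x, y))

def l2_distance(x, y):
    """Euclidean (L2) distance on R^n."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

# For x = (0, 0) and y = (3, 4): L1 distance is 3 + 4 = 7, L2 distance is sqrt(9 + 16) = 5.
x, y = (0.0, 0.0), (3.0, 4.0)
print(l1_distance(x, y))  # 7.0
print(l2_distance(x, y))  # 5.0
```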

Definition (\(\varepsilon\)-neighborhood)

For \(x \in \mathbb{R}^n\) and \(\varepsilon > 0\), the \(\varepsilon\)-neighborhood of \(x\) is defined as
\[N_{\varepsilon}(x) = \{y \in \mathbb{R}^n\mid d(x, y) < \varepsilon\}.\]
\(N_{\varepsilon}(x)\) is also called the open ball with radius \(\varepsilon\) centered at \(x\).

Example
  • In \(\mathbb{R}\), \(N_{\varepsilon}(x) = (x - \varepsilon, x + \varepsilon)\) is an open interval.
  • In \(\mathbb{R}^2\), let \(a = (a_1, a_2)\). Then, \(N_{\varepsilon}(a) = \{(x_1,x_2) \in \mathbb{R}^2 \mid (x_1 - a_1)^2 + (x_2 - a_2)^2 < \varepsilon^2\}\) is the interior of the circle with radius \(\varepsilon\) centered at \(a\).
  • In \(\mathbb{R}^3\), \(N_{\varepsilon}(x)\) is the interior of the sphere with radius \(\varepsilon\) centered at \(x\).
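A minimal Python sketch of the membership test \(d(x, y) < \varepsilon\) (the function name is hypothetical) illustrates that boundary points are excluded from the open ball.

```python
import math

def in_epsilon_neighborhood(y, x, eps):
    """Return True if y lies in N_eps(x), i.e. if the Euclidean distance d(x, y) < eps."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y))) < eps

# In R^2, N_1((0, 0)) is the open unit disk; boundary points are excluded.
print(in_epsilon_neighborhood((0.5, 0.5), (0.0, 0.0), 1.0))  # True  (distance ~ 0.707 < 1)
print(in_epsilon_neighborhood((1.0, 0.0), (0.0, 0.0), 1.0))  # False (distance = 1, not < 1)
```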

