Introductory university-level calculus, linear algebra, abstract algebra, probability, statistics, and stochastic processes.
Linear independence
We will work with row vectors in \(\mathbb{R}^n\). Consider a set of vectors. If one of these vectors can be expressed as a linear combination of the others, the vectors are said to be linearly dependent. If none of them can be expressed as a linear combination of the others, they are said to be linearly independent. We show that the determinant of a matrix is zero if its row vectors are linearly dependent, and vice versa.
Definition (Linear dependence)
We say that a finite sequence of vectors \(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\) is linearly dependent if there are real numbers \(c_1, c_2, \dots, c_k\) which are not all 0 (i.e., some are non-zero), such that
\[c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k = \mathbf{0}.\]
Why this terminology? Suppose \(c_1 \neq 0\). Then we can rearrange the above equation into
\[\mathbf{v}_1 = -\frac{c_2}{c_1}\mathbf{v}_2 - \frac{c_3}{c_1}\mathbf{v}_3 - \cdots - \frac{c_k}{c_1}\mathbf{v}_k.\]
That is, \(\mathbf{v}_1\) can be expressed as a linear combination of the other vectors. In this sense, the vector \(\mathbf{v}_1\) depends on the other vectors.
Example. and are linearly dependent because
□
Definition (Linear independence)
If a finite sequence of vectors is not linearly dependent, we say that it is linearly independent.
Remark. Therefore, \(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\) are linearly independent if and only if
\[c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k = \mathbf{0}\]
implies \(c_1 = c_2 = \cdots = c_k = 0\). □
Example. and are linearly independent vectors. (Verify this!) □
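Linear independence is also easy to check numerically. Here is a minimal sketch using NumPy; the vectors in it are illustrative stand-ins (not the ones from the example above). It uses the fact that \(k\) row vectors are linearly independent exactly when the matrix formed by stacking them has rank \(k\).

```python
import numpy as np

def linearly_independent(vectors):
    """Return True if the given row vectors are linearly independent.

    Stack the vectors into a matrix; they are independent exactly
    when the rank of that matrix equals the number of vectors.
    """
    A = np.array(vectors, dtype=float)
    return np.linalg.matrix_rank(A) == A.shape[0]

# Illustrative vectors (assumptions, not from the text):
# (1, 0) and (1, 1) are independent, while (1, 2) and (2, 4)
# are dependent since (2, 4) = 2 * (1, 2).
print(linearly_independent([[1, 0], [1, 1]]))   # True
print(linearly_independent([[1, 2], [2, 4]]))   # False
```

Note that `matrix_rank` uses a numerical tolerance internally, so this check is robust to small floating-point errors.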
Theorem
Suppose that \(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_k\) are linearly independent. If there is a vector \(\mathbf{u}\) such that
\[\mathbf{u} = c_1\mathbf{v}_1 + c_2\mathbf{v}_2 + \cdots + c_k\mathbf{v}_k\]
for some \(c_1, c_2, \dots, c_k \in \mathbb{R}\), then this representation is unique.
Remark. What this uniqueness means is the following. If \(\mathbf{u}\) can be represented in terms of another linear combination of \(\mathbf{v}_1, \dots, \mathbf{v}_k\), say,
\[\mathbf{u} = d_1\mathbf{v}_1 + d_2\mathbf{v}_2 + \cdots + d_k\mathbf{v}_k,\]
then we have \(c_i = d_i\) for all \(i = 1, 2, \dots, k\). □
Proof. Suppose there is an alternative representation
\[\mathbf{u} = d_1\mathbf{v}_1 + d_2\mathbf{v}_2 + \cdots + d_k\mathbf{v}_k.\]
Then, subtracting this from \(\mathbf{u} = c_1\mathbf{v}_1 + \cdots + c_k\mathbf{v}_k\), we have
\[(c_1 - d_1)\mathbf{v}_1 + (c_2 - d_2)\mathbf{v}_2 + \cdots + (c_k - d_k)\mathbf{v}_k = \mathbf{0}.\]
Since \(\mathbf{v}_1, \dots, \mathbf{v}_k\) are linearly independent by assumption, it follows that \(c_i - d_i = 0\), and hence \(c_i = d_i\) for all \(i\), as required. ■
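We can see this uniqueness concretely with a small numerical sketch (the vectors below are illustrative assumptions, not taken from the text): when independent vectors form a square invertible matrix, the coefficients of any vector \(\mathbf{u}\) are recovered uniquely by solving a linear system.

```python
import numpy as np

# Illustrative independent row vectors (assumptions, not from the text).
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])

# Build u = 2*v1 + 3*v2, then recover the coefficients.
u = 2 * v1 + 3 * v2

# Solve c1*v1 + c2*v2 = u for (c1, c2): stack the vectors as columns.
A = np.column_stack([v1, v2])
c = np.linalg.solve(A, u)
print(c)  # ≈ [2. 3.]
```

Because `A` is invertible (its columns are linearly independent), `np.linalg.solve` returns exactly one coefficient vector, mirroring the uniqueness proved above.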
Linear independence and matrix determinant
Theorem
Let
\[A = \begin{pmatrix}\mathbf{v}_1 \\ \mathbf{v}_2 \\ \vdots \\ \mathbf{v}_n\end{pmatrix}\]
be an \(n \times n\) matrix where \(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n\) are \(n\)-dimensional row vectors. If these row vectors are linearly dependent, then \(\det A = 0\).
Proof. Since the row vectors are linearly dependent, at least one of them can be expressed as a linear combination of the others. Without loss of generality, assume
\[\mathbf{v}_1 = c_2\mathbf{v}_2 + c_3\mathbf{v}_3 + \cdots + c_n\mathbf{v}_n\]
with \(c_2, \dots, c_n\) not all equal to 0. Then, by the linearity of the determinant in each row, we have
\[\det A = \det\begin{pmatrix}c_2\mathbf{v}_2 + \cdots + c_n\mathbf{v}_n \\ \mathbf{v}_2 \\ \vdots \\ \mathbf{v}_n\end{pmatrix} = \sum_{i=2}^{n} c_i \det\begin{pmatrix}\mathbf{v}_i \\ \mathbf{v}_2 \\ \vdots \\ \mathbf{v}_n\end{pmatrix}.\]
In the last sum, each determinant contains two identical rows (the first row \(\mathbf{v}_i\) is equal to one of \(\mathbf{v}_2, \dots, \mathbf{v}_n\)). Therefore, by the property of determinants, all the terms are 0. ■
Example. Consider the matrix determinant
We know that the row vectors are linearly dependent (see the example above), and hence, the determinant is zero. □
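The theorem is also easy to verify numerically. The following sketch uses a made-up \(3 \times 3\) matrix (an assumption, since the original example's entries are not reproduced here) whose third row is a linear combination of the first two, so the determinant should vanish.

```python
import numpy as np

# Illustrative rows (assumptions): v3 = v1 + 2*v2, so the rows
# are linearly dependent and det(A) should be 0.
v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2

A = np.vstack([v1, v2, v3])
print(np.linalg.det(A))  # ≈ 0 (up to floating-point error)
```

In floating-point arithmetic the computed determinant may be a tiny nonzero number, so in practice one compares it against a small tolerance rather than testing exact equality with 0.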
Corollary
Let
\[A = \begin{pmatrix}\mathbf{v}_1 \\ \mathbf{v}_2 \\ \vdots \\ \mathbf{v}_n\end{pmatrix}\]
be an \(n \times n\) matrix where \(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n\) are \(n\)-dimensional row vectors. If \(\det A \neq 0\), then these row vectors are linearly independent.
Proof. This is the contrapositive of the above theorem. ■
The converse is also true: if the row vectors are linearly independent, then \(\det A \neq 0\). However, proving this is beyond the scope of this module; see a textbook on linear algebra (the proof requires the notion of the rank of a matrix).
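Although the proof of the converse is omitted, the equivalence is easy to check numerically. In this sketch (the matrix entries are illustrative assumptions), a nonzero determinant goes hand in hand with full rank, i.e., linearly independent rows.

```python
import numpy as np

# Illustrative matrix (assumption, not from the text).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

d = np.linalg.det(A)
full_rank = np.linalg.matrix_rank(A) == A.shape[0]
print(d, full_rank)  # nonzero determinant, and the rows are independent
```

Full rank (rank equal to \(n\) for an \(n \times n\) matrix) is exactly the linear-algebra notion the omitted proof relies on.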