Any covariance matrix is positive semi-definite: A proof
Let $X = (X_1, X_2, \dots, X_n)^\top$ be a vector of random variables. The covariance matrix $\Sigma$ of $X$ is a square ($n \times n$) matrix whose elements are the covariances between the components of $X$. That is,

$$\Sigma_{ij} = \mathrm{Cov}(X_i, X_j),$$

where

$$\mathrm{Cov}(X_i, X_j) = E\left[(X_i - E[X_i])(X_j - E[X_j])\right].$$
Here, $E[\cdot]$ denotes the expectation. The covariance matrix $\Sigma$ is

- Symmetric. That is, $\Sigma_{ij} = \Sigma_{ji}$ for all $i, j$.
- Positive semi-definite. That is, $a^\top \Sigma a \ge 0$ for every vector $a \in \mathbb{R}^n$.
See also: Positive definite matrix (Wolfram MathWorld)
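Both properties can be checked numerically. The following sketch (using NumPy; the sample data and dimensions are arbitrary choices, not from the original post) estimates a covariance matrix from random samples and verifies that it is symmetric with non-negative eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))   # 1000 samples of a 3-dimensional random vector

Sigma = np.cov(X, rowvar=False)  # sample covariance matrix (3 x 3)

# Symmetry: Sigma[i, j] == Sigma[j, i]
assert np.allclose(Sigma, Sigma.T)

# Positive semi-definiteness: all eigenvalues of a PSD matrix are >= 0.
# eigvalsh is the eigenvalue routine for symmetric matrices.
eigenvalues = np.linalg.eigvalsh(Sigma)
assert np.all(eigenvalues >= -1e-12)  # small tolerance for floating-point error
```

Note that non-negative eigenvalues are an equivalent characterization of positive semi-definiteness for symmetric matrices; the proof below works directly with the quadratic form instead.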
The symmetry is immediate from the definition, since $\mathrm{Cov}(X_i, X_j) = \mathrm{Cov}(X_j, X_i)$.
Now, let us prove that the covariance matrix is positive semi-definite. First, note that the covariance matrix can be expressed as

$$\Sigma = E\left[(X - \mu)(X - \mu)^\top\right] = \int (x - \mu)(x - \mu)^\top p(x)\, dx,$$

where $p$ is the density of $X$, and

$$\mu = E[X] = \int x\, p(x)\, dx$$

is the expectation value of $X$.
Let $a \in \mathbb{R}^n$ be an arbitrary vector. We have

$$a^\top \Sigma a = \int a^\top (x - \mu)(x - \mu)^\top a\, p(x)\, dx = \int \left(a^\top (x - \mu)\right)^2 p(x)\, dx \ge 0,$$

because the integrand is non-negative.
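The key step, rewriting $a^\top \Sigma a$ as an average of squares, can be mirrored numerically. In the sketch below (a NumPy illustration with arbitrary sample data, not part of the original proof), the population expectation is replaced by a sample average, so $a^\top \Sigma a$ equals the mean of $(a^\top(x - \mu))^2$ over the sample and is therefore non-negative:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5000, 4))          # 5000 samples in R^4
mu = X.mean(axis=0)                     # sample mean, an estimate of E[X]
Sigma = (X - mu).T @ (X - mu) / len(X)  # sample covariance matrix

a = rng.normal(size=4)                  # an arbitrary vector

# The quadratic form a^T Sigma a ...
quadratic_form = a @ Sigma @ a
# ... equals the sample average of (a^T (x - mu))^2, a mean of squares.
mean_of_squares = np.mean(((X - mu) @ a) ** 2)

assert np.isclose(quadratic_form, mean_of_squares)
assert quadratic_form >= 0.0
```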
Remark (2022-10-14): In the initial version of this post, I used the eigenvalue decomposition of the covariance matrix to prove the positive semi-definiteness. But that approach is not only unnecessary but overly complicated. The above proof is simpler and more direct.