
decomposed into an expression using only real-valued eigenvectors and eigenvalues:

    A = QΛQ⊤,

where Q is an orthogonal matrix composed of eigenvectors of A, and Λ is a diagonal
matrix, with λ_{i,i} being the eigenvalue corresponding to Q_{:,i}.
While any real symmetric matrix A is guaranteed to have an eigendecomposition,
the eigendecomposition is not unique. If any two or more eigenvectors share the same
eigenvalue, then any set of orthogonal vectors lying in their span are also eigenvectors
with that eigenvalue, and we could equivalently choose a Q using those eigenvectors
instead. By convention, we usually sort the entries of Λ in descending order. Under this
convention, the eigendecomposition is unique only if all of the eigenvalues are unique.
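As a minimal NumPy sketch of this decomposition (the matrix A below is an arbitrary illustrative example), we can factor a real symmetric matrix and verify that A = QΛQ⊤ with Q orthogonal:

```python
import numpy as np

# An arbitrary real symmetric matrix for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eigh is specialized for symmetric matrices; it returns
# eigenvalues in ascending order and orthonormal eigenvectors as columns.
eigvals, Q = np.linalg.eigh(A)
Lam = np.diag(eigvals)

# Reconstruct A = Q @ Lam @ Q.T and confirm Q is orthogonal.
A_rec = Q @ Lam @ Q.T
print(np.allclose(A, A_rec))             # True
print(np.allclose(Q.T @ Q, np.eye(2)))   # True
```

Note that `eigh` sorts eigenvalues in ascending rather than the descending order used by the convention above; reversing the columns of Q and the entries of Λ together gives the same decomposition.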
The eigendecomposition of a matrix tells us many useful facts about the matrix. The
matrix is singular if and only if any of the eigenvalues are 0. The eigendecomposition
can also be used to optimize quadratic expressions of the form f(x) = x⊤Ax subject
to ||x||₂ = 1. Whenever x is equal to an eigenvector of A, f takes on the value of
the corresponding eigenvalue. The maximum value of f within the constraint region
is the maximum eigenvalue, and its minimum value within the constraint region is the
minimum eigenvalue.
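This constrained optimization property can be checked numerically. In the sketch below (the matrix A is again an arbitrary example), f(x) = x⊤Ax evaluated at random unit vectors always lies between the smallest and largest eigenvalues, and evaluating f at an eigenvector recovers the corresponding eigenvalue:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, Q = np.linalg.eigh(A)  # eigenvalues in ascending order

# Evaluate f(x) = x^T A x at many random unit vectors.
rng = np.random.default_rng(0)
xs = rng.normal(size=(1000, 2))
xs /= np.linalg.norm(xs, axis=1, keepdims=True)
vals = np.einsum('ij,jk,ik->i', xs, A, xs)  # x_i^T A x_i for each row

# All values lie between the min and max eigenvalues (up to rounding).
print(vals.min() >= eigvals.min() - 1e-9)   # True
print(vals.max() <= eigvals.max() + 1e-9)   # True

# At an eigenvector, f equals the corresponding eigenvalue.
v = Q[:, -1]  # eigenvector for the largest eigenvalue
print(np.isclose(v @ A @ v, eigvals[-1]))   # True
```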
A matrix whose eigenvalues are all positive is called positive definite. A matrix whose
eigenvalues are all positive or zero-valued is called positive semidefinite. Likewise, if all
eigenvalues are negative, the matrix is negative definite, and if all eigenvalues are negative
or zero-valued, it is negative semidefinite. Positive semidefinite matrices are interesting
because they guarantee that ∀x, x⊤Ax ≥ 0. Positive definite matrices additionally
guarantee that x⊤Ax = 0 ⇒ x = 0.
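These definitions translate directly into a check on eigenvalue signs. The helper below is a hypothetical illustration (not a library function), classifying a symmetric matrix by the signs of its eigenvalues with a small tolerance for floating-point error:

```python
import numpy as np

def classify(A, tol=1e-10):
    """Classify a real symmetric matrix by the signs of its eigenvalues.
    Illustrative helper only; `tol` absorbs floating-point rounding."""
    w = np.linalg.eigvalsh(A)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w >= -tol):
        return "positive semidefinite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w <= tol):
        return "negative semidefinite"
    return "indefinite"

print(classify(np.array([[2.0, 0.0], [0.0, 1.0]])))   # positive definite
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]])))   # positive semidefinite
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))  # indefinite
```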
2.8 Singular Value Decomposition
The singular value decomposition is a general-purpose matrix factorization method
that decomposes an n × m matrix X into three distinct matrices:

    X = UΣW⊤,

where Σ is a rectangular diagonal matrix³, U is an n × n-dimensional matrix whose
columns are mutually orthonormal, and similarly W is an m × m-dimensional matrix
whose columns are mutually orthonormal. The elements along the diagonal of Σ, σ_i
for i ∈ {1, . . . , min{n, m}}, are referred to as the singular values, and the n columns
of U and the m columns of W are commonly referred to as the left-singular vectors and
right-singular vectors of X, respectively.
The singular value decomposition has many useful properties; for our purposes,
however, we will concentrate on its relation to the eigendecomposition. Specifically, the
left-singular vectors of X (the columns U_{:,i}, for i ∈ {1, . . . , n}) are the eigenvectors of
XX⊤. The right-singular vectors of X (the columns W_{:,j}, for j ∈ {1, . . . , m}) are the
eigenvectors of X⊤X. Finally, the non-zero singular values of X (σ_i for all i such that
σ_i ≠ 0) are the square roots of the non-zero eigenvalues of both XX⊤ and X⊤X.
³A rectangular n × m-dimensional diagonal matrix can be seen as consisting of two sub-matrices:
one square diagonal matrix of size c × c, where c = min{n, m}, and a block of zeros filling out the rest
of the rectangular matrix.
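The relation between the SVD and the eigendecomposition can be verified numerically. In this sketch, X is an arbitrary random 4 × 3 matrix; the squared non-zero singular values should match the non-zero eigenvalues of both X⊤X and XX⊤:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))  # arbitrary n x m example, n=4, m=3

# np.linalg.svd returns U, the singular values s (descending), and W^T.
U, s, Wt = np.linalg.svd(X)

# Eigenvalues of X^T X (m x m): all three match the squared singular values.
ev_small = np.linalg.eigvalsh(X.T @ X)       # ascending order
print(np.allclose(np.sort(s**2), ev_small))  # True

# Eigenvalues of X X^T (n x n): one extra eigenvalue is ~0 (rank 3);
# the remaining three match the squared singular values.
ev_big = np.linalg.eigvalsh(X @ X.T)
print(np.allclose(np.sort(s**2), ev_big[-3:]))  # True
```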