

Gram-Schmidt Orthogonalization

Recall from the end of §5.10 above that an orthonormal set of vectors is a set of unit-length vectors that are mutually orthogonal. In other words, an orthonormal vector set is just an orthogonal vector set in which each vector $\underline{s}_i$ has been normalized to unit length, $\underline{s}_i/\Vert\,\underline{s}_i\,\Vert$.



Theorem: Given a set of $N$ linearly independent vectors $\underline{s}_0,\ldots,\underline{s}_{N-1}$ from ${\bf C}^N$, we can construct an orthonormal set $\underline{\tilde{s}}_0,\ldots,\underline{\tilde{s}}_{N-1}$ whose members are linear combinations of the original vectors and which spans the same space.



Proof: We prove the theorem by constructing the desired orthonormal set $\{\underline{\tilde{s}}_k\}$ sequentially from the original set $\{\underline{s}_k\}$. This procedure is known as Gram-Schmidt orthogonalization.

First, note that $\underline{s}_k\ne \underline{0}$ for all $k$, since $\underline{0}$ is linearly dependent on every vector. Therefore, $\Vert\,\underline{s}_k\,\Vert \ne 0$.

The Gram-Schmidt orthogonalization procedure will construct an orthonormal basis from any set of $N$ linearly independent vectors. Obviously, by skipping the normalization step, we could also form simply an orthogonal basis. The key ingredient of this procedure is that each new basis vector is obtained by subtracting out the projection of the next linearly independent vector onto the vectors accepted so far into the set. We may say that each new linearly independent vector $\underline{s}_k$ is projected onto the subspace spanned by the vectors $\{\underline{\tilde{s}}_0,\ldots,\underline{\tilde{s}}_{k-1}\}$, and any nonzero projection in that subspace is subtracted out of $\underline{s}_k$ to make the new vector orthogonal to the entire subspace. In other words, we retain only that portion of each new vector $\underline{s}_k$ which ``points along'' a new dimension. The first direction is arbitrary and is determined by whatever vector we choose first ($\underline{s}_0$ here). The second vector is forced to be orthogonal to the first. The third is forced to be orthogonal to the first two (and thus to the 2D subspace spanned by them), and so on.
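For concreteness, one way to write out this recursion in the notation above is sketched below; the intermediate (unnormalized) vectors $\underline{y}_k$ are introduced here only for illustration, and $\left<\cdot,\cdot\right>$ denotes the inner product used in this chapter:

$\displaystyle \underline{\tilde{s}}_0 = \frac{\underline{s}_0}{\Vert\,\underline{s}_0\,\Vert}$

and, for $k=1,2,\ldots,N-1$,

$\displaystyle \underline{y}_k = \underline{s}_k - \sum_{j=0}^{k-1}\left<\underline{s}_k,\underline{\tilde{s}}_j\right>\underline{\tilde{s}}_j, \qquad \underline{\tilde{s}}_k = \frac{\underline{y}_k}{\Vert\,\underline{y}_k\,\Vert}.$

By linear independence, each $\underline{y}_k$ is nonzero, so the normalization is always well defined.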

This chapter can be considered an introduction to some important concepts of linear algebra. The student is invited to pursue further reading in any textbook on linear algebra, such as [46].

Matlab/Octave examples related to this chapter appear in Appendix I.
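As a flavor of what such an example might look like (a minimal sketch only, not the Appendix I listing; the function name gram_schmidt is chosen here for illustration), the procedure above can be coded as:

  function Q = gram_schmidt(S)
    % Gram-Schmidt orthogonalization (sketch): the columns of S are
    % assumed linearly independent; the columns of Q come out orthonormal.
    [N,K] = size(S);
    Q = zeros(N,K);
    for k = 1:K
      v = S(:,k);
      for j = 1:k-1
        % Subtract the projection of s_k onto each already-accepted basis vector.
        v = v - (Q(:,j)' * S(:,k)) * Q(:,j);
      end
      Q(:,k) = v / norm(v);   % normalize to unit length
    end
  end

For example, Q = gram_schmidt(randn(4,3)) returns a Q whose columns are (almost surely) orthonormal, so that Q'*Q equals the 3-by-3 identity to within rounding error.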


