
CHAPTER 2. LINEAR ALGEBRA
Determining whether Ax = b has a solution thus amounts to testing whether
b is in the span of the columns of A. This particular span is known as the column
space or the range of A.
In order for the system Ax = b to have a solution for all values of b ∈ R^m, we therefore require that the column space of A be all of R^m. If any point in R^m is excluded from the column space, that point is a potential value of b that has no solution. This implies immediately that A must have at least m columns, i.e., n ≥ m. Otherwise, the dimensionality of the column space would be less than m. For example, consider a 3 × 2 matrix. The target b is 3-D, but x is only 2-D, so modifying the value of x at best allows us to trace out a 2-D plane within R^3. The equation has a solution if and only if b lies on that plane.
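As a sketch of this test in NumPy (the matrix and vectors here are hypothetical examples, not from the text): b lies in the column space of A exactly when appending b as an extra column does not increase the rank.

```python
import numpy as np

# A hypothetical 3 x 2 matrix: its columns span (at most) a 2-D plane in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

def in_column_space(A, b):
    # b is in the span of A's columns iff adding b as a column
    # leaves the rank unchanged.
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

b_on_plane = A @ np.array([2.0, -1.0])   # on the plane by construction
b_off_plane = np.array([1.0, 0.0, 0.0])  # forces x1 = 1, x2 = 0, but then row 3 fails

print(in_column_space(A, b_on_plane))   # True
print(in_column_space(A, b_off_plane))  # False
```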
Having n ≥ m is only a necessary condition for every point to have a solution.
It is not a sufficient condition, because it is possible for some of the columns to
be redundant. Consider a 2 × 2 matrix where both of the columns are equal to
each other. This has the same column space as a 2 × 1 matrix containing only
one copy of the replicated column. In other words, the column space is still just
a line, and fails to encompass all of R^2, even though there are two columns.
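The rank of a matrix reveals this redundancy directly; a small NumPy check (with a made-up matrix) shows that duplicating a column does not enlarge the column space:

```python
import numpy as np

# A 2 x 2 matrix whose two columns are equal: the column space is only a line.
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])

# The rank is the dimensionality of the column space: 1, not 2.
print(np.linalg.matrix_rank(A))  # 1
```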
Formally, this kind of redundancy is known as linear dependence. A set of
vectors is linearly independent if no vector in the set is a linear combination of
the other vectors. If we add a vector to a set that is a linear combination of the
other vectors in the set, the new vector does not add any points to the set’s span.
This means that for the column space of the matrix to encompass all of R^m, the
matrix must have at least m linearly independent columns. This condition is both
necessary and sufficient for equation 2.1 to have a solution for every value of b.
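This necessary-and-sufficient condition can be phrased numerically as rank(A) = m. A minimal sketch (the helper name and test matrices are my own):

```python
import numpy as np

def solvable_for_every_b(A):
    # Ax = b has a solution for every b in R^m exactly when A has
    # m linearly independent columns, i.e. rank(A) == m.
    m = A.shape[0]
    return np.linalg.matrix_rank(A) == m

print(solvable_for_every_b(np.array([[1.0, 0.0], [0.0, 1.0]])))  # True
print(solvable_for_every_b(np.array([[1.0, 1.0], [2.0, 2.0]])))  # False: redundant columns
print(solvable_for_every_b(np.array([[1.0, 0.0, 1.0],
                                     [0.0, 1.0, 1.0]])))         # True: n > m is allowed
```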
In order for the matrix to have an inverse, we additionally need to ensure that
equation 2.1 has at most one solution for each value of b. To do so, we need to
ensure that the matrix has at most m columns. Otherwise there is more than one
way of parametrizing each solution.
Together, this means that the matrix must be square, that is, we require that
m = n and that all of the columns be linearly independent. A square matrix
with linearly dependent columns is known as singular.
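One standard numerical symptom of singularity, not discussed above but easy to check: a square matrix with linearly dependent columns has determinant zero. A small illustration with a made-up matrix:

```python
import numpy as np

# The second column is twice the first, so the matrix is singular.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# det(A) = 1*4 - 2*2 = 0 for this singular matrix.
print(np.isclose(np.linalg.det(A), 0.0))  # True
```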
If A is not square, or is square but singular, it may still be possible to solve the
equation. However, we cannot use the method of matrix inversion to find the
solution.
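For instance, a least-squares solver still recovers a solution for a singular square matrix whenever b happens to lie in the column space, even though inversion fails. A sketch (matrix and b chosen by hand for illustration):

```python
import numpy as np

# A singular square matrix: np.linalg.inv(A) would raise LinAlgError here.
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([3.0, 6.0])  # b = 3 * (first column), so a solution exists

# lstsq finds the minimum-norm least-squares solution instead of inverting A.
x, residuals, rank, sv = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(A @ x, b))  # True
```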
So far we have discussed matrix inverses as being multiplied on the left. It is
also possible to define an inverse that is multiplied on the right:
AA^{-1} = I.
For square matrices, the left inverse and right inverse are equal.
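This equality is easy to verify numerically for any invertible square matrix; here is a quick check with an arbitrary example matrix:

```python
import numpy as np

# For an invertible square matrix, the same A_inv serves as both
# a left inverse and a right inverse.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

print(np.allclose(A_inv @ A, np.eye(2)))  # True: left inverse
print(np.allclose(A @ A_inv, np.eye(2)))  # True: right inverse
```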