
CHAPTER 2. LINEAR ALGEBRA
in each of these directions, i.e., x_i specifies how far to move in the direction of
column i:

    Ax = Σ_i x_i A_{:,i}.
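As a quick numerical illustration (a NumPy sketch, not part of the original text), the matrix-vector product Ax can be computed either directly or as this column-wise linear combination, and the two agree:

```python
import numpy as np

# A has columns A[:, 0] and A[:, 1]; x weights each column.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([0.5, -1.0])

# Direct matrix-vector product.
direct = A @ x

# The same result, built as sum_i x_i * A[:, i].
combo = sum(x[i] * A[:, i] for i in range(A.shape[1]))

assert np.allclose(direct, combo)
```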
In general, this kind of operation is called a linear combination. Formally, a linear
combination of some set of vectors {v^(1), . . . , v^(n)} is given by multiplying each
vector v^(i) by a corresponding scalar coefficient and adding the results:

    Σ_i c_i v^(i).
The span of a set of vectors is the set of all points obtainable by linear combination
of the original vectors.
Determining whether Ax = b has a solution thus amounts to testing whether
b is in the span of the columns of A. This particular span is known as the column
space or the range of A.
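One way to carry out this test numerically (a sketch; `in_column_space` is a hypothetical helper name, not from the text) is to solve the least-squares problem min_x ‖Ax − b‖ and check whether the residual is zero, since the residual vanishes exactly when b lies in the column space:

```python
import numpy as np

def in_column_space(A, b, tol=1e-10):
    # b is in the column space of A iff the least-squares
    # residual of min_x ||Ax - b|| is (numerically) zero.
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.linalg.norm(A @ x - b) < tol

# Columns span the x-y plane inside R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

print(in_column_space(A, np.array([1.0, 2.0, 0.0])))  # True: b lies in the plane
print(in_column_space(A, np.array([0.0, 0.0, 1.0])))  # False: b sticks out of it
```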
In order for the system Ax = b to have a solution for all values of b ∈ R^m,
we therefore require that the column space of A be all of R^m. If any point in R^m
is excluded from the column space, that point is a potential value of b that has
no solution. This implies immediately that A must have at least m columns, i.e.,
n ≥ m. Otherwise, the dimensionality of the column space would be less than m.
For example, consider a 3 × 2 matrix. The target b is 3-D, but x is only 2-D, so
modifying the value of x at best allows us to trace out a 2-D plane within R^3.
The equation has a solution if and only if b lies on that plane.
Having n ≥ m is only a necessary condition for every point to have a solution.
It is not a sufficient condition, because it is possible for some of the columns to
be redundant. Consider a 2 × 2 matrix where both of the columns are equal to
each other. This has the same column space as a 2 × 1 matrix containing only
one copy of the replicated column. In other words, the column space is still just
a line, and fails to encompass all of R^2, even though there are two columns.
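This redundancy can be checked numerically via the matrix rank (a NumPy sketch added for illustration): a matrix with duplicated columns has the same rank as the single replicated column.

```python
import numpy as np

# Two equal columns: the column space is just a line in R^2.
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
print(np.linalg.matrix_rank(A))  # 1, not 2

# Same span as a 2 x 1 matrix with one copy of that column.
B = np.array([[1.0],
              [2.0]])
print(np.linalg.matrix_rank(B))  # 1
```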
Formally, this kind of redundancy is known as linear dependence. A set of
vectors is linearly independent if no vector in the set is a linear combination of
the other vectors. If we add a vector to a set that is a linear combination of the
other vectors in the set, the new vector does not add any points to the set’s span.
This means that for the column space of the matrix to encompass all of R^m, the
matrix must have at least m linearly independent columns. This condition is both
necessary and sufficient for equation 2.1 to have a solution for every value of b.
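The condition can be tested numerically (a sketch; `solvable_for_all_b` is a hypothetical helper name, not from the text), since a matrix has m linearly independent columns exactly when its rank equals m:

```python
import numpy as np

def solvable_for_all_b(A):
    # Ax = b has a solution for every b in R^m exactly when A has
    # m linearly independent columns, i.e. rank(A) == m.
    m = A.shape[0]
    return np.linalg.matrix_rank(A) == m

# Two independent columns spanning R^2: solvable for every b.
print(solvable_for_all_b(np.array([[1.0, 0.0], [0.0, 1.0]])))  # True
# Duplicated column: rank 1 < 2, so some b have no solution.
print(solvable_for_all_b(np.array([[1.0, 1.0], [2.0, 2.0]])))  # False
```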
In order for the matrix to have an inverse, we additionally need to ensure that
equation 2.1 has at most one solution for each value of b. To do so, we need to
ensure that the matrix has at most m columns. Otherwise there is more than one
way of parametrizing each solution.
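Putting the two requirements together, an invertible matrix must be square (m = n) with m linearly independent columns, and then Ax = b has exactly one solution for every b. A small NumPy check (added for illustration):

```python
import numpy as np

# Square matrix with two independent columns: invertible,
# so Ax = b has exactly one solution x for every b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)          # the unique solution, x = [0.8, 1.4]
assert np.allclose(A @ x, b)       # it solves the system
assert np.allclose(x, np.linalg.inv(A) @ b)  # and equals A^{-1} b
```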