4.4 Subspaces, Spans, And Bases
Definition 4.4.1 Let {x1,···,xp} be vectors in Fn. A linear combination is any expression of the form

c1x1 + ··· + cpxp

where the ci are scalars. The set of all linear combinations of these vectors is called span(x1,···,xp). If V ⊆ Fn, then V is called a subspace if whenever α,β are scalars and u and v are vectors of V, it follows αu + βv ∈ V. That is, it is “closed under the algebraic operations of vector addition and scalar multiplication”. A linear combination of vectors is said to be trivial if all the scalars in the linear combination equal zero. A set of vectors is said to be linearly independent if the only linear combination of these vectors which equals the zero vector is the trivial linear combination. Thus {x1,···,xp} is called linearly independent if, whenever

c1x1 + ··· + cpxp = 0,

it follows that all the scalars ck equal zero. A set of vectors, {x1,···,xp}, is called linearly dependent if it is not linearly independent. Thus the set of vectors is linearly dependent if there exist scalars ci, i = 1,···,p, not all zero, such that

c1x1 + ··· + cpxp = 0.
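The definition can be tested numerically, though this is not part of the text: a set {x1,···,xp} is linearly independent exactly when the matrix with the xi as columns has rank p. A minimal sketch using NumPy (the function name is my own):

```python
# Illustration (not from the text): testing linear independence numerically.
# {x1,...,xp} is linearly independent exactly when the n x p matrix whose
# columns are the xi has rank p (only the trivial combination gives 0).
import numpy as np

def is_linearly_independent(vectors):
    """vectors: list of 1-D numpy arrays, all of the same length n."""
    A = np.column_stack(vectors)          # n x p matrix with the xi as columns
    return np.linalg.matrix_rank(A) == len(vectors)

x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([1.0, 1.0, 0.0])
x3 = x1 + 2 * x2                          # a nontrivial linear combination

print(is_linearly_independent([x1, x2]))      # True
print(is_linearly_independent([x1, x2, x3]))  # False: x3 = x1 + 2 x2
```

Note that `matrix_rank` uses a floating-point tolerance, so this is a numerical stand-in for the exact algebraic definition.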
Lemma 4.4.2 A set of vectors {x1,···,xp} is linearly independent if and only if none of the vectors can be obtained as a linear combination of the others.

Proof: Suppose first that {x1,···,xp} is linearly independent. If xk = ∑j≠k cjxj, then

0 = (−1)xk + ∑j≠k cjxj,

a nontrivial linear combination, contrary to assumption. This shows that if the set is linearly independent, then none of the vectors is a linear combination of the others.

Now suppose no vector is a linear combination of the others. Is {x1,···,xp} linearly independent? If it is not, there exist scalars ci, not all zero, such that

c1x1 + ··· + cpxp = 0.

Say ck ≠ 0. Then you can solve for xk as

xk = ∑j≠k (−cj/ck) xj,

contrary to assumption. This proves the lemma.
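The solving step in the second half of the proof can be checked with concrete numbers. This is only an illustration of the algebra, with vectors chosen by me:

```python
# Sketch of the solving step in Lemma 4.4.2 (illustrative vectors, not from
# the text): if c1 x1 + ... + cp xp = 0 with ck != 0, then
# xk = sum over j != k of (-cj/ck) xj.
import numpy as np

x1 = np.array([1.0, 2.0])
x2 = np.array([3.0, 1.0])
c1, c2 = 2.0, -1.0
x3 = -(c1 * x1 + c2 * x2)        # so c1 x1 + c2 x2 + 1*x3 = 0, a dependence

# Solve the dependence relation for x1 (legitimate because c1 = 2 != 0):
x1_recovered = -(c2 * x2 + 1.0 * x3) / c1
print(np.allclose(x1, x1_recovered))  # True
```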
The following is called the exchange theorem.
Theorem 4.4.3 (Exchange Theorem) Let {x1,···,xr} be a linearly independent set of vectors such that each xi is in span(y1,···,ys). Then r ≤ s.
Proof: Define span(y1,···,ys) ≡ V. Since x1 ∈ V, it follows there exist scalars c1,···,cs such that

x1 = c1y1 + ··· + csys.    (4.4.16)

Not all of these scalars can equal zero because if this were the case, it would follow that x1 = 0 and so {x1,···,xr} would not be linearly independent. Indeed, if x1 = 0, then 1x1 + 0x2 + ··· + 0xr = 0, and so there would exist a nontrivial linear combination of the vectors {x1,···,xr} which equals zero.

Say ck ≠ 0. Then solve (4.4.16) for yk and obtain

yk ∈ span(x1, y1,···,yk−1, yk+1,···,ys).

Define {z1,···,zs−1} ≡ {y1,···,yk−1, yk+1,···,ys}. Then span(x1, z1,···,zs−1) = V because if v ∈ V, there exist constants c1,···,cs such that

v = c1z1 + ··· + cs−1zs−1 + csyk.

Now replace the yk in the above with a linear combination of the vectors x1, z1,···,zs−1 to obtain v ∈ span(x1, z1,···,zs−1). The vector yk, in the list {y1,···,ys}, has now been replaced with the vector x1 and the resulting modified list of vectors has the same span as the original list of vectors, {y1,···,ys}.
Now suppose that r > s and that

span(x1,···,xl, z1,···,zp) = V,

where l + p = s and the vectors z1,···,zp are each taken from the set {y1,···,ys}. This has now been done for l = 1 above. Then since r > s, it follows that l ≤ s < r and so l + 1 ≤ r. Therefore, xl+1 is a vector not in the list x1,···,xl, and since span(x1,···,xl, z1,···,zp) = V, there exist scalars ci and dj such that

xl+1 = c1x1 + ··· + clxl + d1z1 + ··· + dpzp.

Now not all the dj can equal zero because if this were so, it would follow that {x1,···,xl+1} would be a linearly dependent set because one of the vectors would equal a linear combination of the others. Therefore, this equation can be solved for one of the zi, say zk, in terms of xl+1 and the other zi, and just as in the above argument, that zk can be replaced with xl+1. Continue this way, eventually obtaining

span(x1,···,xs) = V.

But then xr ∈ span(x1,···,xs), contrary to the assumption that {x1,···,xr} is linearly independent. Therefore, r ≤ s. This proves the theorem.
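The conclusion can be sanity-checked numerically. If each xi lies in span(y1,···,ys), then the matrix X of the xi factors as X = YC for a coefficient matrix C, so rank(X) ≤ rank(Y) ≤ s; linear independence of the xi then forces r ≤ s. A sketch with randomly chosen data (the matrices are my own, not from the text):

```python
# Numerical sanity check of the exchange theorem (an illustration, not the
# proof): X = Y C implies rank(X) <= rank(Y) <= s, so at most s of the xi
# can be linearly independent.
import numpy as np

rng = np.random.default_rng(0)
Y = rng.standard_normal((5, 3))        # s = 3 spanning vectors in F^5
C = rng.standard_normal((3, 3))        # coefficients expressing the xi in the yj
X = Y @ C                              # columns xi, each in span of Y's columns

r = np.linalg.matrix_rank(X)           # number of independent xi
s = Y.shape[1]
print(r <= s)                          # True: r can never exceed s
```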
Definition 4.4.4 A finite set of vectors, {x1,···,xr}, is a basis for Fn if span(x1,···,xr) = Fn and {x1,···,xr} is linearly independent.
Corollary 4.4.5 Let {x1,···,xr} and {y1,···,ys} be two bases of Fn. Then r = s = n.

Proof: From the exchange theorem, r ≤ s and s ≤ r. Now note that the vectors ei, which have a 1 in the ith slot and zeros elsewhere, for i = 1,2,···,n, are a basis for Fn. This proves the corollary.
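The standard basis used at the end of the proof is easy to exhibit. A small illustration (my own, using the numerical rank test):

```python
# The standard basis e1,...,en of F^n: ei has a 1 in slot i, zeros elsewhere.
# The identity matrix has these vectors as its columns.
import numpy as np

n = 4
E = np.eye(n)                          # columns are e1,...,en
print(np.linalg.matrix_rank(E) == n)   # True: n independent vectors in F^n
```

Since every v in Fn equals v1e1 + ··· + vnen, the columns of E both span Fn and are independent, so any basis of Fn has exactly n vectors.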
Lemma 4.4.6 Let {v1,···,vr} be a set of vectors. Then V ≡ span(v1,···,vr) is a subspace.

Proof: Suppose α,β are two scalars and let ∑k=1^r ckvk and ∑k=1^r dkvk be two elements of V. What about

α∑k=1^r ckvk + β∑k=1^r dkvk?

Is it also in V?

α∑k=1^r ckvk + β∑k=1^r dkvk = ∑k=1^r (αck + βdk)vk ∈ V,

so the answer is yes. This proves the lemma.
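The closure identity in the proof can be verified on concrete data; the vectors and coefficients below are my own choices, not from the text:

```python
# Closure check for Lemma 4.4.6 (illustration): a combination of two elements
# of span(v1,...,vr) is again a combination of the vi, with coefficients
# alpha*c_k + beta*d_k.
import numpy as np

V = np.column_stack([np.array([1.0, 0.0, 2.0]),
                     np.array([0.0, 1.0, 1.0])])   # columns v1, v2
c = np.array([1.0, -2.0])
d = np.array([3.0, 0.5])
alpha, beta = 2.0, -1.0

lhs = alpha * (V @ c) + beta * (V @ d)   # combine two elements of the span
rhs = V @ (alpha * c + beta * d)         # same element, coefficients combined
print(np.allclose(lhs, rhs))             # True
```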
Definition 4.4.7 A finite set of vectors, {x1,···,xr}, is a basis for a subspace V of Fn if span(x1,···,xr) = V and {x1,···,xr} is linearly independent.
Corollary 4.4.8 Let {x1,···,xr} and {y1,···,ys} be two bases for V. Then r = s.

Proof: From the exchange theorem, r ≤ s and s ≤ r. This proves the corollary.
Definition 4.4.9 Let V be a subspace of Fn. Then dim(V), read as the dimension of V, is the number of vectors in a basis. By Corollary 4.4.8 this number does not depend on the choice of basis.
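Numerically, the dimension of a span is the rank of the matrix whose columns are the spanning vectors. A sketch with vectors of my own choosing:

```python
# Computing dim(span(v1,...,vr)) numerically (illustration, not from the
# text): the dimension equals the rank of the matrix with the vi as columns.
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 - v2                           # redundant: already in span(v1, v2)

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))        # 2 = dim span(v1, v2, v3)
```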
Of course you should wonder right now whether an arbitrary subspace even has a basis. In fact it does, and this is in the next theorem. First, here is an interesting lemma.
Lemma 4.4.10 Suppose v ∉ span(u1,···,uk) and {u1,···,uk} is linearly independent. Then {u1,···,uk, v} is also linearly independent.

Proof: Suppose ∑i=1^k ciui + dv = 0. It is required to verify that each ci = 0 and that d = 0. But if d ≠ 0, then you can solve for v as a linear combination of the vectors u1,···,uk,

v = −∑i=1^k (ci/d) ui,

contrary to assumption. Therefore, d = 0. But then ∑i=1^k ciui = 0 and the linear independence of {u1,···,uk} implies each ci = 0 also. This proves the lemma.
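The lemma predicts that adjoining a vector from outside the span preserves independence, which the rank test makes visible. The vectors below are my own example:

```python
# Numerical check of Lemma 4.4.10 (illustration): if the ui are independent
# and v is not in span(u1,...,uk), adjoining v keeps the set independent,
# i.e. the rank grows by one.
import numpy as np

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
v  = np.array([0.0, 0.0, 1.0])         # not in span(u1, u2)

U  = np.column_stack([u1, u2])
Uv = np.column_stack([u1, u2, v])
print(np.linalg.matrix_rank(U))        # 2: the ui are independent
print(np.linalg.matrix_rank(Uv))       # 3: still independent with v adjoined
```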
Theorem 4.4.11 Let V be a nonzero subspace of Fn. Then V has a basis.
Proof: Let v1 ∈ V where v1 ≠ 0. If span{v1} = V, then {v1} is a basis for V. Otherwise, there exists v2 ∈ V which is not in span{v1}. By Lemma 4.4.10, {v1, v2} is a linearly independent set of vectors. If span{v1, v2} = V, then {v1, v2} is a basis for V. If not, there exists v3 ∈ V which is not in span{v1, v2}, and {v1, v2, v3} is a larger linearly independent set of vectors. Continuing this way, the process must stop before n + 1 steps because if not, it would be possible to obtain n + 1 linearly independent vectors, contrary to the exchange theorem. This proves the theorem.
In words the following corollary states that any linearly independent set of vectors can
be enlarged to form a basis.
Corollary 4.4.12 Let V be a subspace of Fn and let {v1,···,vr} be a linearly independent set of vectors in V. Then either it is a basis for V or there exist vectors vr+1,···,vs such that {v1,···,vr, vr+1,···,vs} is a basis for V.
Proof: This follows immediately from the proof of Theorem 4.4.11. You do exactly the same argument except you start with {v1,···,vr} rather than {v1}. This proves the corollary.
It is also true that any spanning set of vectors can be restricted to obtain a basis.

Theorem 4.4.13 Let V be a subspace of Fn and suppose span(u1,···,up) = V where the ui are nonzero vectors. Then there exist vectors {v1,···,vr} such that {v1,···,vr} ⊆ {u1,···,up} and {v1,···,vr} is a basis for V.
Proof: Let r be the smallest positive integer with the property that for some set {v1,···,vr} ⊆ {u1,···,up},

span(v1,···,vr) = V.

Then r ≤ p, and it must be the case that {v1,···,vr} is linearly independent because if it were not so, one of the vectors, say vk, would be a linear combination of the others. But then you could delete this vector from {v1,···,vr} and the resulting list of r − 1 vectors would still span V, contrary to the definition of r. This proves the theorem.
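The process in this proof (equivalently, the extension process of Theorem 4.4.11) can be sketched as a greedy selection: walk through the spanning vectors and keep each one that is not already in the span of those kept so far. The helper below is my own illustration, using the numerical rank test; it is not an algorithm from the text:

```python
# Sketch of extracting a basis from a spanning set (hypothetical helper,
# illustrating Theorem 4.4.13): keep a vector only if it is independent of
# the vectors kept so far, so the kept vectors stay independent and have
# the same span as the full list.
import numpy as np

def basis_from_spanning_set(candidates):
    """Greedily select a linearly independent subset with the same span."""
    basis = []
    for v in candidates:
        trial = basis + [v]
        if np.linalg.matrix_rank(np.column_stack(trial)) == len(trial):
            basis = trial              # v was not in span(basis): keep it
    return basis

us = [np.array([1.0, 1.0, 0.0]),
      np.array([2.0, 2.0, 0.0]),      # in the span of the first: dropped
      np.array([0.0, 0.0, 1.0])]
B = basis_from_spanning_set(us)
print(len(B))                          # 2
```

Because floating-point rank uses a tolerance, this is a numerical approximation of the exact argument in the proof, which works over any field F.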