Proof: Suppose first that $\{x_1, \cdots, x_n\}$ is linearly independent. If
$$x_k = \sum_{j \neq k} c_j x_j,$$
then $0 = 1x_k + \sum_{j \neq k} (-c_j) x_j$, a nontrivial linear combination, contrary to assumption. This shows that if the set is linearly independent, then none of the vectors is a linear combination of the others.
Now suppose no vector is a linear combination of the others. Is $\{x_1, \cdots, x_n\}$ linearly independent? If it is not, there exist scalars $c_i$, not all zero, such that $\sum_{i=1}^{n} c_i x_i = 0$. Say $c_k \neq 0$. Then you can solve for $x_k$ as $x_k = \sum_{j \neq k} (-c_j/c_k) x_j$, contrary to assumption. This proves the lemma. $\blacksquare$
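For example, in $\mathbb{R}^2$ the set $\{(1,0), (1,1), (2,1)\}$ is not linearly independent because $(2,1) = (1,0) + (1,1)$, a linear combination of the other two vectors; equivalently, $1(1,0) + 1(1,1) + (-1)(2,1) = 0$ is a nontrivial linear combination equal to zero.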
The following is called the exchange theorem.
Theorem 4.2.3 If
$$\operatorname{span}(u_1, \cdots, u_r) \subseteq \operatorname{span}(v_1, \cdots, v_s) \equiv V$$
and $\{u_1, \cdots, u_r\}$ are linearly independent, then $r \leq s$.
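For example, since $\mathbb{R}^2 = \operatorname{span}((1,0),(0,1))$, the theorem says that no linearly independent list of vectors in $\mathbb{R}^2$ can have more than two vectors: here $s = 2$, so $r \leq 2$.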
Proof: Suppose r > s. Let $F_p$ denote the first $p$ vectors in $\{u_1, \cdots, u_r\}$, and let $F_0$ denote the empty set. Let $E_p$ denote a finite list of vectors of $\{v_1, \cdots, v_s\}$ and let $|E_p|$ denote the number of vectors in the list. Note that, by assumption, $\operatorname{span}(F_p, (v_1, \cdots, v_s)) = V$ for each $p$, so for $0 \leq p \leq s$ one may let $E_p$ have the property $\operatorname{span}(F_p, E_p) = V$ with $|E_p|$ as small as possible. If $|E_p| = 0$, then $\operatorname{span}(F_p) = V$, which would imply that, since $r > s \geq p$, $u_r \in \operatorname{span}(F_p)$, contradicting the linear independence of $\{u_1, \cdots, u_r\}$. Assume then that $|E_p| > 0$ for each $p \leq s$. Then $u_{p+1} \in \operatorname{span}(F_p, E_p)$ and so there are constants $c_1, \cdots, c_p$ and $d_1, \cdots, d_m$ such that
$$u_{p+1} = \sum_{i=1}^{p} c_i u_i + \sum_{j=1}^{m} d_j z_j$$
where $E_p = \{z_1, \cdots, z_m\} \subseteq \{v_1, \cdots, v_s\}$. Not all the $d_j$ can equal zero, because this would violate the linear independence of $\{u_1, \cdots, u_r\}$. Therefore, you can solve for one of the $z_k$ as a linear combination of $\{u_1, \cdots, u_{p+1}\}$ and the other $z_j$. Thus you can change $F_p$ to $F_{p+1}$ and include one fewer vector in $E_{p+1}$, obtaining $\operatorname{span}(F_{p+1}, E_{p+1}) = V$ with $|E_{p+1}| < |E_p|$. Since $|E_0| \leq s$ and each step decreases $|E_p|$ by at least one, it follows that $|E_s| = 0$
after all, and so a contradiction results.

Alternate proof: Recall from linear algebra that if $A$ is an $m \times n$ matrix with $m < n$, so that there are more columns than rows, then there exists a nonzero solution $x$ to the equation $Ax = 0$; this is because there must be free variables. Now by assumption, $u_j = \sum_{i=1}^{s} a_{ij} v_i$. If $s < r$, then the matrix $(a_{ij})$ has more columns than rows and so there exists a nonzero vector $x \in \mathbb{F}^r$ such that $\sum_{j=1}^{r} a_{ij} x_j = 0$ for each $i$. Then consider the following.
$$\sum_{j=1}^{r} x_j u_j = \sum_{j=1}^{r} x_j \sum_{i=1}^{s} a_{ij} v_i = \sum_{i} \Big( \sum_{j} a_{ij} x_j \Big) v_i = \sum_{i} 0 \, v_i = 0$$
Since not all $x_j = 0$, this contradicts the independence of $\{u_1, \cdots, u_r\}$. $\blacksquare$
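To see the alternate argument in a small case, let $v_1 = (1,1)$ in $\mathbb{R}^2$ and let $u_1 = 2v_1$, $u_2 = 3v_1$, so $r = 2 > 1 = s$ and $(a_{ij}) = \begin{pmatrix} 2 & 3 \end{pmatrix}$, a $1 \times 2$ matrix. A nonzero solution of $2x_1 + 3x_2 = 0$ is $x = (3,-2)$, and indeed $3u_1 - 2u_2 = (6 - 6)v_1 = 0$, a nontrivial linear combination showing that $\{u_1, u_2\}$ cannot be linearly independent.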
Definition 4.2.4 A finite set of vectors $\{x_1, \cdots, x_r\}$ is a basis for a vector space $V$ if
$$\operatorname{span}(x_1, \cdots, x_r) = V$$
and $\{x_1, \cdots, x_r\}$ is linearly independent. Thus if $v \in V$, there exist unique scalars $v_1, \cdots, v_r$ such that $v = \sum_{i=1}^{r} v_i x_i$; the scalars exist because the set spans $V$, and they are unique because if $\sum_{i=1}^{r} v_i x_i = \sum_{i=1}^{r} w_i x_i$, then $\sum_{i=1}^{r} (v_i - w_i) x_i = 0$ and linear independence forces $v_i = w_i$ for each $i$. These scalars are called the components of $v$ with respect to the basis $\{x_1, \cdots, x_r\}$, and $\{x_1, \cdots, x_r\}$ is said to "span" $V$.
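For example, the standard basis of $\mathbb{F}^3$ is $\{e_1, e_2, e_3\}$ where $e_1 = (1,0,0)$, $e_2 = (0,1,0)$, and $e_3 = (0,0,1)$. If $v = (4, -1, 7)$, then
$$v = 4e_1 + (-1)e_2 + 7e_3,$$
so the components of $v$ with respect to this basis are $4, -1, 7$.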