180 CHAPTER 8. NORMED LINEAR SPACES
Here is why. For $p=0$, it is obvious because there are $s$ vectors from $\{v_1,\cdots,v_s\}$ which span $V$, namely those vectors. Of course there might be a smaller list which does so, and so $\left|E_0\right|\leq s$. Suppose the claim is true for some $p<s$. Then
\[
u_{p+1}\in \operatorname{span}\left(F_p,E_p\right)
\]
and so there are constants, $c_1,\cdots,c_p$ and $d_1,\cdots,d_m$ where $m\leq s-p$ such that
\[
u_{p+1}=\sum_{i=1}^{p}c_iu_i+\sum_{j=1}^{m}d_jz_j
\]
for $\{z_1,\cdots,z_m\}\subseteq\{v_1,\cdots,v_s\}$.
Then not all the $d_j$ can equal zero because this would violate the linear independence of the $\{u_1,\cdots,u_r\}$. Therefore, you can solve for one of the $z_k$ as a linear combination of $u_1,\cdots,u_{p+1}$ and the other $z_j$. Thus you can change $F_p$ to $F_{p+1}$ and include one fewer vector in $E_p$. Thus $\left|E_{p+1}\right|\leq m-1\leq s-p-1$. This proves the claim.
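To make the exchange step concrete, here is a small illustrative case (the particular vectors are chosen for this example and are not from the text): one step of the exchange in $\mathbb{F}^2$.

```latex
% One exchange step in F^2 (illustrative example).
% Spanning set: v_1=(1,0),\ v_2=(0,1); independent vector: u_1=(1,1).
% Write u_1 in terms of the v_i:
u_1 = 1\cdot v_1 + 1\cdot v_2, \qquad d_1 = d_2 = 1 \neq 0.
% Since d_2 \neq 0, solve for v_2:
v_2 = u_1 - v_1.
% Hence \operatorname{span}(u_1, v_1) contains both v_1 and v_2, so
\operatorname{span}(u_1, v_1) = \operatorname{span}(v_1, v_2) = \mathbb{F}^2,
% and E_1 = \{v_1\} satisfies |E_1| = 1 = s - 1.
```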
Therefore, $E_s$ is empty and $\operatorname{span}\left(u_1,\cdots,u_s\right)=V$. However, this gives a contradiction because it would require
\[
u_{s+1}\in \operatorname{span}\left(u_1,\cdots,u_s\right)
\]
which violates the linear independence of these vectors.

Alternate proof: Recall from linear algebra that if you have $A$ an $m\times n$ matrix where $m<n$ so there are more columns than rows, then there exists a nonzero solution $x$ to the equation $Ax=0$. Recall why this was. You must have free variables. Then by assumption, you have
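As a quick reminder of the free variable argument, consider the smallest possible case, one equation in two unknowns (an illustrative example, not from the text):

```latex
% A 1x2 system: more columns (n=2) than rows (m=1).
% Take A = (1 \;\; 2), so Ax = 0 reads
x_1 + 2x_2 = 0.
% x_2 is a free variable; choosing x_2 = 1 gives the nonzero solution
x = (-2,\, 1)^{T}, \qquad Ax = -2 + 2 = 0.
```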
\[
u_j=\sum_{i=1}^{s}a_{ij}v_i
\]
If $s<r$, then the matrix $\left(a_{ij}\right)$ has more columns than rows and so there exists a nonzero vector $x\in\mathbb{F}^r$ such that $\sum_{j=1}^{r}a_{ij}x_j=0$ for each $i$. Then consider the following.
\[
\sum_{j=1}^{r}x_ju_j=\sum_{j=1}^{r}x_j\sum_{i=1}^{s}a_{ij}v_i=\sum_i\sum_j a_{ij}x_jv_i=\sum_i 0v_i=0
\]
and since not all $x_j=0$, this contradicts the linear independence of the vectors $\{u_1,\cdots,u_r\}$.
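The alternate proof can be traced through explicitly in the smallest case, $s=1$ and $r=2$ (an illustrative example, not from the text):

```latex
% Smallest case: s = 1, r = 2. Suppose u_1, u_2 \in \operatorname{span}(v_1), say
u_1 = a_{11} v_1, \qquad u_2 = a_{12} v_1.
% The 1x2 matrix (a_{11}\;\; a_{12}) has a nonzero null vector, e.g.
x = (a_{12},\, -a_{11})^{T}
% (nonzero if some a_{1j} \neq 0; if both vanish, u_1 = u_2 = 0
%  and dependence is immediate). Then
x_1 u_1 + x_2 u_2 = \left(a_{12}a_{11} - a_{11}a_{12}\right) v_1 = 0,
% a nontrivial dependence, so u_1, u_2 cannot be linearly independent.
```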
Definition 8.2.4 A finite set of vectors, $\{x_1,\cdots,x_r\}$ is a basis for a vector space $V$ if
\[
\operatorname{span}\left(x_1,\cdots,x_r\right)=V
\]
and $\{x_1,\cdots,x_r\}$ is linearly independent. Thus if $v\in V$ there exist unique scalars, $v_1,\cdots,v_r$ such that $v=\sum_{i=1}^{r}v_ix_i$. These scalars are called the components of $v$ with respect to the basis $\{x_1,\cdots,x_r\}$.
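To illustrate the definition, here is a computation of components with respect to a non-standard basis of $\mathbb{F}^2$ (the basis and vector are chosen for this example and are not from the text):

```latex
% Basis: x_1 = (1,1),\ x_2 = (1,-1); vector: v = (3,5).
% Solve v = v_1 x_1 + v_2 x_2 componentwise:
v_1 + v_2 = 3, \qquad v_1 - v_2 = 5
\;\Rightarrow\; v_1 = 4,\quad v_2 = -1.
% Check: 4\,(1,1) - 1\,(1,-1) = (3,5).
% The components of v with respect to \{x_1, x_2\} are (4, -1),
% while its components with respect to the standard basis are (3, 5).
```

Note that the two component lists differ even though they describe the same vector; components always depend on the choice of basis.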