8.5. LINEAR INDEPENDENCE AND BASES 167

Now here is another proof. Suppose k < m. Then since {u1, · · · ,uk} is a basis for V, each vi is a linear combination of the vectors of {u1, · · · ,uk}. Consider the matrix

\[
\begin{pmatrix} u_1 & \cdots & u_k & v_1 & \cdots & v_m \end{pmatrix}
\]

in which each of the ui is a pivot column because the vectors u1, · · · ,uk are linearly independent. Therefore, the row reduced echelon form of this matrix is

\[
\begin{pmatrix} e_1 & \cdots & e_k & w_1 & \cdots & w_m \end{pmatrix} \tag{8.3}
\]

where each wj has zeroes below the kth row. This is because of Lemma 8.2.5, which implies each wi is a linear combination of e1, · · · ,ek. Discarding the bottom n − k rows of zeroes in the above yields the matrix

\[
\begin{pmatrix} e_1' & \cdots & e_k' & w_1' & \cdots & w_m' \end{pmatrix}
\]

in which all vectors are in F^k. Since m > k, it follows from Corollary 8.5.5 that the vectors {w′1, · · · ,w′m} are dependent. Therefore, some w′j is a linear combination of the other w′i. Therefore, wj is a linear combination of the other wi in 8.3. By Lemma 8.2.5 again, the same linear relationship exists between the {v1, · · · ,vm}, showing that {v1, · · · ,vm} is not linearly independent and contradicting the assumption that {v1, · · · ,vm} is a basis. It follows m ≤ k. Similarly, k ≤ m. ■
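The row-reduction step can be checked numerically. The following Python sketch is my own illustration, not part of the text: the `rref` helper and the specific vectors are chosen for the example. Two independent columns u1, u2 in R^3 are placed first, followed by two vectors v1, v2 from their span; after row reduction, the leading columns become e1, e2 and the trailing "w" columns have zeroes below the kth (here 2nd) row, as the proof asserts.

```python
from fractions import Fraction

def rref(rows):
    """Reduced row echelon form via Gauss-Jordan elimination, in exact arithmetic."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    r = 0
    for c in range(ncols):
        if r == nrows:
            break
        # Find a nonzero pivot in column c at or below row r.
        i = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if i is None:
            continue  # column c has no pivot
        m[i], m[r] = m[r], m[i]
        piv = m[r][c]
        m[r] = [x / piv for x in m[r]]  # scale pivot row so the pivot is 1
        for j in range(nrows):
            if j != r and m[j][c] != 0:  # clear column c in every other row
                f = m[j][c]
                m[j] = [a - f * b for a, b in zip(m[j], m[r])]
        r += 1
    return m

# Columns: u1, u2 independent; v1 = u1 + u2 and v2 = u1 - u2 lie in span{u1, u2}.
# The matrix ( u1 u2 v1 v2 ) written row-wise:
M = [[1, 0, 1,  1],
     [0, 1, 1, -1],
     [1, 1, 2,  0]]
R = rref(M)
# The first k = 2 columns reduce to e1, e2, and the trailing columns
# (the "w" columns of the proof) are zero below row k.
```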

This is a very important theorem, so here is yet another proof of it.

Theorem 8.5.15 Let V be a subspace and suppose {u1, · · · ,uk} and {v1, · · · ,vm} are two bases for V . Then k = m.

Proof: Suppose k > m. Then since the vectors {u1, · · · ,uk} span V, there exist scalars ci j such that

\[
\sum_{i=1}^{m} c_{ij} v_i = u_j.
\]
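This relation can be restated in matrix form; the notation below (U, V, C) is not in the text and is introduced only for illustration. Writing the basis vectors as columns, the relation says the matrix U factors through V:

```latex
U = V C, \qquad
U = \begin{pmatrix} u_1 & \cdots & u_k \end{pmatrix}, \quad
V = \begin{pmatrix} v_1 & \cdots & v_m \end{pmatrix}, \quad
C = (c_{ij}) \in F^{m \times k}.
```

In this form, any nonzero d with Cd = 0 gives Ud = V(Cd) = 0, a dependence relation among the uj.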

Therefore,

\[
\sum_{j=1}^{k} d_j u_j = 0
\]

if and only if

\[
\sum_{j=1}^{k} \sum_{i=1}^{m} c_{ij} d_j v_i = 0
\]

if and only if

\[
\sum_{i=1}^{m} \left( \sum_{j=1}^{k} c_{ij} d_j \right) v_i = 0.
\]

Now since {v1, · · · ,vm} is independent, this happens if and only if

\[
\sum_{j=1}^{k} c_{ij} d_j = 0, \qquad i = 1, 2, \cdots, m.
\]

However, this is a system of m equations in the k variables d1, · · · ,dk, and m < k. Therefore, there exists a solution to this system of equations in which not all the d j are equal to zero.
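The existence of such a nontrivial solution can be sketched computationally. The Python helper below is my own illustration, not the text's construction: it row reduces C, sets one free variable to 1, and back-substitutes the pivot variables. The concrete matrix C (m = 2 equations, k = 3 unknowns) and the basis vectors v1, v2 are hypothetical choices; with them, the solution d is verified to give a dependence relation among the uj.

```python
from fractions import Fraction

def nontrivial_null_vector(C):
    """Find d != 0 with C d = 0; assumes C has more columns than rows."""
    R = [[Fraction(x) for x in row] for row in C]
    nrows, ncols = len(R), len(R[0])
    pivots = []  # (row, column) of each pivot after reduction
    r = 0
    for c in range(ncols):
        if r == nrows:
            break
        i = next((i for i in range(r, nrows) if R[i][c] != 0), None)
        if i is None:
            continue  # column c is free
        R[i], R[r] = R[r], R[i]
        piv = R[r][c]
        R[r] = [x / piv for x in R[r]]
        for j in range(nrows):
            if j != r and R[j][c] != 0:
                f = R[j][c]
                R[j] = [a - f * b for a, b in zip(R[j], R[r])]
        pivots.append((r, c))
        r += 1
    pivot_cols = {c for _, c in pivots}
    free = next(c for c in range(ncols) if c not in pivot_cols)  # exists since ncols > nrows
    d = [Fraction(0)] * ncols
    d[free] = Fraction(1)            # set one free variable to 1 ...
    for row, col in pivots:
        d[col] = -R[row][free]       # ... and back-substitute the pivot variables
    return d

# Hypothetical coefficients c_ij with m = 2 rows and k = 3 columns:
C = [[1, 2, 3],
     [0, 1, 1]]
d = nontrivial_null_vector(C)        # a nonzero solution of C d = 0

# Check the conclusion: with v1, v2 a basis and u_j = sum_i c_ij v_i,
# the same d gives sum_j d_j u_j = 0, a dependence among the u_j.
v = [(1, 0, 0), (0, 1, 0)]
u = [tuple(sum(C[i][j] * v[i][t] for i in range(2)) for t in range(3)) for j in range(3)]
combo = tuple(sum(d[j] * u[j][t] for j in range(3)) for t in range(3))
```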

Recall why this is so. The augmented matrix for the system is of the form

\[
\begin{pmatrix} C & 0 \end{pmatrix}
\]

where