16.3. LINEAR INDEPENDENCE AND BASES 377

The next theorem is called the exchange theorem. It is very important that you understand this theorem. There are two kinds of people who go further in linear algebra: those who understand this theorem and its corollary presented later, and those who don't. Those who do understand these theorems are able to proceed and learn more linear algebra, while those who don't are doomed to wander in the wilderness of confusion and sink into the swamp of despair. Therefore, I am giving multiple proofs. Try to understand at least one of them. Several amount to the same thing, just worded differently. Before giving the proof, here is some important notation.

Notation 16.3.4 Let $w_{ij} \in V$, a vector space, and let $1 \le i \le r$ while $1 \le j \le s$. Thus these vectors can be listed in a rectangular array:
$$
\begin{array}{cccc}
w_{11} & w_{12} & \cdots & w_{1s} \\
w_{21} & w_{22} & \cdots & w_{2s} \\
\vdots & \vdots &        & \vdots \\
w_{r1} & w_{r2} & \cdots & w_{rs}
\end{array}
$$
Then $\sum_{j=1}^{s} \sum_{i=1}^{r} w_{ij}$ means to sum the vectors in each column and then to add the $s$ sums which result, while $\sum_{i=1}^{r} \sum_{j=1}^{s} w_{ij}$ means to sum the vectors in each row and then to add the $r$ sums which result. Either way you simply get the sum of all the vectors in the above array. This is because, from the vector space axioms, you can add vectors in any order and you get the same answer.
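The claim that both orders of summation give the same total can be checked numerically. A minimal sketch (assuming Python with NumPy, which is not part of the text), with the $w_{ij}$ taken as vectors in $\mathbb{R}^3$:

```python
import numpy as np

# Illustration of Notation 16.3.4 (hypothetical example, not from the text):
# an r x s array of vectors w_ij, here vectors in R^3 with r = 2, s = 3.
rng = np.random.default_rng(0)
r, s, dim = 2, 3, 3
w = rng.integers(-5, 5, size=(r, s, dim))  # w[i, j] is the vector w_ij

# Column-first: sum each of the s columns, then add the s resulting vectors.
col_then_row = sum(sum(w[i, j] for i in range(r)) for j in range(s))
# Row-first: sum each of the r rows, then add the r resulting vectors.
row_then_col = sum(sum(w[i, j] for j in range(s)) for i in range(r))

assert np.array_equal(col_then_row, row_then_col)  # same total either way
```

Both loops visit every vector in the array exactly once, so by commutativity and associativity of vector addition the totals agree.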

Theorem 16.3.5 Let $\{x_1, \cdots, x_r\}$ be a linearly independent set of vectors such that each $x_i$ is in $\mathrm{span}\{y_1, \cdots, y_s\}$. Then $r \le s$.
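Before the proofs, a numerical sanity check of the theorem may help (an illustration of mine, assuming NumPy; it is not part of the text). Vectors lying in the span of $s = 2$ vectors form a collection of rank at most 2, so no 3 of them can be linearly independent:

```python
import numpy as np

# Sanity check of Theorem 16.3.5 with s = 2, r = 3 in R^4 (illustrative only).
rng = np.random.default_rng(1)
Y = rng.standard_normal((4, 2))   # columns y1, y2 span a 2-dimensional subspace
C = rng.standard_normal((2, 3))   # arbitrary coefficients
X = Y @ C                         # columns x1, x2, x3, each in span{y1, y2}

# rank(X) <= rank(Y) <= s = 2 < 3 = r, so {x1, x2, x3} must be dependent.
assert np.linalg.matrix_rank(X) <= 2
```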

Proof 1: Let
$$
x_k = \sum_{j=1}^{s} a_{jk} y_j.
$$
If $r > s$, then the $s \times r$ matrix $A = (a_{jk})$ has more columns than rows. By Corollary 8.2.8 one of these columns is a linear combination of the others. This implies there exist scalars $c_1, \cdots, c_r$, not all zero, such that
$$
\sum_{k=1}^{r} a_{jk} c_k = 0, \quad j = 1, \cdots, s.
$$
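The step being used here, that a matrix with more columns than rows has linearly dependent columns and hence a nonzero null vector, can be illustrated numerically (a sketch assuming NumPy; the text's own justification is Corollary 8.2.8). Here the null vector plays the role of $(c_1, \cdots, c_r)$:

```python
import numpy as np

# A 2 x 3 matrix (s = 2 rows, r = 3 columns): more columns than rows,
# so its columns are dependent and A c = 0 has a nonzero solution c.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Since rank(A) <= 2 < 3, the last right-singular vector lies in the null space.
_, sing, Vt = np.linalg.svd(A)
c = Vt[-1]                        # candidate nonzero vector with A c = 0

assert not np.allclose(c, 0)      # c is nonzero ...
assert np.allclose(A @ c, 0)      # ... yet A c = 0
```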

Then
$$
\sum_{k=1}^{r} c_k x_k = \sum_{k=1}^{r} c_k \sum_{j=1}^{s} a_{jk} y_j = \sum_{j=1}^{s} \overbrace{\left( \sum_{k=1}^{r} c_k a_{jk} \right)}^{=0} y_j = 0,
$$
which contradicts the assumption that $\{x_1, \cdots, x_r\}$ is linearly independent. Hence $r \le s$. ■

Proof 2: Define $\mathrm{span}(y_1, \cdots, y_s) \equiv V$. It follows there exist scalars $c_1, \cdots, c_s$ such that
$$
x_1 = \sum_{i=1}^{s} c_i y_i. \tag{16.5}
$$