400 CHAPTER 18. LINEAR FUNCTIONS
18.5 Linear Independence

Now here is a very fundamental definition.
Definition 18.5.1 Let $\{u_1, \dots, u_r\}$ be vectors in $F^p$. They are independent if and only if the only solution to the system of equations
\[
\begin{pmatrix} u_1 & \cdots & u_r \end{pmatrix} x = 0
\]
is $x = 0$. In other words, to say the vectors are independent means that whenever
\[
\sum_{i=1}^{r} x_i u_i = 0,
\]
it follows that each $x_i = 0$. The set of vectors is dependent if it is not independent.
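As a numerical sketch (not part of the text; NumPy and the helper name `is_independent` are assumptions here), the definition can be tested by checking whether the matrix $(u_1 \cdots u_r)$ has rank $r$, which is equivalent to the homogeneous system having only the trivial solution:

```python
import numpy as np

def is_independent(vectors):
    """Return True if the given vectors (each a 1-D array) are
    linearly independent, i.e. the only solution of
    (u1 ... ur) x = 0 is x = 0."""
    A = np.column_stack(vectors)          # columns are the u_i
    # Independent <=> rank of the column matrix equals the number of vectors.
    return np.linalg.matrix_rank(A) == len(vectors)

u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])
u3 = np.array([1.0, 1.0, 0.0])   # u3 = u1 + u2, so {u1, u2, u3} is dependent

print(is_independent([u1, u2]))       # True
print(is_independent([u1, u2, u3]))   # False
```

Note that this check also confirms the remark below about lists containing the zero vector: a zero column always drops the rank below the number of vectors.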
Note that any list of vectors containing the zero vector is automatically linearly dependent. Indeed, you could multiply the zero vector by 1 and all the others by 0. Adding these together, you would have a linear combination of the vectors in your list which equals 0, although not all of the scalars used are 0. There is a fundamental result in the case where m < n, in which the matrix A of the linear transformation has more columns than rows.
Theorem 18.5.2 Let A be an m × n matrix where m < n. Then N(A) contains nonzero vectors.
Proof: Since the matrix has more columns than rows, it can have at most m pivot columns, so at least one of its n columns is not a pivot column. Without loss of generality, A has a nonzero column. Pick the first non-pivot column of A, called $a_r$. Then there exist scalars $c_i$ such that $a_r = \sum_{i=1}^{r-1} c_i a_i$. Therefore,
\[
0 = \sum_{i=1}^{r-1} c_i a_i - a_r
= \begin{pmatrix} a_1 & \cdots & a_{r-1} & a_r & \cdots & a_n \end{pmatrix}
\begin{pmatrix} c_1 \\ \vdots \\ c_{r-1} \\ -1 \\ 0 \\ \vdots \\ 0 \end{pmatrix},
\]
and the vector on the right is nonzero because its r-th entry is −1. Thus N(A) contains a nonzero vector. ∎
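The theorem can be illustrated numerically. The following sketch (an assumption, not the book's method: it finds a null vector via the SVD rather than via pivot columns) produces a nonzero vector in N(A) for a matrix with more columns than rows:

```python
import numpy as np

def nonzero_null_vector(A):
    """For an m x n matrix A with m < n, return a nonzero vector x
    with A @ x ~ 0, as guaranteed by Theorem 18.5.2.  The right
    singular vectors belonging to zero singular values span N(A)."""
    m, n = A.shape
    assert m < n, "theorem applies when there are more columns than rows"
    _, _, Vt = np.linalg.svd(A)
    # With m < n there are at most m nonzero singular values, so the
    # last right singular vector (last row of Vt) lies in N(A).
    return Vt[-1]

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])       # 2 x 3, so N(A) != {0}
x = nonzero_null_vector(A)
print(np.allclose(A @ x, 0))          # True
print(np.linalg.norm(x) > 0)          # True: x is a nonzero vector
```

The returned vector has unit length, so it is guaranteed to be nonzero even though A @ x is (numerically) zero.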
Note that the same conclusion holds more generally whenever some column is not a pivot column, even in the case m ≥ n.

With this preparation, here is a major theorem.
Theorem 18.5.3 Suppose you have vectors $\{u_1, \dots, u_r\}$ and that this set of vectors is independent. Suppose also that there are vectors $\{v_1, \dots, v_s\}$ and that each $u_j$ is a linear combination of the vectors $\{v_1, \dots, v_s\}$. Then r ≤ s. A little less precisely, spanning sets are at least as long as linearly independent sets.
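Before the proof, here is a hypothetical numerical illustration of the theorem (the specific vectors are made up for the example): three vectors built from only two spanning vectors can never be independent, since that would force r = 3 ≤ s = 2.

```python
import numpy as np

# Two spanning vectors in R^4.
v1 = np.array([1.0, 0.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0, 3.0])

# Each u_j is a linear combination of {v1, v2}  (so s = 2, r = 3).
u1 = 2 * v1 + v2
u2 = v1 - 3 * v2
u3 = 5 * v1 + 4 * v2

U = np.column_stack([u1, u2, u3])
# The columns of U lie in span{v1, v2}, so rank(U) <= 2 < 3:
# the three vectors u1, u2, u3 must be dependent.
print(np.linalg.matrix_rank(U))   # 2
```

Since the rank is at most s = 2, no choice of coefficients could make three such vectors independent, which is exactly the inequality r ≤ s.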