
each determining a solution as described above. Letting $x_k(t)$ denote the solution which comes from the chain $(v_1,v_2,\cdots,v_k)$ and the above formula involving a sum, it follows that $x_k(0) = v_k$. Thus if you consider the solutions coming from the chains

\[
v_1,\ (v_1,v_2),\ \cdots,\ (v_1,v_2,\cdots,v_m)
\]

and consider the vectors obtained by letting $t = 0$, this results in the ordered list of vectors

\[
v_1, v_2, \cdots, v_m
\]

This is a linearly independent set of vectors. Suppose for some $l \leq m$

\[
\sum_{k=1}^{l} c_k v_k = 0
\]

where not all the $c_k = 0$ and $l$ is as small as possible for this to occur. Then since $v_1 \neq 0$, it must be that $l \geq 2$. Also, $c_l \neq 0$. Apply $A - \lambda I$ to both sides. This gives

\[
0 = \sum_{k=2}^{l} c_k v_{k-1} = \sum_{k=1}^{l-1} c_{k+1} v_k
\]

Since $c_l \neq 0$, this is a nontrivial linear relation involving only $v_1, \cdots, v_{l-1}$, and so $l$ was not as small as possible after all. Thus the set must be linearly independent.
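The discussion above can be checked numerically on a small example. The following sketch is not part of the text: the matrix $A$, the eigenvalue, the chain length $m = 3$, and the use of NumPy/SciPy are choices made here for illustration, and it assumes the sum formula referred to above has the standard form $x_k(t) = e^{\lambda t}\sum_{i=0}^{k-1} \frac{t^i}{i!} v_{k-i}$. It manufactures a chain, verifies the chain relations and the linear independence of $v_1, \cdots, v_m$, and confirms that $x_k(0) = v_k$ and that $x_k$ agrees with the solution $e^{tA}v_k$ of $x' = Ax$.

```python
# Illustrative sketch only: A, lam, and m are invented for this check.
import numpy as np
from scipy.linalg import expm
from math import factorial

lam = 2.0
m = 3
# Build A = P J P^{-1} from a single m x m Jordan block J, so the columns
# of P form a chain v_1, ..., v_m based on an eigenvector for lam.
J = lam * np.eye(m) + np.diag(np.ones(m - 1), k=1)
rng = np.random.default_rng(0)
P = rng.standard_normal((m, m))
A = P @ J @ np.linalg.inv(P)
v = [P[:, j] for j in range(m)]          # v[0] = v_1, ..., v[m-1] = v_m

# Chain relations: (A - lam I) v_1 = 0 and (A - lam I) v_j = v_{j-1}.
N = A - lam * np.eye(m)
assert np.allclose(N @ v[0], 0)
assert all(np.allclose(N @ v[j], v[j - 1]) for j in range(1, m))

# The chain is linearly independent: the matrix whose columns are
# v_1, ..., v_m has full rank.
assert np.linalg.matrix_rank(np.column_stack(v)) == m

# Solution built from the chain (v_1, ..., v_k), assuming the standard
# formula x_k(t) = e^{lam t} * sum_{i=0}^{k-1} t^i / i! * v_{k-i}.
def x(k, t):
    return np.exp(lam * t) * sum(t**i / factorial(i) * v[k - 1 - i]
                                 for i in range(k))

# x_k(0) = v_k, and x_k(t) agrees with e^{tA} v_k, the solution of x' = Ax.
t = 0.7
for k in range(1, m + 1):
    assert np.allclose(x(k, 0.0), v[k - 1])
    assert np.allclose(x(k, t), expm(t * A) @ v[k - 1])
print("chain relations, independence, and x_k(0) = v_k all verified")
```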

Note that for $C(\lambda_k)$, a chain based on an eigenvector corresponding to $\lambda_k$,

\[
A : \operatorname{span}(C(\lambda_k)) \to \operatorname{span}(C(\lambda_k))
\]

Letting $A_k$ be the linear transformation which is the restriction of $A$ to $\operatorname{span}(C(\lambda_k))$, what is the matrix of $A_k$ with respect to the ordered basis

\[
(v_1,v_2,\cdots,v_m) = C(\lambda_k)?
\]

By the definition of a chain,
\[
(A - \lambda_k I)v_j = v_{j-1}, \quad j > 1
\]
while $(A - \lambda_k I)v_1 = 0$. Then formally, the matrix of $A_k$ is given by $M$ where
\[
\begin{pmatrix} \lambda_k v_1 & v_1 + \lambda_k v_2 & \cdots & v_{m-1} + \lambda_k v_m \end{pmatrix}
= \begin{pmatrix} v_1 & v_2 & \cdots & v_m \end{pmatrix} M
\]

It follows that $M$ is of the form
\[
\begin{pmatrix}
\lambda_k & 1 & & \\
& \lambda_k & \ddots & \\
& & \ddots & 1 \\
& & & \lambda_k
\end{pmatrix},
\]

a matrix which has all zeros except for $\lambda_k$ down the main diagonal and 1 down the super diagonal. This is called a Jordan block corresponding to the eigenvalue $\lambda_k$.
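As a quick illustration (not part of the text), the following sketch manufactures a matrix $A$ together with a chain by conjugating a Jordan block; the eigenvalue, the size $m = 4$, and the random choice of chain vectors are invented here. It then computes the matrix of the restriction in the chain basis, namely $V^{-1}AV$ where the columns of $V$ are $v_1, \cdots, v_m$, and checks that it is exactly the Jordan block just described.

```python
# Illustrative sketch only: lam, m, and the chain are invented for this check.
import numpy as np

lam, m = -1.5, 4
block = lam * np.eye(m) + np.diag(np.ones(m - 1), k=1)   # expected Jordan block

# Manufacture A together with a chain v_1, ..., v_m by conjugating the block,
# so that (A - lam I) v_1 = 0 and (A - lam I) v_j = v_{j-1}.
rng = np.random.default_rng(1)
V = rng.standard_normal((m, m))          # columns are the chain vectors
A = V @ block @ np.linalg.inv(V)

# Column by column, (A v_1 | ... | A v_m) = (lam v_1 | v_1 + lam v_2 | ...)
# equals (v_1 | ... | v_m) M, so the matrix of the restriction is V^{-1} A V.
M = np.linalg.solve(V, A @ V)
assert np.allclose(M, block)
print(np.round(M, 6))    # lam on the diagonal, 1 on the super diagonal
```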

It can be proved that there are chains $\{C(\lambda_k)\}_{k=1}^{r}$ of such vectors associated with

each eigenvalue $\lambda_k$ such that the totality of these vectors forms a basis for $\mathbb{C}^n$. Then you