12.2. SCHUR’S THEOREM

The constants c_{ii} are the eigenvalues of L. Thus the matrix whose ij-th entry is c_{ij} is upper triangular.

Proof: If dim (H) = 1, let H = span (w) where |w| = 1. Then Lw = kw for some scalar k. Then

L = k w ⊗ w

because by definition, (w ⊗ w)(w) = w. Therefore, the theorem holds if H is one dimensional.
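This formula determines L completely because H = span (w): any v ∈ H has the form v = αw, so, using only linearity and the stated property (w ⊗ w)(w) = w,

(k w ⊗ w)(αw) = αk (w ⊗ w)(w) = αk w = α Lw = L(αw),

and hence L and k w ⊗ w agree on every vector of H.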

Now suppose the theorem holds whenever the dimension is n − 1, and let dim (H) = n. Let w_n be an eigenvector for L^*. Dividing by its length, it can be assumed that |w_n| = 1. Say L^* w_n = µ w_n. Using the Gram-Schmidt process, there exists an orthonormal basis for H of the form {v_1, · · · , v_{n−1}, w_n}. Then

(Lv_k, w_n) = (v_k, L^* w_n) = (v_k, µ w_n) = 0,

which shows L : H_1 ≡ span (v_1, · · · , v_{n−1}) → span (v_1, · · · , v_{n−1}).
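The displayed statement follows by expanding Lv_k in the orthonormal basis {v_1, · · · , v_{n−1}, w_n}: the component along w_n is the inner product just computed, which is zero. Writing the coefficients with the inner product linear in its first argument,

Lv_k = ∑_{i=1}^{n−1} (Lv_k, v_i) v_i + (Lv_k, w_n) w_n = ∑_{i=1}^{n−1} (Lv_k, v_i) v_i,    1 ≤ k ≤ n − 1,

so Lv_k ∈ span (v_1, · · · , v_{n−1}) for each such k.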

Denote by L_1 the restriction of L to H_1. Since H_1 has dimension n − 1, the induction hypothesis yields an orthonormal basis {w_1, · · · , w_{n−1}} for H_1 such that

L_1 = ∑_{j=1}^{n−1} ∑_{i=1}^{j} c_{ij} w_i ⊗ w_j.    (12.3)

Then {w_1, · · · , w_n} is an orthonormal basis for H because every vector in span (v_1, · · · , v_{n−1}) has inner product 0 with w_n; in particular, this is true for the vectors {w_1, · · · , w_{n−1}}. Now define c_{in} to be the scalars satisfying

Lw_n ≡ ∑_{i=1}^{n} c_{in} w_i    (12.4)

and let

B ≡ ∑_{j=1}^{n} ∑_{i=1}^{j} c_{ij} w_i ⊗ w_j.
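The verifications below use how each term of B acts on a basis vector. Since {w_1, · · · , w_n} is orthonormal, w_i ⊗ w_j sends w_k to w_i when j = k and to 0 otherwise; that is,

(w_i ⊗ w_j)(w_k) = δ_{kj} w_i,

where δ_{kj} denotes the Kronecker delta appearing in the next two computations.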

Then by 12.4,

Bw_n = ∑_{j=1}^{n} ∑_{i=1}^{j} c_{ij} w_i δ_{nj} = ∑_{i=1}^{n} c_{in} w_i = Lw_n.

If 1 ≤ k ≤ n − 1,

Bw_k = ∑_{j=1}^{n} ∑_{i=1}^{j} c_{ij} w_i δ_{kj} = ∑_{i=1}^{k} c_{ik} w_i,

while from 12.3,

Lw_k = L_1 w_k = ∑_{j=1}^{n−1} ∑_{i=1}^{j} c_{ij} w_i δ_{jk} = ∑_{i=1}^{k} c_{ik} w_i.

Since L = B on the basis {w_1, · · · , w_n}, it follows that L = B.

It remains to verify that the constants c_{kk} are the eigenvalues of L, that is, the solutions of the equation det (λI − L) = 0. However, the definition of det (λI − L) is the same as

det (λI − C)
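Here it is natural to take C to be the matrix whose ij-th entry is c_{ij}, the upper triangular matrix referred to in the statement of the theorem. For an upper triangular matrix, the determinant is the product of the diagonal entries, so

det (λI − C) = (λ − c_{11})(λ − c_{22}) · · · (λ − c_{nn}),

and the solutions of det (λI − L) = 0 are exactly the constants c_{kk}.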
