422 CHAPTER 19. EIGENVALUES AND EIGENVECTORS

What does it mean to say that $U^TU = I$, which is the definition for $U$ to be orthogonal? This says that for $U = \begin{pmatrix} u_1 & \cdots & u_n \end{pmatrix}$,
\[
U^T = \begin{pmatrix} u_1^T \\ \vdots \\ u_n^T \end{pmatrix}
\]
and so, from the way we multiply matrices, in which the $ij^{th}$ entry of the product is the product of the $i^{th}$ row of the matrix on the left with the $j^{th}$ column of the matrix on the right, we have $u_i^T u_j = \delta_{ij}$. In other words, the columns of $U$ are orthonormal. From this simple observation, the following theorem is obtained.

Theorem 19.8.3 Let $\{u_1, \cdots, u_n\}$ be orthonormal. Then it is linearly independent.

Proof: We know from the above discussion that $U = \begin{pmatrix} u_1 & \cdots & u_n \end{pmatrix}$ is orthogonal. Thus if $Ux = 0$, you can multiply on the left on both sides by $U^T$ and obtain $x = U^TUx = U^T0 = 0$. Thus, from the definition of linear independence, Definition 18.5.1, it follows that the columns of $U$ comprise an independent set of vectors. $\blacksquare$
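The argument above can be checked numerically. The sketch below, in pure Python with illustrative helper names not taken from the text, uses the standard $2\times 2$ rotation matrix, whose columns are orthonormal, and verifies that $U^T U x = x$, which is exactly why $Ux = 0$ forces $x = 0$:

```python
# Numeric check that U^T U x = x when the columns of U are orthonormal.
# U is the 2x2 rotation matrix; helper names are illustrative.
from math import cos, sin, pi

t = pi / 5
U = [[cos(t), -sin(t)],
     [sin(t),  cos(t)]]  # columns u1, u2 are orthonormal

def matvec(A, x):
    # standard matrix-vector product: (Ax)_i = sum_k A[i][k] x[k]
    return [sum(A[i][k] * x[k] for k in range(len(x))) for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

x = [3.0, -2.0]
y = matvec(transpose(U), matvec(U, x))  # computes U^T (U x)
assert all(abs(yi - xi) < 1e-12 for yi, xi in zip(y, x))
```

Since $U^T U x$ recovers $x$ for every $x$, the only solution of $Ux = 0$ is $x = 0$, which is the linear independence of the columns.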

The proof of the following theorem is based on the Gram Schmidt process.

Theorem 19.8.4 Let $\{x_1, \cdots, x_n\}$ be linearly independent in $\mathbb{R}^p$, $p \geq n$. Then there exist orthonormal vectors $\{u_1, \cdots, u_n\}$ which have the property that for each $k \leq n$, $\mathrm{span}(x_1, \cdots, x_k) = \mathrm{span}(u_1, \cdots, u_k)$.

Proof: Let $u_1 \equiv x_1 / |x_1|$. Thus for $k = 1$, $\mathrm{span}(u_1) = \mathrm{span}(x_1)$ and $\{u_1\}$ is an orthonormal set. Now suppose for some $k < n$, $u_1, \cdots, u_k$ have been chosen such that $(u_j, u_l) = \delta_{jl}$ and $\mathrm{span}(x_1, \cdots, x_k) = \mathrm{span}(u_1, \cdots, u_k)$. Then define
\[
u_{k+1} \equiv \frac{x_{k+1} - \sum_{j=1}^{k} (x_{k+1} \cdot u_j)\, u_j}{\left| x_{k+1} - \sum_{j=1}^{k} (x_{k+1} \cdot u_j)\, u_j \right|}, \tag{19.2}
\]
where the denominator is non-zero because the sum is in $\mathrm{span}(x_1, \cdots, x_k)$ while $x_{k+1}$ is not, by linear independence. Thus by induction,

\[
u_{k+1} \in \mathrm{span}(u_1, \cdots, u_k, x_{k+1}) = \mathrm{span}(x_1, \cdots, x_k, x_{k+1}).
\]

Also, $x_{k+1} \in \mathrm{span}(u_1, \cdots, u_k, u_{k+1})$ from solving 19.2 for $x_{k+1}$, and it follows

\[
\mathrm{span}(x_1, \cdots, x_k, x_{k+1}) = \mathrm{span}(u_1, \cdots, u_k, u_{k+1}).
\]

If $l \leq k$, then letting $C$ denote the reciprocal of the denominator in 19.2,
\[
(u_{k+1} \cdot u_l) = C \left( (x_{k+1} \cdot u_l) - \sum_{j=1}^{k} (x_{k+1} \cdot u_j)(u_j \cdot u_l) \right) = C \left( (x_{k+1} \cdot u_l) - \sum_{j=1}^{k} (x_{k+1} \cdot u_j)\, \delta_{lj} \right)
\]
\[
= C \left( (x_{k+1} \cdot u_l) - (x_{k+1} \cdot u_l) \right) = 0.
\]

The vectors $\{u_j\}_{j=1}^{n}$ generated in this way are therefore orthonormal, because each vector has unit length. $\blacksquare$
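The construction in the proof can be sketched directly as code. The following pure-Python implementation (function names are illustrative, not from the text) follows 19.2 step by step: subtract the projections onto the previously chosen $u_j$, then divide by the norm, and finally check that the resulting vectors satisfy $u_i \cdot u_j = \delta_{ij}$:

```python
# A minimal sketch of the Gram-Schmidt process from Theorem 19.8.4,
# in pure Python. Assumes the input vectors are linearly independent,
# so the norm in 19.2 is never zero.
from math import sqrt

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def gram_schmidt(xs):
    """Return orthonormal us with span(x_1..x_k) = span(u_1..u_k)."""
    us = []
    for x in xs:
        v = list(x)
        # subtract the projections onto the previously chosen u_j
        for u in us:
            c = dot(x, u)
            v = [vi - c * ui for vi, ui in zip(v, u)]
        norm = sqrt(dot(v, v))  # non-zero by linear independence
        us.append([vi / norm for vi in v])
    return us

us = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
# verify u_i . u_j = delta_ij
for i, ui in enumerate(us):
    for j, uj in enumerate(us):
        target = 1.0 if i == j else 0.0
        assert abs(dot(ui, uj) - target) < 1e-12
```

Note that the loop accumulates the subtraction one projection at a time; since the $u_j$ already built are orthonormal, this is equivalent to subtracting the whole sum in 19.2 at once.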
