420 CHAPTER 19. EIGENVALUES AND EIGENVECTORS

of some important theorems. Recall the inner product or dot product.
$$\mathbf{a}\cdot\mathbf{b}\equiv \sum_{k} a_{k}b_{k}$$
In more advanced contexts, this is usually written as $\langle \mathbf{a},\mathbf{b}\rangle$ or often simply as $(\mathbf{a},\mathbf{b})$ instead of $\mathbf{a}\cdot\mathbf{b}$. Also, the term "inner product" tends to be preferred over "dot product". I will sometimes use the notation $(\mathbf{a},\mathbf{b})$ instead of $\mathbf{a}\cdot\mathbf{b}$ because of this. First is an important relationship between the inner product and the transpose.
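As a quick numerical illustration of the definition above, here is a minimal sketch comparing the component sum with NumPy's built-in dot product (the particular array values are arbitrary):

```python
# Sketch: the inner product (a, b) = sum_k a_k * b_k, computed
# directly from the definition and compared with NumPy's dot product.
import numpy as np

a = np.array([1.0, -2.0, 3.0])
b = np.array([4.0, 0.5, -1.0])

inner = sum(a_k * b_k for a_k, b_k in zip(a, b))  # the definition above
assert np.isclose(inner, np.dot(a, b))            # agrees with np.dot
```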

Proposition 19.6.1 Suppose $\mathbf{a},\mathbf{b}$ are vectors in $\mathbb{R}^{n}$ and $\mathbb{R}^{m}$ respectively and let $A$ be an $m\times n$ matrix. Then $(A\mathbf{a},\mathbf{b})=\left(\mathbf{a},A^{T}\mathbf{b}\right)$.

Proof: From the definition of the inner product,
$$(A\mathbf{a},\mathbf{b})\equiv \sum_{i}(A\mathbf{a})_{i}\,b_{i}=\sum_{i}\sum_{j}A_{ij}a_{j}b_{i}=\sum_{j}\sum_{i}A_{ij}a_{j}b_{i}$$
$$=\sum_{j}\sum_{i}A^{T}_{ji}b_{i}a_{j}=\sum_{j}\left(A^{T}\mathbf{b}\right)_{j}a_{j}=\left(\mathbf{a},A^{T}\mathbf{b}\right)\ \blacksquare$$
In words, the above says that when you take the $A$ across the dot or comma you put a transpose on it and everything works just fine.
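The identity can be checked numerically. The sketch below uses an arbitrary random matrix and random vectors; the sizes $m=3$, $n=4$ are illustrative choices:

```python
# Numerical check of Proposition 19.6.1: (Aa, b) = (a, A^T b)
# for an arbitrary m x n matrix A, a in R^n, b in R^m.
import numpy as np

rng = np.random.default_rng(0)
m, n = 3, 4
A = rng.standard_normal((m, n))   # A is m x n
a = rng.standard_normal(n)        # a in R^n
b = rng.standard_normal(m)        # b in R^m

lhs = np.dot(A @ a, b)            # (Aa, b)
rhs = np.dot(a, A.T @ b)          # (a, A^T b)
assert np.isclose(lhs, rhs)       # equal up to rounding error
```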

There are other more elegant ways to discuss eigenvectors and eigenvalues. See my book on linear algebra and analysis for a presentation which is independent of determinants. However, this is a book on calculus, not linear algebra, and the determinant is important in other contexts. Also, from the point of view of history, the determinant came earlier than the other linear algebra concepts.

19.7 Distance and Orthogonal Matrices

Some matrices preserve lengths of vectors. That is, $|U\mathbf{x}|=|\mathbf{x}|$ for any $\mathbf{x}$ in $\mathbb{R}^{n}$. Such a matrix is called orthogonal. Actually, this is not the standard definition. The standard definition is given next. First recall that if you have two square matrices of the same size and one acts like the inverse of the other on one side, then it will act like the inverse on the other side as well. See, for example, the discussion after Theorem 18.5.12. The traditional definition of orthogonal is as follows.

Definition 19.7.1 Let $U$ be a real $n\times n$ matrix. Then $U$ is called orthogonal if $U^{T}U=UU^{T}=I$.

Then the following proposition relates this to preservation of lengths of vectors.

Proposition 19.7.2 An $n\times n$ matrix $U$ is orthogonal if and only if $|U\mathbf{x}|=|\mathbf{x}|$ for all vectors $\mathbf{x}$.
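As a concrete sketch of this proposition, a $2\times 2$ rotation matrix is orthogonal, and one can verify numerically both that $U^{T}U=I$ and that it preserves lengths (the angle and test vector below are arbitrary choices):

```python
# Sketch of Proposition 19.7.2 with a 2x2 rotation matrix U,
# which satisfies U^T U = I and |Ux| = |x| for every x.
import numpy as np

t = 0.7  # arbitrary rotation angle
U = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

assert np.allclose(U.T @ U, np.eye(2))   # orthogonality: U^T U = I

x = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(U @ x),
                  np.linalg.norm(x))     # length is preserved
```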

Proof: First suppose the matrix $U$ preserves all lengths, so $|U\mathbf{u}|=|\mathbf{u}|$ for every $\mathbf{u}$. Let $\mathbf{u},\mathbf{v}$ be arbitrary vectors in $\mathbb{R}^{n}$ and let $\theta\in\mathbb{R}$, $|\theta|=1$, be chosen so that
$$\theta\left(U^{T}U\mathbf{u}-\mathbf{u},\mathbf{v}\right)=\left|\left(U^{T}U\mathbf{u}-\mathbf{u},\mathbf{v}\right)\right|.$$
Therefore, from the axioms of the inner product and Proposition 19.6.1,
$$|\mathbf{u}|^{2}+|\mathbf{v}|^{2}+2\theta\left(\mathbf{u},\mathbf{v}\right)=|\theta\mathbf{u}|^{2}+|\mathbf{v}|^{2}+\theta\left(\mathbf{u},\mathbf{v}\right)+\theta\left(\mathbf{v},\mathbf{u}\right)$$
$$=|\theta\mathbf{u}+\mathbf{v}|^{2}=\left(U\left(\theta\mathbf{u}+\mathbf{v}\right),U\left(\theta\mathbf{u}+\mathbf{v}\right)\right)$$
