
Definition 8.3.11 An $n \times n$ matrix $A$ is diagonalizable if there exists an invertible $n \times n$ matrix $S$ such that $S^{-1}AS = D$, where $D$ is a diagonal matrix. Thus $D$ has zero entries everywhere except on the main diagonal. Write $\mathrm{diag}\left(\lambda_1, \cdots, \lambda_n\right)$ to denote the diagonal matrix having the $\lambda_i$ down the main diagonal.
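As a brief illustration of this notation,
\[
\mathrm{diag}\left(2, 5, 7\right) = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 7 \end{pmatrix},
\]
and any diagonal matrix is trivially diagonalizable by taking $S = I$.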

The following theorem is of great significance.

Theorem 8.3.12 Let $A$ be an $n \times n$ matrix. Then $A$ is diagonalizable if and only if $\mathbb{F}^n$ has a basis of eigenvectors of $A$. In this case, $S$ of Definition 8.3.11 consists of the $n \times n$ matrix whose columns are the eigenvectors of $A$, and $D = \mathrm{diag}\left(\lambda_1, \cdots, \lambda_n\right)$.

Proof: Suppose first that $\mathbb{F}^n$ has a basis of eigenvectors, $\{\mathbf{v}_1, \cdots, \mathbf{v}_n\}$ where $A\mathbf{v}_i = \lambda_i \mathbf{v}_i$. Then let $S$ denote the matrix $\begin{pmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{pmatrix}$ and let
\[
S^{-1} \equiv \begin{pmatrix} \mathbf{u}_1^T \\ \vdots \\ \mathbf{u}_n^T \end{pmatrix}
\quad \text{where} \quad
\mathbf{u}_i^T \mathbf{v}_j = \delta_{ij} \equiv
\begin{cases}
1 & \text{if } i = j \\
0 & \text{if } i \neq j.
\end{cases}
\]
$S^{-1}$ exists because $S$ has rank $n$. Then from block multiplication,

\[
S^{-1}AS = \begin{pmatrix} \mathbf{u}_1^T \\ \vdots \\ \mathbf{u}_n^T \end{pmatrix}
\begin{pmatrix} A\mathbf{v}_1 & \cdots & A\mathbf{v}_n \end{pmatrix}
= \begin{pmatrix} \mathbf{u}_1^T \\ \vdots \\ \mathbf{u}_n^T \end{pmatrix}
\begin{pmatrix} \lambda_1 \mathbf{v}_1 & \cdots & \lambda_n \mathbf{v}_n \end{pmatrix}
= \begin{pmatrix}
\lambda_1 & 0 & \cdots & 0 \\
0 & \lambda_2 & \ddots & \vdots \\
\vdots & \ddots & \ddots & 0 \\
0 & \cdots & 0 & \lambda_n
\end{pmatrix} = D.
\]

Next suppose $A$ is diagonalizable, so $S^{-1}AS = D \equiv \mathrm{diag}\left(\lambda_1, \cdots, \lambda_n\right)$. Then the columns of $S$ form a basis because $S^{-1}$ is given to exist. It only remains to verify that these columns of $S$ are eigenvectors. But letting $S = \begin{pmatrix} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{pmatrix}$, $AS = SD$ and so
\[
\begin{pmatrix} A\mathbf{v}_1 & \cdots & A\mathbf{v}_n \end{pmatrix} = \begin{pmatrix} \lambda_1 \mathbf{v}_1 & \cdots & \lambda_n \mathbf{v}_n \end{pmatrix},
\]
which shows that $A\mathbf{v}_i = \lambda_i \mathbf{v}_i$. ■
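To illustrate the construction in the proof with a small worked example, consider the matrix
\[
A = \begin{pmatrix} 1 & 2 \\ 0 & 3 \end{pmatrix},
\]
which has eigenvalues $\lambda_1 = 1$ and $\lambda_2 = 3$ with corresponding eigenvectors $\mathbf{v}_1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$ and $\mathbf{v}_2 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$. Placing these eigenvectors as the columns of $S$ gives
\[
S = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}, \quad
S^{-1} = \begin{pmatrix} 1 & -1 \\ 0 & 1 \end{pmatrix}, \quad
S^{-1}AS = \begin{pmatrix} 1 & 0 \\ 0 & 3 \end{pmatrix} = \mathrm{diag}\left(1, 3\right),
\]
exactly as the theorem asserts.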

It makes sense to speak of the determinant of a linear transformation as described in the following corollary.

Corollary 8.3.13 Let $L \in \mathcal{L}\left(V, V\right)$ where $V$ is an $n$ dimensional vector space and let $A$ be the matrix of this linear transformation with respect to a basis on $V$. Then it is possible to define
\[
\det\left(L\right) \equiv \det\left(A\right).
\]

Proof: Each choice of basis for $V$ determines a matrix for $L$ with respect to that basis. If $A$ and $B$ are two such matrices, it follows from Theorem 8.3.9 that
\[
A = S^{-1}BS
\]
and so
\[
\det\left(A\right) = \det\left(S^{-1}\right)\det\left(B\right)\det\left(S\right) = \det\left(B\right),
\]
since $\det\left(S^{-1}\right)\det\left(S\right) = \det\left(S^{-1}S\right) = \det\left(I\right) = 1$. Thus $\det\left(L\right)$ does not depend on the choice of basis. ■
