418 CHAPTER 19. EIGENVALUES AND EIGENVECTORS

where $A_i$ is obtained from $A$ by replacing the $i$th column of $A$ with the column $(y_1, \cdots, y_n)^T$.

Find x  1 2 13 2 12 −3 2

 xyz

=

 123

 .

From Cramer's rule,
$$x = \det\begin{pmatrix} 1 & 2 & 1 \\ 2 & 2 & 1 \\ 3 & -3 & 2 \end{pmatrix} \Big/ \det\begin{pmatrix} 1 & 2 & 1 \\ 3 & 2 & 1 \\ 2 & -3 & 2 \end{pmatrix} = \frac{1}{2}.$$

To find $y$ and $z$, you do something similar, replacing the $y$ or $z$ column with the right-hand side.
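As a check on the arithmetic, the computation above can be sketched in Python. The helpers `det3` and `replace_col` are illustrative names, not from the text; the determinant is expanded by cofactors along the first row, and exact fractions avoid rounding:

```python
from fractions import Fraction

def det3(m):
    # cofactor expansion of a 3x3 determinant along the first row
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def replace_col(m, j, col):
    # return a copy of m with column j replaced by col
    return [[col[r] if c == j else m[r][c] for c in range(3)] for r in range(3)]

A = [[1, 2, 1], [3, 2, 1], [2, -3, 2]]
b = [1, 2, 3]

d = det3(A)  # -14, which is nonzero, so Cramer's rule applies
x, y, z = (Fraction(det3(replace_col(A, j, b)), d) for j in range(3))
print(x, y, z)  # 1/2 -1/7 11/14
```

Substituting these values back into all three equations confirms the solution.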

19.4.2 Finding Eigenvalues Using Determinants

Theorem 19.4.1 says that $A^{-1}$ exists if and only if $\det(A) \neq 0$, in which case there is even a formula for the inverse. Recall also that an eigenvector for $\lambda$ is a nonzero vector $x$ such that $Ax = \lambda x$, where $\lambda$ is called an eigenvalue. Thus you have $(A - \lambda I)x = 0$ for $x \neq 0$. If $(A - \lambda I)^{-1}$ were to exist, then you could multiply by it on the left and obtain $x = 0$ after all. Therefore, it must be the case that $\det(A - \lambda I) = 0$. This yields a polynomial of degree $n$ set equal to 0. This polynomial is called the characteristic polynomial. For example, consider
$$\begin{pmatrix} 1 & -1 & -1 \\ 0 & 3 & 2 \\ 0 & -1 & 0 \end{pmatrix}$$
You need to have
$$\det\left(\begin{pmatrix} 1 & -1 & -1 \\ 0 & 3 & 2 \\ 0 & -1 & 0 \end{pmatrix} - \lambda \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}\right) = 0.$$

The expression on the left equals a polynomial of degree 3 which, when factored, yields
$$(1 - \lambda)(\lambda - 1)(\lambda - 2).$$
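This factorization can be spot-checked numerically. The sketch below (the helper names `det3` and `char_poly` are illustrative, not from the text) evaluates $\det(A - \lambda I)$ directly and confirms that it vanishes at $\lambda = 1$ and $\lambda = 2$:

```python
def det3(m):
    # cofactor expansion of a 3x3 determinant along the first row
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

A = [[1, -1, -1], [0, 3, 2], [0, -1, 0]]

def char_poly(lam):
    # det(A - lam * I), evaluated at a specific value of lam
    M = [[A[r][c] - (lam if r == c else 0) for c in range(3)] for r in range(3)]
    return det3(M)

print(char_poly(1), char_poly(2))  # 0 0 -- so 1 and 2 are eigenvalues
print(char_poly(0))  # det(A) = 2, so 0 is not an eigenvalue
```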

Therefore, the possible eigenvalues are $1, 1, 2$. Note how the 1 is listed twice. This is because it occurs twice as a root of the characteristic polynomial. Also, if $M^{-1}$ does not exist where $M$ is an $n \times n$ matrix, then this means that the columns of $M$ cannot be linearly independent, since if they were, then by Theorem 18.5.12 $M^{-1}$ would exist. Thus if $A - \lambda I$ fails to have an inverse as above, then the columns are not independent and so there exists a nonzero $x$ such that $(A - \lambda I)x = 0$. Thus we have the following proposition.

Proposition 19.4.4 The eigenvalues of an $n \times n$ matrix $A$ are the roots of
$$\det(A - \lambda I) = 0.$$

Corresponding to each of these $\lambda$ is an eigenvector. Every $n \times n$ matrix for $n \geq 1$ has eigenvalues in $\mathbb{C}$ and eigenvectors in $\mathbb{C}^n$.
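For the example above, solving $(A - 2I)x = 0$ by back substitution gives the eigenvector $(1, -2, 1)^T$ for $\lambda = 2$ (this particular vector is worked out here for illustration, not taken from the text). The defining property $Av = 2v$ can be checked directly:

```python
A = [[1, -1, -1], [0, 3, 2], [0, -1, 0]]
v = [1, -2, 1]  # a nonzero solution of (A - 2I)x = 0, found by back substitution

# multiply A by v and compare with 2v
Av = [sum(A[r][c] * v[c] for c in range(3)) for r in range(3)]
print(Av)  # [2, -4, 2], which equals 2 * v, so v is an eigenvector for lambda = 2
```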