Proof: First note that $IB = BI$ for any square matrix $B$. Next note that
$$0 = p(A) = (A-\mu_1 I)\cdots(A-\mu_m I) \qquad (11.2)$$
Also note that
$$(A-\mu I)(A-\lambda I) = A^2 - (\lambda+\mu)A + \mu\lambda I = (A-\lambda I)(A-\mu I)$$
Thus all the factors in the above product 11.2 can be interchanged and thereby placed in any order in the product. We know that for any $y$,
$$(A-\mu_j I)\left[(A-\mu_1 I)\cdots(A-\mu_{j-1} I)(A-\mu_{j+1} I)\cdots(A-\mu_m I)\right] y = 0$$
However, there is some $y_j$ such that
$$(A-\mu_1 I)\cdots(A-\mu_{j-1} I)(A-\mu_{j+1} I)\cdots(A-\mu_m I)\, y_j \neq 0$$
since otherwise, $p(\lambda)$ would not have had smallest degree. Then let
$$x_j = (A-\mu_1 I)\cdots(A-\mu_{j-1} I)(A-\mu_{j+1} I)\cdots(A-\mu_m I)\, y_j$$
By construction, $x_j \neq 0$ and, since the factors commute, $(A-\mu_j I)x_j = 0$, so $x_j$ is an eigenvector of $A$ corresponding to the eigenvalue $\mu_j$. ■
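As a small illustration of the argument (an addition, not part of the text), take $m = 2$, so that $p(\lambda) = (\lambda-\mu_1)(\lambda-\mu_2)$. For every $y$,
$$(A-\mu_1 I)\left[(A-\mu_2 I)\, y\right] = p(A)\, y = 0,$$
and if $x_1 = (A-\mu_2 I)\, y_1 \neq 0$ for some $y_1$, then $(A-\mu_1 I)\, x_1 = 0$, so $x_1$ is an eigenvector for $\mu_1$. The same reasoning with the two factors interchanged produces an eigenvector for $\mu_2$.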
The minimum polynomial can be computed, although it might seem a little tedious. In the above discussion, the minimum polynomial is only known to have degree no more than $n^2$. Actually it can be shown that the degree of the minimum polynomial is never more than $n$, although it might be less than $n$. We will show this later as part of the theory of the determinant, but in the meantime, one should go ahead and use it. Here is an example.
Example 11.1.5 Let
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}$$
Find its minimum polynomial.
The matrices are $I$, $\begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}$, $\begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix}^2$. These will end up being linearly dependent. They are
$$\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},\quad \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix},\quad \begin{pmatrix} 5 & 5 \\ 5 & 10 \end{pmatrix}$$
The polynomial is obtained by finding a linear combination of these equal to 0. Let's make these into column vectors and use row operations.
$$\begin{pmatrix} 1 & 2 & 5 \\ 0 & 1 & 5 \\ 0 & 1 & 5 \\ 1 & 3 & 10 \end{pmatrix}$$
Now we row reduce this to get
$$\begin{pmatrix} 1 & 0 & -5 \\ 0 & 1 & 5 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$$
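To double-check the arithmetic above, here is a minimal computational sketch (an addition, not from the text) in Python using SymPy. It flattens $I$, $A$, $A^2$ into column vectors, stacks them side by side, and row reduces, which should reproduce the reduced matrix just displayed.

import sympy as sp

A = sp.Matrix([[2, 1], [1, 3]])
I = sp.eye(2)

# Flatten I, A, A^2 (row by row) into 4 x 1 column vectors and stack them.
M = sp.Matrix.hstack(I.reshape(4, 1), A.reshape(4, 1), (A**2).reshape(4, 1))

# Row reduce; the third column of the result records the dependence
# A^2 = -5*I + 5*A, exactly as in the reduced matrix above.
rref, pivots = M.rref()
print(rref)    # Matrix([[1, 0, -5], [0, 1, 5], [0, 0, 0], [0, 0, 0]])
print(pivots)  # (0, 1)

SymPy is used here so that the row reduction is carried out in exact arithmetic rather than floating point.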