For V a finite dimensional vector space over the field of scalars F, there exists a direct sum decomposition
V = V_1 ⊕ ⋅⋅⋅ ⊕ V_q
where
V_k = ker(ϕ_k(A)^{m_k})
and ϕ_k(λ) is an irreducible polynomial. Here the minimal polynomial of A was
∏_{k=1}^{q} ϕ_k(λ)^{m_k}
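As a concrete check of this decomposition (my own numerical illustration, not from the text), take a 2 × 2 matrix whose minimal polynomial is (λ − 1)(λ − 2), so ϕ_1(λ) = λ − 1, ϕ_2(λ) = λ − 2 and m_1 = m_2 = 1; the null spaces of ϕ_k(A) then sum directly to all of R^2:

```python
import numpy as np

def null_basis(M, tol=1e-10):
    """Columns form an orthonormal basis of ker(M), computed via the SVD."""
    _, s, vt = np.linalg.svd(M)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# Minimal polynomial of A is (lambda - 1)(lambda - 2)  (example matrix)
A = np.array([[1., 1.],
              [0., 2.]])
I = np.eye(2)
V1 = null_basis(A - I)      # V_1 = ker(phi_1(A)^{m_1}) with m_1 = 1
V2 = null_basis(A - 2 * I)  # V_2 = ker(phi_2(A)^{m_2}) with m_2 = 1
# Direct sum: the combined columns are independent and span R^2
print(np.linalg.matrix_rank(np.hstack([V1, V2])))  # 2
```

Each V_k is one dimensional here, and rank 2 for the combined columns verifies V = V_1 ⊕ V_2.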
Next I will consider the problem of finding a basis for V_{k} such that the matrix of A restricted
to V_{k} assumes various forms.
Definition 9.3.1 Letting x ≠ 0, denote by β_x the vectors
{x, Ax, A^2x, ⋅⋅⋅, A^{m−1}x}
where m is the smallest such that A^m x ∈ span(x, ⋅⋅⋅, A^{m−1}x). This is called an A cyclic set. The vectors which result are also called a Krylov sequence. For such a sequence of vectors, |β_x| ≡ m.
The first thing to notice is that such a Krylov sequence is always linearly independent.
Lemma 9.3.2 Let β_x = {x, Ax, A^2x, ⋅⋅⋅, A^{m−1}x}, x ≠ 0, where m is the smallest such that A^m x ∈ span(x, ⋅⋅⋅, A^{m−1}x). Then {x, Ax, A^2x, ⋅⋅⋅, A^{m−1}x} is linearly independent.
Proof: Suppose that there are scalars a_{k}, not all zero such that
∑_{k=0}^{m−1} a_k A^k x = 0
Then letting a_{r} be the last nonzero scalar in the sum, you can divide by a_{r} and solve for A^{r}x as
a linear combination of the A^{j}x for j < r ≤ m − 1 contrary to the definition of m.
■
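Numerically, the length m of Definition 9.3.1 can be found by appending powers A^k x until the rank of the collected vectors stops growing. A minimal sketch (the function name and example matrix are my own, not from the text):

```python
import numpy as np

def krylov_sequence(A, x, tol=1e-10):
    """Return beta_x = [x, Ax, ..., A^{m-1}x], where m is the smallest
    power with A^m x in span(x, Ax, ..., A^{m-1}x)."""
    vecs = [np.asarray(x, dtype=float)]
    while True:
        nxt = A @ vecs[-1]
        # rank unchanged  <=>  nxt already lies in the span of vecs
        if np.linalg.matrix_rank(np.column_stack(vecs + [nxt]), tol=tol) == len(vecs):
            return vecs
        vecs.append(nxt)

# Example: a nilpotent shift, A e3 = e2, A e2 = e1, A e1 = 0
A = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [0., 0., 0.]])
beta = krylov_sequence(A, np.array([0., 0., 1.]))
print(len(beta))  # |beta_x| = 3
```

Here A^3 x = 0 ∈ span(β_x), so the loop stops with m = 3, and the three vectors are linearly independent, as Lemma 9.3.2 guarantees.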
Now here is a nice lemma which has been pretty much discussed earlier.
Lemma 9.3.3 Suppose W is a subspace of V where V is a finite dimensional vector space, L ∈ ℒ(V,V), and suppose LW = LV. Then V = W + ker(L).
Proof: Let a basis for LV = LW be {Lw_1, ⋅⋅⋅, Lw_m}, w_i ∈ W. Then let y ∈ V. Thus Ly = ∑_{i=1}^{m} c_i Lw_i for some scalars c_i, and so L(y − ∑_{i=1}^{m} c_i w_i) = 0. Hence y − ∑_{i=1}^{m} c_i w_i ∈ ker(L) and
y = ∑_{i=1}^{m} c_i w_i + (y − ∑_{i=1}^{m} c_i w_i) ∈ W + ker(L). ■
For more on the next lemma and the following theorem, see Hoffman and Kunze [15]. I am following the presentation in Friedberg, Insel and Spence [10]. See also Herstein [14] for a different approach to canonical forms. To help organize the ideas in the lemma, here is a diagram.
[Diagram of the subspaces U, W ⊆ ker(ϕ(A)^m) omitted.]
Lemma 9.3.4 Let W be an A invariant (AW ⊆ W) subspace of ker(ϕ(A)^m) for m a positive integer, where ϕ(λ) is an irreducible monic polynomial of degree d. Let U be an A invariant subspace of ker(ϕ(A)).
If {v_1, ⋅⋅⋅, v_s} is a basis for W, then if x ∈ U ∖ W, {v_1, ⋅⋅⋅, v_s, β_x} is linearly independent.
There exist vectors x_1, ⋅⋅⋅, x_p, each in U, such that {v_1, ⋅⋅⋅, v_s, β_{x_1}, ⋅⋅⋅, β_{x_p}} is a basis for U + W.
Also, if x ∈ ker(ϕ(A)^m), then |β_x| = kd where k ≤ m. Here |β_x| is the length of β_x, the degree of the monic polynomial η(λ) satisfying η(A)x = 0 with η(λ) having smallest possible degree.
Proof: Claim: If x ∈ ker(ϕ(A)) and |β_x| denotes the length of β_x, then |β_x| = d, the degree of the irreducible polynomial ϕ(λ), and so β_x = {x, Ax, A^2x, ⋅⋅⋅, A^{d−1}x}. Also span(β_x) is A invariant: A(span(β_x)) ⊆ span(β_x).
Proof of the claim: Let m = |β_x|. That is, there exists monic η(λ) of degree m with η(A)x = 0, and m is as small as possible for this to happen. Then from the usual process of division of polynomials, there exist l(λ) and r(λ) such that
ϕ(λ) = η(λ)l(λ) + r(λ)
where r(λ) = 0 or else r(λ) has degree smaller than m. Then r(A)x = ϕ(A)x − l(A)η(A)x = 0, and minimality of m forces r(λ) = 0. Thus η(λ) divides ϕ(λ), and since ϕ(λ) is irreducible and both polynomials are monic, η(λ) = ϕ(λ). Hence |β_x| = m = d. That span(β_x) is A invariant follows because A^d x ∈ span(β_x). This proves the claim.
Now let x ∈ U ∖ W and suppose
∑_{i=1}^{s} a_i v_i + ∑_{j=0}^{d−1} d_j A^j x = 0
Let z ≡ ∑_{j=0}^{d−1} d_j A^j x = −∑_{i=1}^{s} a_i v_i ∈ W. If z ≠ 0, then ϕ(A)z = ∑_{j=0}^{d−1} d_j A^j ϕ(A)x = 0, so z ∈ ker(ϕ(A)), and z ∈ span(β_x), which is A invariant. By the claim,
d = dim(span(z, Az, ⋅⋅⋅, A^{d−1}z)) ≤ dim(W ∩ span(x, Ax, ⋅⋅⋅, A^{d−1}x))
≤ dim(span(x, Ax, ⋅⋅⋅, A^{d−1}x)) = d
Thus
W ∩ span(x, Ax, ⋅⋅⋅, A^{d−1}x) = span(x, Ax, ⋅⋅⋅, A^{d−1}x)
which would require x ∈ W but this is assumed not to take place. Hence z = 0 and so the linear
independence of the
{v1,⋅⋅⋅,vs}
implies each a_{i} = 0. Then the linear independence
of {x, Ax, ⋅⋅⋅, A^{d−1}x}, which follows from Lemma 9.3.2, shows each d_j = 0. Thus {v_1, ⋅⋅⋅, v_s, x, Ax, ⋅⋅⋅, A^{d−1}x} is linearly independent as claimed.
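The claim can be seen concretely over R (my own example, not from the text): take ϕ(λ) = λ^2 + 1, irreducible over R with d = 2, and let A be rotation by 90°, so that ϕ(A) = A^2 + I = 0 and every nonzero x lies in ker(ϕ(A)); then |β_x| = d = 2 for every such x:

```python
import numpy as np

# phi(lambda) = lambda^2 + 1 is irreducible over R, so d = 2.
# Rotation by 90 degrees satisfies A^2 + I = 0, hence ker(phi(A)) = R^2.
A = np.array([[0., -1.],
              [1., 0.]])
x = np.array([1., 2.])                 # any nonzero x works
beta = np.column_stack([x, A @ x])     # beta_x = {x, Ax}
print(np.linalg.matrix_rank(beta))     # 2, i.e. |beta_x| = d
# A^2 x = -x already lies in span(beta_x), so the sequence stops here
```

Note that no single eigenvector exists over R here; the cyclic set {x, Ax} of length d = 2 is the natural substitute.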
Let x ∈ U ∖ W ⊆ ker(ϕ(A)). Then it was just shown that {v_1, ⋅⋅⋅, v_s, β_x} is linearly independent. Let W_1 be given by
W_1 ≡ span(v_1, ⋅⋅⋅, v_s, β_x)
Then W_{1} is A invariant. If W_{1} equals U + W, then you are done. If not, let W_{1} play the role of
W and pick x_{1}∈ U ∖ W_{1} and repeat the argument. Continue till
span(v_1, ⋅⋅⋅, v_s, β_{x_1}, ⋅⋅⋅, β_{x_p}) = U + W
The process stops because ker(ϕ(A)^m) is finite dimensional.
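For ϕ(λ) = λ this exhaustion can be watched numerically. The sketch below (matrix and generators are my own hypothetical example) has A^2 = 0, so ker(ϕ(A)^2) = R^4, and the union of two Krylov sequences β_{x_1}, β_{x_2} for suitably chosen x_1, x_2 is a cyclic basis:

```python
import numpy as np

def krylov(A, x):
    """beta_x = [x, Ax, ...], stopping once the next power is in the span."""
    vecs = [np.asarray(x, dtype=float)]
    while np.linalg.matrix_rank(np.column_stack(vecs + [A @ vecs[-1]])) > len(vecs):
        vecs.append(A @ vecs[-1])
    return vecs

# phi(lambda) = lambda, A^2 = 0, so ker(phi(A)^2) is all of R^4
A = np.array([[0., 1., 0., 0.],
              [0., 0., 0., 0.],
              [0., 0., 0., 1.],
              [0., 0., 0., 0.]])
x1 = np.array([0., 1., 0., 0.])   # e2: beta_{x1} = {e2, e1}
x2 = np.array([0., 0., 0., 1.])   # e4: beta_{x2} = {e4, e3}
basis = krylov(A, x1) + krylov(A, x2)
print(np.linalg.matrix_rank(np.column_stack(basis)))  # 4: a cyclic basis
```

As in the proof, x_2 is chosen outside the span of β_{x_1}; choosing generators inside an already-covered span would simply be skipped in the exhaustion.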
Finally, letting x ∈ ker(ϕ(A)^m), there is a monic polynomial η(λ) such that η(A)x = 0 and η(λ) is of smallest possible degree, which degree equals |β_x|. As in the claim, η(λ) must divide ϕ(λ)^m, and since ϕ(λ) is irreducible, η(λ) = ϕ(λ)^k for some k ≤ m. Hence |β_x| = kd. ■
With this preparation, here is the main result about a basis for V where A ∈ ℒ(V,V) and the minimal polynomial for A is ϕ(λ)^m for ϕ(λ) an irreducible monic polynomial. There
is a very interesting generalization of this theorem in [15] which pertains to the existence of
complementary subspaces. For an outline of this generalization, see Problem 9 on Page
853.
Theorem 9.3.5 Suppose A ∈ ℒ(V,V) for V some finite dimensional vector space. Then for each k ∈ ℕ, there exists a cyclic basis for ker(ϕ(A)^k) which is of the form β = {β_{x_1}, ⋅⋅⋅, β_{x_p}}, or else ker(ϕ(A)^k) = {0}. Note that if ker(ϕ(A)) ≠ {0}, then the same is true for all ker(ϕ(A)^k), k ∈ ℕ.
Proof: If k = 1, you can use Lemma 9.3.4 and let W = {0} and U = ker(ϕ(A)) to obtain the cyclic basis. Suppose then that the theorem is true for m − 1, m − 1 ≥ 1, meaning that for any finite dimensional vector space V and A ∈ ℒ(V,V), ker(ϕ(A)^k) has a cyclic basis for all k ≤ m − 1. Consider a new vector space ϕ