are each eigenvalues of $A/\lambda_0$, which has only finitely many, and hence this sequence must repeat. Therefore, $\lambda/\lambda_0$ is a root of unity as claimed. This proves the theorem in the case that $v \gg 0$.

Now it is necessary to consider the case where $v > 0$ but it is not the case that $v \gg 0$.
Then in this case, there exists a permutation matrix $P$ such that
$$Pv = \begin{pmatrix} v_1 \\ \vdots \\ v_r \\ 0 \\ \vdots \\ 0 \end{pmatrix} \equiv \begin{pmatrix} u \\ 0 \end{pmatrix} \equiv v_1.$$
Here $P$ may be chosen to be a product of disjoint transpositions, so $P^2 = I$ and $v = P v_1$. Then $\lambda_0 v = A^T v = A^T P v_1$ and, multiplying on the left by $P$, $\lambda_0 v_1 = P A^T P v_1 = G v_1$. The matrix $G \equiv P A^T P$ is similar to $A^T$ and therefore has the same eigenvalues as $A$. Consequently, it suffices from now on to consider the matrix $G$ rather than $A$. Then
$$\lambda_0 \begin{pmatrix} u \\ 0 \end{pmatrix} = \begin{pmatrix} M_1 & M_2 \\ M_3 & M_4 \end{pmatrix} \begin{pmatrix} u \\ 0 \end{pmatrix}$$
where $M_1$ is $r \times r$ and $M_4$ is $(n-r) \times (n-r)$. Block multiplication gives $M_3 u = 0$, and since $u \gg 0$ while the entries of $M_3$ are nonnegative because $A$, and hence $G$, are $> 0$, it follows that $M_3 = 0$. Therefore,
$$G = \begin{pmatrix} A' & B \\ 0 & C \end{pmatrix}.$$
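To illustrate this reduction with a small example, let
$$A = \begin{pmatrix} 2 & 0 \\ 1 & 1 \end{pmatrix}, \qquad A^T = \begin{pmatrix} 2 & 1 \\ 0 & 1 \end{pmatrix}.$$
Here $\lambda_0 = 2$ and $A^T v = 2v$ only for multiples of $v = (1,0)^T$, so $v > 0$ but it is not the case that $v \gg 0$. In this case $P = I$, and $G = A^T$ already has the above block form with $A' = (2)$, $B = (1)$, and $C = (1)$.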
Now let $\lambda$ be an eigenvalue of $G$ such that $|\lambda| = \lambda_0$. Then from Lemma 11.4.8, either $\lambda \in \sigma(A')$ or $\lambda \in \sigma(C)$. Suppose without loss of generality that $\lambda \in \sigma(A')$. Since $A' > 0$, it has a largest positive eigenvalue $\lambda_0'$ which is obtained from 11.11. Thus $\lambda_0' \leq \lambda_0$, but $\lambda$, being an eigenvalue of $A'$, has its absolute value bounded by $\lambda_0'$, and so $\lambda_0 = |\lambda| \leq \lambda_0' \leq \lambda_0$, showing that $\lambda_0 \in \sigma(A')$. Now if there exists $v \gg 0$ such that $A'^T v = \lambda_0 v$, then the first part of this proof applies to the matrix $A'$ and so $\lambda/\lambda_0$ is a root of unity. If such a vector $v$ does not exist, then let $A'$ play the role of $A$ in the above argument and reduce to the consideration of
$$G' \equiv \begin{pmatrix} A'' & B' \\ 0 & C' \end{pmatrix}$$
where $G'$ has the same eigenvalues as $A'$ and $\lambda, \lambda_0 \in \sigma(A'')$. Stop if $A''^T v = \lambda_0 v$ for some $v \gg 0$. Otherwise, decompose $A''$ as above and add another prime. Continuing this way, you must eventually obtain the situation where $(A^{\prime\cdots\prime})^T v = \lambda_0 v$ for some $v \gg 0$. Indeed, this happens no later than when $A^{\prime\cdots\prime}$ is a $1 \times 1$ matrix. ■
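For example, the cyclic shift matrix
$$A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 1 & 0 & 0 \end{pmatrix}$$
satisfies $A^T v = \lambda_0 v$ with $\lambda_0 = 1$ and $v = (1,1,1)^T \gg 0$, so the first case of the proof applies. Its eigenvalues are $1, \omega, \omega^2$ where $\omega = e^{2\pi i/3}$, so there are eigenvalues other than $\lambda_0$ on the circle $|\lambda| = \lambda_0$, and each quotient $\lambda/\lambda_0$ is indeed a root of unity, exactly as the theorem asserts.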
11.5 Functions Of Matrices

The existence of the Jordan form also makes it possible to define various functions of matrices. Suppose
$$f(\lambda) = \sum_{n=0}^{\infty} a_n \lambda^n \tag{11.16}$$