
are each eigenvalues of $\left(\frac{A}{\lambda_0}\right)$, which has only finitely many eigenvalues, and hence this sequence must repeat. Therefore, $\left(\frac{\lambda}{\lambda_0}\right)$ is a root of unity as claimed. This proves the theorem in the case that $v \gg 0$.
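Before stating the remaining case, here is a quick numerical illustration of the conclusion just reached (a sketch only; the cyclic matrix below is my own example, not part of the proof). Its Perron root is $\lambda_0 = 1$ and every eigenvalue on the circle $|\lambda| = \lambda_0$ is $\lambda_0$ times a cube root of unity.

```python
import numpy as np

# Illustrative nonnegative matrix that cyclically permutes coordinates; its
# Perron root is lambda_0 = 1 and its spectral-circle eigenvalues are the
# cube roots of unity, i.e. lambda_0 times roots of unity.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

eigvals = np.linalg.eigvals(A)
lam0 = max(abs(eigvals))                         # Perron root lambda_0
for lam in eigvals:
    if np.isclose(abs(lam), lam0):               # eigenvalue on the spectral circle
        ratio = lam / lam0
        print(ratio, np.isclose(ratio**3, 1.0))  # (lambda/lambda_0)^3 = 1
```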

Now it is necessary to consider the case where $v > 0$ but it is not the case that $v \gg 0$.

Then in this case, there exists a permutation matrix P such that

$$
Pv = \begin{pmatrix} v_1 \\ \vdots \\ v_r \\ 0 \\ \vdots \\ 0 \end{pmatrix} \equiv \begin{pmatrix} u \\ 0 \end{pmatrix} \equiv v_1
$$

Then $\lambda_0 v = A^T v = A^T P v_1$. Therefore, $\lambda_0 v_1 = P A^T P v_1 = G v_1$. Now $P^2 = I$ because $P$ may be chosen to be a product of disjoint transpositions. Therefore, the matrices $G \equiv P A^T P$ and $A$ are similar. Consequently, they have the same eigenvalues and it suffices from now on to consider the matrix $G$ rather than $A$.
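To make the similarity step concrete, the following sketch (with an arbitrary $4 \times 4$ matrix and a single-transposition $P$, both chosen only for illustration) checks that $P^2 = I$ and that $G = P A^T P$ has the same eigenvalues as $A$.

```python
import numpy as np

# P swaps coordinates 1 and 3 (a single transposition), so P is its own
# inverse: P^2 = I.  Any 4x4 matrix serves for the similarity check.
P = np.eye(4)[[0, 3, 2, 1]]
assert np.allclose(P @ P, np.eye(4))

rng = np.random.default_rng(0)
A = rng.random((4, 4))

G = P @ A.T @ P                                # G = P A^T P, similar to A^T
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_G = np.sort_complex(np.linalg.eigvals(G))
print(np.allclose(eig_A, eig_G))               # same eigenvalues as A
```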

Then
$$
\lambda_0 \begin{pmatrix} u \\ 0 \end{pmatrix} = \begin{pmatrix} M_1 & M_2 \\ M_3 & M_4 \end{pmatrix} \begin{pmatrix} u \\ 0 \end{pmatrix}
$$
where $M_1$ is $r \times r$ and $M_4$ is $(n-r) \times (n-r)$. Block multiplication gives $M_3 u = 0$, and since $u \gg 0$ while the entries of $M_3$ are nonnegative (because $A$, and hence $G$, is $> 0$), it follows that $M_3 = 0$. Thus

$$
G = \begin{pmatrix} A' & B \\ 0 & C \end{pmatrix}.
$$
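The zero block is what drives the next step, via Lemma 11.4.8: the spectrum of a block upper-triangular matrix is the union of the spectra of its diagonal blocks. A small numerical confirmation (the blocks here are illustrative choices, not taken from the proof) is:

```python
import numpy as np

# Illustrative blocks: A1 plays the role of A', C the lower diagonal block.
A1 = np.array([[2.0, 1.0],
               [1.0, 2.0]])
C = np.array([[1.0, 0.5],
              [0.0, 0.5]])
B = np.ones((2, 2))

G = np.block([[A1, B],
              [np.zeros((2, 2)), C]])          # block upper-triangular

eig_G = np.sort_complex(np.linalg.eigvals(G))
eig_blocks = np.sort_complex(np.concatenate(
    [np.linalg.eigvals(A1), np.linalg.eigvals(C)]))
print(np.allclose(eig_G, eig_blocks))          # sigma(G) = sigma(A') union sigma(C)
```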

Now let $\lambda$ be an eigenvalue of $G$ such that $|\lambda| = \lambda_0$. Then from Lemma 11.4.8, either $\lambda \in \sigma(A')$ or $\lambda \in \sigma(C)$. Suppose without loss of generality that $\lambda \in \sigma(A')$. Since $A' > 0$, it has a largest positive eigenvalue $\lambda_0'$ which is obtained from 11.11. Thus $\lambda_0' \leq \lambda_0$ (since $\sigma(A') \subseteq \sigma(G) = \sigma(A)$), but $\lambda$, being an eigenvalue of $A'$, has its absolute value bounded by $\lambda_0'$, and so $\lambda_0 = |\lambda| \leq \lambda_0' \leq \lambda_0$, showing that $\lambda_0 \in \sigma(A')$. Now if there exists $v \gg 0$ such that $A'^T v = \lambda_0 v$, then the first part of this proof applies to the matrix $A'$ and so $(\lambda / \lambda_0)$ is a root of unity. If such a vector $v$ does not exist, then let $A'$ play the role of $A$ in the above argument and reduce to the consideration of

$$
G' \equiv \begin{pmatrix} A'' & B' \\ 0 & C' \end{pmatrix}
$$
where $G'$ is similar to $A'$ and $\lambda, \lambda_0 \in \sigma(A'')$. Stop if $A''^T v = \lambda_0 v$ for some $v \gg 0$. Otherwise, decompose $A''$ similarly to the above and add another prime. Continuing this way, you must eventually obtain the situation where $(A^{\prime \cdots \prime})^T v = \lambda_0 v$ for some $v \gg 0$. Indeed, this happens no later than when $A^{\prime \cdots \prime}$ is a $1 \times 1$ matrix. ■
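Read algorithmically, the reduction in this proof repeatedly passes to the principal submatrix picked out by the support of a nonnegative eigenvector of the transpose, until that eigenvector is strictly positive. The sketch below is my own illustrative rendering of that loop, under the simplifying assumption that the eigenvalue of largest modulus returned by the solver is the Perron root.

```python
import numpy as np

def reduce_until_positive(A, tol=1e-10):
    """Rough sketch of the proof's reduction: while the nonnegative eigenvector
    of M^T for the eigenvalue of largest modulus is not strictly positive,
    replace M by the principal submatrix of M^T indexed by the eigenvector's
    support (the block A' in the proof).  Stops at worst at a 1x1 block."""
    M = np.asarray(A, dtype=float)
    while True:
        w, V = np.linalg.eig(M.T)
        k = int(np.argmax(np.abs(w)))          # assumed to be the Perron root
        v = np.real(V[:, k])
        v = v if v.sum() >= 0 else -v          # fix the sign so that v >= 0
        support = np.where(v > tol)[0]
        if len(support) == len(v):             # v >> 0: the first part applies
            return M, float(np.real(w[k]))
        M = M.T[np.ix_(support, support)]      # pass to A' and repeat

# Example: for this A the eigenvector of A^T for lambda_0 = 3 is (0, 0, 1),
# so one reduction to a 1x1 block is needed.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])
print(reduce_until_positive(A))                # (array([[3.]]), 3.0)
```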

11.5 Functions Of Matrices

The existence of the Jordan form also makes it possible to define various functions of matrices. Suppose

$$
f(\lambda) = \sum_{n=0}^{\infty} a_n \lambda^n \tag{11.16}
$$
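As an aside illustrating where this definition is headed (a sketch under assumptions of my own: $a_n = 1/n!$, so that $f(A) = e^A$, with a $2 \times 2$ rotation generator as the test matrix), the truncated series can be compared against the closed form $e^A = \cos(1) I + \sin(1) A$, valid here because $A^2 = -I$.

```python
import numpy as np

# Taking a_n = 1/n! in 11.16 gives f(A) = exp(A); truncate the series.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

f_A = np.zeros((2, 2))
term = np.eye(2)                               # the n = 0 term, A^0 / 0!
for n in range(1, 30):
    f_A = f_A + term
    term = term @ A / n                        # next term, A^n / n!

# Since A^2 = -I, exp(A) = cos(1) I + sin(1) A; the truncated series matches it.
exact = np.cos(1.0) * np.eye(2) + np.sin(1.0) * A
print(np.allclose(f_A, exact))
```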
