10.1. REGULAR MARKOV MATRICES 265

Since N = 0, the above equation implies limn→∞ An exists and equals

\[
P \begin{pmatrix} I & & & \\ & 0 & & \\ & & \ddots & \\ & & & 0 \end{pmatrix} P^{-1} \quad \blacksquare
\]
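This convergence is easy to observe numerically. The following is a minimal sketch using NumPy and a hypothetical 3×3 Markov matrix (column-stochastic, as in this section: ∑i aij = 1), showing that high powers of A stabilize:

```python
import numpy as np

# A hypothetical 3x3 Markov matrix: every entry positive, each column sums to 1.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

# High powers of A stabilize, illustrating that lim A^n exists.
A50 = np.linalg.matrix_power(A, 50)
A51 = np.linalg.matrix_power(A, 51)
print(np.allclose(A50, A51))            # True: successive powers agree
print(np.allclose(A50.sum(axis=0), 1))  # True: the limit is again a Markov matrix
```

The agreement between consecutive powers reflects the spectral picture of the theorem: the eigenvalue 1 survives in the limit while the eigenvalues of modulus less than 1 die out.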

Are there examples which will cause the eigenvalue condition of this theorem to hold? The following lemma gives such a condition. It turns out that if aij > 0, not just aij ≥ 0, then the eigenvalue condition of the above theorem is valid.

Lemma 10.1.4 Suppose A = (aij) is a stochastic matrix. Then λ = 1 is an eigenvalue. If aij > 0 for all i, j, then if µ is an eigenvalue of A, either |µ| < 1 or µ = 1.

Proof: First consider the claim that 1 is an eigenvalue. By definition,

\[
\sum_i a_{ij} = 1
\]

and so ATv = v where v = ( 1 · · · 1 )T. Since A, AT have the same eigenvalues, this shows 1 is an eigenvalue. Suppose then that µ is an eigenvalue. Is |µ| < 1 or µ = 1? Let v be an eigenvector for AT corresponding to µ and let |vi| be the largest of the |vj|.

Then

\[
\mu v_i = \sum_j a_{ji} v_j
\]

and now multiply both sides by \(\overline{\mu v_i}\) to obtain

\[
|\mu|^2 |v_i|^2 = \sum_j a_{ji} v_j \overline{\mu v_i}
= \sum_j a_{ji} \operatorname{Re}\left( v_j \overline{\mu v_i} \right)
\le \sum_j a_{ji} \left| v_j \right| |\mu| \left| v_i \right|
\le \sum_j a_{ji} |v_i|^2 |\mu| = |\mu| |v_i|^2
\]

the last equality holding because \(\sum_j a_{ji} = 1\).

Therefore, |µ| ≤ 1. If |µ| = 1, then equality must hold in the above, and so \(v_j \overline{\mu v_i}\) must be real and nonnegative for each j. In particular, this holds for j = i, which shows \(|v_i|^2 \bar{\mu}\), and hence µ, is real and nonnegative. Thus, in this case, µ = 1 because µ̄ = µ is nonnegative and |µ| = 1. The only other case is where |µ| < 1. ■
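The conclusion of the lemma can be checked numerically. Here is a minimal sketch with a hypothetical strictly positive Markov matrix (columns summing to 1), verifying that 1 is an eigenvalue and every other eigenvalue has modulus below 1:

```python
import numpy as np

# A hypothetical Markov matrix with all entries strictly positive
# (each column sums to 1), as Lemma 10.1.4 requires.
A = np.array([[0.7, 0.1, 0.2],
              [0.2, 0.8, 0.3],
              [0.1, 0.1, 0.5]])
eigs = np.linalg.eigvals(A)

print(any(np.isclose(m, 1) for m in eigs))                # True: 1 is an eigenvalue
print(all(abs(m) < 1 or np.isclose(m, 1) for m in eigs))  # True: every other |mu| < 1
```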

Lemma 10.1.5 Let A be any Markov matrix and let v be a vector having all its components nonnegative with ∑i vi = c. Then if w = Av, it follows that wi ≥ 0 for all i and ∑i wi = c.

Proof: From the definition of w,

\[
w_i \equiv \sum_j a_{ij} v_j \ge 0.
\]

Also, since ∑i aij = 1,

\[
\sum_i w_i = \sum_i \sum_j a_{ij} v_j = \sum_j v_j \left( \sum_i a_{ij} \right) = \sum_j v_j = c. \quad \blacksquare
\]
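The content of this lemma, that multiplication by a Markov matrix preserves nonnegativity and the component sum, can be illustrated with a small sketch. The 2×2 matrix and the vector below are hypothetical examples, with the column-stochastic convention of this section:

```python
import numpy as np

# Sketch of Lemma 10.1.5: A is a hypothetical Markov matrix (columns sum
# to 1) and v a nonnegative vector. Then w = Av is nonnegative with the
# same component sum c.
A = np.array([[0.4, 0.5],
              [0.6, 0.5]])
v = np.array([0.3, 0.7])   # c = sum of components = 1.0
w = A @ v

print((w >= 0).all())                # True: w_i >= 0 for all i
print(np.isclose(w.sum(), v.sum()))  # True: the sum c is preserved
```

This is why Markov matrices map probability vectors to probability vectors, which is the fact the limit theorems of this section rely on.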

The following theorem about limits is now easy to obtain.
