10.1. REGULAR MARKOV MATRICES 265
Since $N = 0$, the above equation implies $\lim_{n\to\infty} A^n$ exists and equals
\[
P\begin{pmatrix} I & & & \\ & 0 & & \\ & & \ddots & \\ & & & 0 \end{pmatrix}P^{-1}. \qquad \blacksquare
\]
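As a numerical illustration (not part of the text), the convergence of $A^n$ can be checked with NumPy. The matrix below is a made-up column-stochastic matrix with all entries positive, so by the results of this section the eigenvalue condition holds and the limit exists:

```python
import numpy as np

# A made-up 3x3 column-stochastic matrix (each column sums to 1)
# with all entries strictly positive.
A = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

# Raise A to a large power to approximate lim_{n -> infinity} A^n.
An = np.linalg.matrix_power(A, 100)

# The limit is a rank-one matrix: every column is (approximately)
# the same steady-state vector, and the limit is fixed by A.
print(An)
```

Observing that every column of `An` agrees to machine precision, and that `A @ An` equals `An`, matches the claim that the powers of a regular Markov matrix converge.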
Are there examples which will cause the eigenvalue condition of this theorem to hold? The following lemma gives such a condition. It turns out that if $a_{ij} > 0$, not just $a_{ij} \geq 0$, then the eigenvalue condition of the above theorem is valid.
Lemma 10.1.4 Suppose $A = (a_{ij})$ is a stochastic matrix. Then $\lambda = 1$ is an eigenvalue. If $a_{ij} > 0$ for all $i, j$, then if $\mu$ is an eigenvalue of $A$, either $|\mu| < 1$ or $\mu = 1$.
Proof: First consider the claim that 1 is an eigenvalue. By definition,
\[
\sum_i a_{ij} = 1
\]
and so $A^T v = v$ where $v = \begin{pmatrix} 1 & \cdots & 1 \end{pmatrix}^T$. Since $A$ and $A^T$ have the same eigenvalues, this shows 1 is an eigenvalue. Suppose then that $\mu$ is an eigenvalue. Is $|\mu| < 1$ or $\mu = 1$? Let $v$ be an eigenvector for $A^T$ and let $|v_i|$ be the largest of the $|v_j|$. Then
\[
\mu v_i = \sum_j a_{ji} v_j
\]
and now multiply both sides by $\overline{\mu v_i}$ to obtain
\[
|\mu|^2 |v_i|^2 = \sum_j a_{ji} v_j \overline{\mu v_i} = \sum_j a_{ji} \operatorname{Re}\left(v_j \overline{\mu v_i}\right) \leq \sum_j a_{ji} |v_i|^2 |\mu| = |\mu| |v_i|^2.
\]
Therefore, $|\mu| \leq 1$. If $|\mu| = 1$, then equality must hold in the above, and so $v_j \overline{v_i \mu}$ must be real and nonnegative for each $j$. In particular, this holds for $j = i$, which shows $\overline{\mu}$, and hence $\mu$, is real and nonnegative. Thus, in this case, $\mu = 1$ because $\overline{\mu} = \mu$ is nonnegative and has absolute value equal to 1. The only other case is where $|\mu| < 1$. ■
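Lemma 10.1.4 can be checked numerically (a sketch with a made-up matrix, using this section's convention that columns sum to 1): for a strictly positive stochastic matrix, exactly one eigenvalue equals 1 and all others have modulus strictly less than 1.

```python
import numpy as np

# Made-up stochastic matrix with strictly positive entries;
# each column sums to 1, matching the convention sum_i a_ij = 1.
A = np.array([[0.7, 0.1, 0.2],
              [0.2, 0.8, 0.3],
              [0.1, 0.1, 0.5]])

# Moduli of the eigenvalues of A.
moduli = np.abs(np.linalg.eigvals(A))

# One eigenvalue has modulus 1 (it is mu = 1); the other two
# satisfy |mu| < 1, as the lemma asserts.
print(sorted(moduli))
```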
Lemma 10.1.5 Let $A$ be any Markov matrix and let $v$ be a vector having all its components nonnegative with $\sum_i v_i = c$. Then if $w = Av$, it follows that $w_i \geq 0$ for all $i$ and $\sum_i w_i = c$.
Proof: From the definition of $w$,
\[
w_i \equiv \sum_j a_{ij} v_j \geq 0.
\]
Also
\[
\sum_i w_i = \sum_i \sum_j a_{ij} v_j = \sum_j \sum_i a_{ij} v_j = \sum_j v_j = c. \qquad \blacksquare
\]
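Lemma 10.1.5 admits a similarly direct numerical check (a sketch; the matrix and vector below are made up): multiplying a nonnegative vector by a Markov matrix preserves nonnegativity and the component sum.

```python
import numpy as np

# Made-up column-stochastic (Markov) matrix: each column sums to 1.
A = np.array([[0.5, 0.25],
              [0.5, 0.75]])

# Nonnegative vector with component sum c = 1.0.
v = np.array([0.4, 0.6])

# w = Av has nonnegative entries and the same component sum as v.
w = A @ v
print(w, w.sum())
```

This is the computation in the proof: each $w_i = \sum_j a_{ij} v_j$ is a sum of nonnegative terms, and swapping the order of summation shows $\sum_i w_i = \sum_j v_j$.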
The following theorem about limits is now easy to obtain.