12.1 Regular Markov Matrices
The existence of the Jordan form is the basis for the proof of limit theorems for certain kinds of matrices
called Markov matrices.
Definition 12.1.1 An n × n matrix A = (a_{ij}) is a Markov matrix if a_{ij} ≥ 0 for all i, j and

∑_i a_{ij} = 1 for each j.

It may also be called a stochastic matrix or a transition matrix. A Markov or stochastic matrix is called
regular if some power of A has all entries strictly positive. A vector v ∈ ℝ^n is a steady state if
Av = v.
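These definitions are easy to check numerically. The following sketch in plain Python (the helper names `is_markov` and `is_steady_state` are illustrative, not from the text) verifies the nonnegativity and column-sum conditions and tests a candidate steady state:

```python
def is_markov(A, tol=1e-12):
    """Check that a_ij >= 0 and that each column of A sums to 1."""
    n = len(A)
    nonneg = all(A[i][j] >= 0 for i in range(n) for j in range(n))
    cols = all(abs(sum(A[i][j] for i in range(n)) - 1.0) < tol
               for j in range(n))
    return nonneg and cols

def is_steady_state(A, v, tol=1e-12):
    """Check that Av = v componentwise."""
    n = len(A)
    return all(abs(sum(A[i][j] * v[j] for j in range(n)) - v[i]) < tol
               for i in range(n))

A = [[0.9, 0.2],
     [0.1, 0.8]]   # columns sum to 1, all entries nonnegative
v = [2/3, 1/3]     # candidate steady state for this A
print(is_markov(A), is_steady_state(A, v))
```

The column-sum convention ∑_i a_{ij} = 1 is the one assumed throughout this section.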
Lemma 12.1.2 The property of being a stochastic matrix is preserved by taking products. The same is
also true if the sum condition is of the form ∑_j a_{ij} = 1.
Proof: Suppose the sum over each column equals 1 for both A and B. Letting the entries be denoted by
(a_{ij}) and (b_{ij}) respectively, the entries of AB are given by

(AB)_{ij} = ∑_k a_{ik} b_{kj}.

It is obvious that when the product is taken, if each a_{ij}, b_{ij} ≥ 0, then the same will be true of sums
of products of these numbers. Also,

∑_i (AB)_{ij} = ∑_i ∑_k a_{ik} b_{kj} = ∑_k (∑_i a_{ik}) b_{kj} = ∑_k b_{kj} = 1.

Similar reasoning works for the assumption that ∑_j a_{ij} = 1. ■
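Lemma 12.1.2 can be watched in action with a short sketch (pure Python; `mat_mul` and `column_sums` are hypothetical helper names): multiply two column-stochastic matrices and check that the product is again column-stochastic.

```python
def mat_mul(A, B):
    """(AB)_ij = sum_k a_ik b_kj."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def column_sums(M):
    """Sum of each column, ∑_i m_ij."""
    n = len(M)
    return [sum(M[i][j] for i in range(n)) for j in range(n)]

A = [[0.5, 0.3], [0.5, 0.7]]   # column sums 1
B = [[0.9, 0.4], [0.1, 0.6]]   # column sums 1
C = mat_mul(A, B)
print(column_sums(C))          # each column of AB still sums to 1
```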
The following theorem is convenient for showing the existence of limits.
Theorem 12.1.3 Let A be a real p × p matrix having the properties

1. a_{ij} ≥ 0,
2. either ∑_{i=1}^p a_{ij} = 1 for each j, or ∑_{j=1}^p a_{ij} = 1 for each i,
3. the distinct eigenvalues of A are {1, λ_2, …, λ_m}, where |λ_j| < 1 for every j ≥ 2.

Then lim_{n→∞} A^n = A^∞ exists in the sense that lim_{n→∞} a_{ij}^n = a_{ij}^∞, the ij^th entry of A^∞. Here a_{ij}^n
denotes the ij^th entry of A^n. Also, if λ = 1 has algebraic multiplicity r, then the Jordan block
corresponding to λ = 1 is just the r × r identity.
Proof: By the existence of the Jordan form for A, it follows that there exists an invertible matrix P such that

$$P^{-1} A P = J = \begin{pmatrix} I + N & 0 \\ 0 & J' \end{pmatrix},$$

where I is r × r for r the multiplicity of the eigenvalue 1, N is a nilpotent matrix for which N^r = 0, and
J' is the part of the Jordan form corresponding to the eigenvalues λ_2, …, λ_m. I
will show that because of Condition 2, N = 0.

First of all, J' is block diagonal with blocks of the form λ_i I + N_i,
where N_i satisfies N_i^{r_i} = 0 for some r_i > 0. It is clear that

$$\left(\lambda_i I + N_i\right)^n = \sum_{k=0}^{r_i - 1} \binom{n}{k} \lambda_i^{\,n-k} N_i^{\,k},$$

which converges to 0 due to the assumption that |λ_i| < 1. There are finitely many terms, and a typical one
is a matrix whose entries are no larger than an expression of the form

$$C \binom{n}{k} |\lambda_i|^{\,n-k},$$

which converges to 0 because, by the root test, the series $\sum_n \binom{n}{k} |\lambda_i|^{n-k}$
converges. Thus for each i, (λ_i I + N_i)^n → 0, and so (J')^n → 0.
By Condition 2, if a_{ij}^n denotes the ij^th entry of A^n, then either

∑_i a_{ij}^n = 1 or ∑_j a_{ij}^n = 1.

This follows from Lemma 12.1.2. It is obvious each a_{ij}^n ≥ 0, and so the entries of A^n must be bounded
independent of n.

It follows easily from A^n = P J^n P^{-1} that

J^n = P^{-1} A^n P.

Hence J^n must also have bounded entries as n → ∞. However, this requirement is incompatible with an
assumption that N ≠ 0.
If N ≠ 0, then N^s ≠ 0 but N^{s+1} = 0 for some 1 ≤ s ≤ r. Then

$$(I + N)^n = \sum_{k=0}^{s} \binom{n}{k} N^k.$$

One of the entries of N^s is nonzero by the definition of s. Let this entry be n_{ij}^s. Then this implies that one
of the entries of (I + N)^n is of the form $\binom{n}{s} n_{ij}^s$. This entry dominates the ij^th entries of
$\binom{n}{k} N^k$ for all k < s, since $\binom{n}{s}$ grows like n^s, faster than $\binom{n}{k}$ for any
k < s. Therefore, the entries of (I + N)^n cannot all be bounded. From block multiplication,

$$P^{-1} A^n P = \begin{pmatrix} (I + N)^n & 0 \\ 0 & (J')^n \end{pmatrix},$$

and this is a contradiction because the entries are bounded on the left and unbounded on the right.
Therefore N = 0, and since (J')^n → 0, the above equation implies lim_{n→∞} A^n exists and equals

$$P \begin{pmatrix} I & 0 \\ 0 & 0 \end{pmatrix} P^{-1}. \;\blacksquare$$
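The stabilization of the powers A^n asserted by this theorem is easy to observe numerically. A sketch in plain Python (the 2 × 2 matrix is an assumed example whose eigenvalues are 1 and 0.7):

```python
def mat_mul(A, B):
    """(AB)_ij = sum_k a_ik b_kj."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, m):
    """Compute A^m by repeated multiplication (fine for a demonstration)."""
    n = len(A)
    P = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(m):
        P = mat_mul(P, A)
    return P

A = [[0.9, 0.2],
     [0.1, 0.8]]   # Markov matrix; eigenvalues 1 and 0.7
P100, P101 = mat_pow(A, 100), mat_pow(A, 101)
# A^100 and A^101 agree to machine precision, consistent with the
# existence of the limit A_infinity.
print(max(abs(P100[i][j] - P101[i][j])
          for i in range(2) for j in range(2)) < 1e-12)
```

Since the second eigenvalue is 0.7, the error in A^n decays like 0.7^n, which is below machine precision well before n = 100.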
Are there examples which will cause the eigenvalue condition of this theorem to hold? The following
lemma gives such a condition. It turns out that if a_{ij} > 0, not just a_{ij} ≥ 0, then the eigenvalue condition of the
above theorem is valid.
Lemma 12.1.4 Suppose A = (a_{ij}) is a stochastic matrix. Then λ = 1 is an eigenvalue. If a_{ij} > 0
for all i, j, then for any eigenvalue μ of A, either |μ| < 1 or μ = 1.
Proof: First consider the claim that 1 is an eigenvalue. By definition, ∑_i a_{ij} = 1,
and so A^T v = v where v = (1, …, 1)^T. Since A and A^T have the same eigenvalues, this shows 1 is an
eigenvalue of A. Suppose then that μ is an eigenvalue. Is |μ| < 1 or μ = 1? Let v
be an eigenvector for A^T corresponding to μ and let |v_i| be the largest of the |v_j|. Then

μ v_i = ∑_j a_{ji} v_j,

and so

|μ| |v_i| ≤ ∑_j a_{ji} |v_j| ≤ ∑_j a_{ji} |v_i| = |v_i|,

which shows |μ| ≤ 1. Now multiply both sides by $\bar{\mu}\,\bar{v}_i$ to obtain

$$|\mu|^2 |v_i|^2 = \sum_j a_{ji}\, v_j \bar{v}_i \bar{\mu}.$$

If |μ| = 1, then equality must hold in the above inequality, and so $v_j \bar{v}_i \bar{\mu}$ must be real and
nonnegative for each j. In particular, this holds for j = i, which shows $\bar{\mu}$, and hence μ,
is real and nonnegative. Thus, in this case, μ = 1 because μ is nonnegative and |μ| = 1. The only other case is |μ| < 1. ■
The next lemma is sort of a conservation result. It says the sign and sum of entries of a vector are
preserved when multiplying by a Markov matrix.
Lemma 12.1.5 Let A be any Markov matrix and let v be a vector having all its components nonnegative
with ∑_i v_i = c. Then if w = Av, it follows that w_i ≥ 0 for all i and ∑_i w_i = c.

Proof: From the definition of w, w_i = ∑_j a_{ij} v_j ≥ 0. Also,

∑_i w_i = ∑_i ∑_j a_{ij} v_j = ∑_j (∑_i a_{ij}) v_j = ∑_j v_j = c. ■
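A quick numerical check of this conservation property (plain Python; the 3 × 3 matrix and starting vector are assumed examples):

```python
def apply(A, v):
    """Compute w = Av."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.5, 0.1, 0.3],
     [0.2, 0.6, 0.3],
     [0.3, 0.3, 0.4]]   # each column sums to 1
v = [0.2, 0.5, 0.3]     # nonnegative components summing to c = 1
w = apply(A, v)
# Sign and sum are preserved, as the lemma asserts.
print(min(w) >= 0, abs(sum(w) - 1.0) < 1e-12)
```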
The following theorem about limits is now easy to obtain.
Theorem 12.1.6 Suppose A is a Markov matrix in which a_{ij} > 0 for all i, j and suppose w is a vector.
Then for each i,

lim_{k→∞} (A^k w)_i = v_i,

where Av = v. In words, A^k w always converges to a steady state. In addition to this, if the vector w
satisfies w_i ≥ 0 for all i and ∑_i w_i = c, then the vector v will also satisfy the conditions v_i ≥ 0 and
∑_i v_i = c.
Proof: By Lemma 12.1.4, since each a_{ij} > 0, the eigenvalues of A are either 1 or have absolute
value less than 1. Therefore, the claimed limit exists by Theorem 12.1.3. The assertion that the
components are nonnegative and sum to c follows from Lemma 12.1.5. That Av = v follows from
passing to the limit in A^{k+1} w = A(A^k w): the left side converges to v while the right side converges to Av. ■
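The theorem is easy to watch in action: repeatedly applying a positive Markov matrix to a starting vector drives it to the steady state while preserving the sum. A sketch (pure Python; the matrix and iteration count are assumptions):

```python
def apply(A, v):
    """Compute w = Av."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.9, 0.2],
     [0.1, 0.8]]    # all entries positive, columns sum to 1
w = [1.0, 0.0]      # nonnegative components summing to c = 1
for _ in range(200):
    w = apply(A, w)
# w is now (numerically) a steady state with the same sum as the start.
residual = max(abs(x - y) for x, y in zip(apply(A, w), w))
print(residual < 1e-12, abs(sum(w) - 1.0) < 1e-9)
```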
It is not hard to generalize the conclusion of this theorem to regular Markov matrices, those
having some power with all positive entries.
Corollary 12.1.7 Suppose A is a regular Markov matrix, one for which the entries of A^k are all positive
for some k, and suppose w is a vector. Then for each i,

lim_{n→∞} (A^n w)_i = v_i,

where Av = v. In words, A^n w always converges to a steady state. In addition to this, if the vector w
satisfies w_i ≥ 0 for all i and ∑_i w_i = c, then the vector v will also satisfy the conditions v_i ≥ 0 and
∑_i v_i = c.
Proof: Let the entries of A^k be all positive for some k. Suppose B = (b_{ij}) is a Markov matrix with
b_{ij} > 0 for all i, j and A = (a_{ij}) is any Markov matrix. Then BA is a Markov matrix which has strictly
positive entries, because the ij^th entry of BA is

∑_l b_{il} a_{lj} > 0,

since every b_{il} > 0 and, each column of A summing to 1, at least one a_{lj} > 0. In particular,
A^{k+1} = A^k A has all positive entries.

Thus, from Lemma 12.1.4, the eigenvalues of A^k are either 1 or have absolute value less than 1. The same
must be true of A^{k+1}, because A^{k+1} has all positive entries thanks to the above observation. If μ is an
eigenvalue of A, then μ^k is an eigenvalue of A^k, so either μ^k = 1 or |μ^k| < 1, and likewise either
μ^{k+1} = 1 or |μ^{k+1}| < 1. If μ^k = 1, then |μ| = 1, so |μ^{k+1}| = 1, forcing μ^{k+1} = 1; hence
μ = μ^{k+1}/μ^k = 1. Otherwise |μ^k| < 1, and so |μ| < 1. Thus the eigenvalues of A are either 1 or have
absolute value less than 1. By Theorem 12.1.3, lim_{n→∞} A^n w exists. The rest follows as in Theorem 12.1.6. ■
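As an illustration of the corollary, here is a sketch (pure Python; the matrix is an assumed example) of a Markov matrix with a zero entry, so Lemma 12.1.4 does not apply to A itself, yet A² has all positive entries and A^n w still converges to a steady state:

```python
def mat_mul(A, B):
    """(AB)_ij = sum_k a_ik b_kj."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def apply(A, v):
    """Compute w = Av."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.0, 0.5],
     [1.0, 0.5]]    # Markov, but a_11 = 0, so A is not strictly positive
A2 = mat_mul(A, A)  # A^2 = [[0.5, 0.25], [0.5, 0.75]]: all entries positive
print(all(A2[i][j] > 0 for i in range(2) for j in range(2)))  # A is regular

w = [1.0, 0.0]      # nonnegative, sums to c = 1
for _ in range(500):
    w = apply(A, w)
print(abs(sum(w) - 1.0) < 1e-9)  # sum preserved at the steady state
```

For this A the steady state with sum 1 is (1/3, 2/3), and the iteration above converges to it even though convergence is oscillatory (the second eigenvalue is −0.5).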