The existence of the Jordan form is the basis for the proof of limit theorems for certain kinds of matrices called Markov matrices.
Definition 10.1.1 An n × n matrix A = ( a_{ij} ) is called a Markov matrix if each a_{ij} ≥ 0 and, for each j,

∑ _{i}a_{ij} = 1.

It may also be called a stochastic matrix or a transition matrix. A Markov or stochastic matrix is called regular if some power of A has all entries strictly positive. A vector v ∈ ℝ^{n} is a steady state if Av = v.
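These definitions are easy to test numerically. The following Python sketch (the helper names and the example matrix are illustrative, not from the text) checks whether a matrix is column-stochastic and whether it is regular:

```python
def is_markov(A, tol=1e-12):
    """Check that a_ij >= 0 and that each column of A sums to 1."""
    n = len(A)
    if any(A[i][j] < 0 for i in range(n) for j in range(n)):
        return False
    return all(abs(sum(A[i][j] for i in range(n)) - 1.0) < tol
               for j in range(n))

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_regular(A, max_power=50):
    """Regular: some power of A has all entries strictly positive."""
    P = A
    for _ in range(max_power):
        if all(x > 0 for row in P for x in row):
            return True
        P = matmul(P, A)
    return False

A = [[0.0, 0.5],
     [1.0, 0.5]]        # columns sum to 1; one entry is 0, but A^2 > 0
print(is_markov(A))     # True
print(is_regular(A))    # True
```

Note that this A is regular without being strictly positive: the zero entry disappears in A^{2}.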
Lemma 10.1.2 The property of being a stochastic matrix is preserved by taking products. The same holds when the defining sums are taken over rows instead, ∑ _{j}a_{ij} = 1.
Proof: Suppose first that ∑ _{i}a_{ij} = 1 for each j (each column sums to 1) for both A and B. Then, letting the entries be denoted by a_{ij} and b_{ij}, the ij^{th} entry of AB is ∑ _{k}a_{ik}b_{kj}, and

∑ _{i}∑ _{k}a_{ik}b_{kj} = ∑ _{k}b_{kj}∑ _{i}a_{ik} = ∑ _{k}b_{kj} = 1.

It is obvious that if each a_{ij}, b_{ij} ≥ 0, then the entries of AB, being sums of products of nonnegative numbers, are also nonnegative. Similar reasoning works for the assumption that ∑ _{j}a_{ij} = 1. ■
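Lemma 10.1.2 can also be checked numerically. A minimal sketch, with hypothetical example matrices:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def col_sums(A):
    """Return the n column sums of an n x n matrix."""
    n = len(A)
    return [sum(A[i][j] for i in range(n)) for j in range(n)]

A = [[0.3, 0.6],
     [0.7, 0.4]]
B = [[0.9, 0.2],
     [0.1, 0.8]]
C = matmul(A, B)
print(col_sums(C))   # each column of AB still sums to 1
```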
The following theorem is convenient for showing the existence of limits.
Theorem 10.1.3 Let A be a real p × p matrix having the properties

1. a_{ij} ≥ 0,

2. either ∑ _{i=1}^{p}a_{ij} = 1 for each j, or ∑ _{j=1}^{p}a_{ij} = 1 for each i,

3. the eigenvalues of A are either equal to 1 or have absolute value strictly less than 1.

Then lim_{n→∞}A^{n} = A_{∞} exists in the sense that lim_{n→∞}a_{ij}^{n} = a_{ij}^{∞}, the ij^{th} entry of A_{∞}. Here a_{ij}^{n} denotes the ij^{th} entry of A^{n}. Also, if λ = 1 has algebraic multiplicity r, then the Jordan block corresponding to λ = 1 is just the r × r identity.
Proof. By the existence of the Jordan form for A, it follows that there exists an invertible matrix P such that

P^{−1}AP = diag( I + N, J(λ_{2}), …, J(λ_{m}) ) ≡ J

where I is r × r for r the multiplicity of the eigenvalue 1, N is a nilpotent matrix for which N^{r} = 0, and the J(λ_{i}) are the Jordan blocks corresponding to the eigenvalues λ_{i} with |λ_{i}| < 1. I will show that because of Condition 2, N = 0.
First of all,

J(λ_{i}) = λ_{i}I + N_{i}

where N_{i} satisfies N_{i}^{r_{i}} = 0 for some r_{i} > 0. It is clear that N_{i} commutes with λ_{i}I, and so by the binomial theorem,

J(λ_{i})^{n} = ∑ _{k=0}^{n}\binom{n}{k}λ_{i}^{n−k}N_{i}^{k} = ∑ _{k=0}^{r_{i}}\binom{n}{k}λ_{i}^{n−k}N_{i}^{k}

which converges to 0 due to the assumption that |λ_{i}| < 1. Indeed, there are finitely many terms, and each entry of a typical term is bounded by a constant multiple of

\binom{n}{k}|λ_{i}|^{n−k}

which converges to 0 because, by the root test, the series ∑ _{n=1}^{∞}\binom{n}{k}|λ_{i}|^{n−k} converges. Thus lim_{n→∞}J(λ_{i})^{n} = 0 for each i = 2,…,m.
By Condition 2, if a_{ij}^{n} denotes the ij^{th} entry of A^{n}, then either

∑ _{i}a_{ij}^{n} = 1 or ∑ _{j}a_{ij}^{n} = 1.

This follows from Lemma 10.1.2. It is obvious that each a_{ij}^{n} ≥ 0, and so the entries of A^{n} must be bounded independently of n.
It follows easily from P^{−1}AP = J that

P^{−1}A^{n}P = J^{n}.  (10.1)

Hence J^{n} must also have bounded entries as n → ∞. However, this is incompatible with the assumption that N ≠ 0.
If N ≠ 0, then N^{s} ≠ 0 but N^{s+1} = 0 for some 1 ≤ s ≤ r. Then

( I + N )^{n} = ∑ _{k=0}^{s}\binom{n}{k}N^{k}.

One of the entries of N^{s} is nonzero by the definition of s. Let this entry be n_{ij}^{s}. Then the ij^{th} entry of ( I + N )^{n} contains the term \binom{n}{s}n_{ij}^{s}, which dominates the corresponding terms \binom{n}{k}( N^{k} )_{ij} for k < s because lim_{n→∞}\binom{n}{s}∕\binom{n}{k} = ∞. Therefore, the entries of ( I + N )^{n} cannot all remain bounded. From block multiplication,

P^{−1}A^{n}P = diag( ( I + N )^{n}, J(λ_{2})^{n}, …, J(λ_{m})^{n} )

and this is a contradiction because entries are bounded on the left and unbounded on the right.
Since N = 0, equation 10.1 together with lim_{n→∞}J(λ_{i})^{n} = 0 implies that lim_{n→∞}A^{n} exists and equals

A_{∞} = P diag( I, 0, …, 0 ) P^{−1}.

In particular, the Jordan block corresponding to λ = 1 is I + N = I, the r × r identity. ■
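The conclusion of Theorem 10.1.3 can be observed directly. For the hypothetical column-stochastic matrix below, the eigenvalues are 1 and 0.5, so the theorem applies and the powers A^{n} converge entrywise:

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0.8, 0.3],
     [0.2, 0.7]]        # eigenvalues 1 and 0.5: Theorem 10.1.3 applies
P = A
for _ in range(60):     # compute a high power of A
    P = matmul(P, A)
print([[round(x, 6) for x in row] for row in P])
# -> [[0.6, 0.6], [0.4, 0.4]]: both columns equal the steady state
```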
Are there examples which will cause the eigenvalue condition of this theorem to hold? The following lemma gives such a condition. It turns out that if a_{ij} > 0, not just ≥ 0, then the eigenvalue condition of the above theorem is valid.
Lemma 10.1.4 Suppose A = ( a_{ij} ) is a Markov matrix in which a_{ij} > 0 for all i, j. Then 1 is an eigenvalue of A, and if μ is an eigenvalue with μ ≠ 1, then |μ| < 1.

Proof: First consider the claim that 1 is an eigenvalue. By definition,

∑ _{i}a_{ij} = 1

and so A^{T}v = v where v = ( 1, …, 1 )^{T}. Since A and A^{T} have the same eigenvalues, 1 is an eigenvalue of A. For the second claim it also suffices to consider A^{T}, whose rows sum to 1; write b_{ij} = a_{ji}, so ∑ _{j}b_{ij} = 1 and each b_{ij} > 0. Suppose μ is an eigenvalue with eigenvector v ≠ 0, and let |v_{i}| be the largest of the |v_{j}|. Then

μv_{i} = ∑ _{j}b_{ij}v_{j}

and now multiply both sides by \overline{μv_{i}} to obtain

|μ|^{2}|v_{i}|^{2} = ∑ _{j}b_{ij}v_{j}\overline{v_{i}}\overline{μ} = ∑ _{j}b_{ij} Re( v_{j}\overline{v_{i}}\overline{μ} ) ≤ ∑ _{j}b_{ij}|μ||v_{i}|^{2} = |μ||v_{i}|^{2}.

Therefore, |μ| ≤ 1. If |μ| = 1, equality must hold throughout, and since every b_{ij} > 0, each term satisfies Re( v_{j}\overline{v_{i}}\overline{μ} ) = |μ||v_{i}|^{2}. Taking j = i shows that |v_{i}|^{2}\overline{μ} is real and positive, so μ = 1. ■
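For a 2 × 2 column-stochastic matrix [[a, b], [1 − a, 1 − b]], the characteristic polynomial factors by hand as (λ − 1)(λ − (a − b)), which makes the conclusion of Lemma 10.1.4 explicit: when all four entries are positive, the second eigenvalue a − b satisfies |a − b| < 1. A sketch (the function name and the sample values are hypothetical):

```python
def markov_2x2_eigs(a, b):
    """Eigenvalues of the column-stochastic matrix [[a, b], [1-a, 1-b]].

    trace = a + (1 - b) and det = a(1 - b) - b(1 - a) = a - b, so the
    characteristic polynomial factors as (lam - 1)(lam - (a - b)).
    """
    return 1.0, a - b

lam1, lam2 = markov_2x2_eigs(0.8, 0.3)
print(lam1, lam2)   # second eigenvalue is a - b = 0.5, with |lam2| < 1
```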
Lemma 10.1.5 Let A be any Markov matrix and let v be a vector having all its components nonnegative with ∑ _{i}v_{i} = c. Then if w = Av, it follows that w_{i} ≥ 0 for all i and ∑ _{i}w_{i} = c.
Proof: From the definition of w,

w_{i} = ∑ _{j}a_{ij}v_{j} ≥ 0.

Also,

∑ _{i}w_{i} = ∑ _{i}∑ _{j}a_{ij}v_{j} = ∑ _{j}v_{j}∑ _{i}a_{ij} = ∑ _{j}v_{j} = c. ■
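A quick numerical check of Lemma 10.1.5, using a hypothetical 3 × 3 Markov matrix and a vector whose components sum to c = 6:

```python
def matvec(A, v):
    """Apply a square matrix (list of rows) to a vector."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.5, 0.1, 0.4],
     [0.3, 0.7, 0.2],
     [0.2, 0.2, 0.4]]      # each column sums to 1
v = [2.0, 1.0, 3.0]        # nonnegative, components sum to c = 6
w = matvec(A, v)
print(all(x >= 0 for x in w), sum(w))   # still nonnegative, still sums to 6
```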
The following theorem about limits is now easy to obtain.
Theorem 10.1.6 Suppose A is a Markov matrix in which a_{ij} > 0 for all i, j and suppose w is a vector. Then for each i,

lim_{k→∞}( A^{k}w )_{i} = v_{i}

where Av = v. In words, A^{k}w always converges to a steady state. In addition, if the vector w satisfies w_{i} ≥ 0 for all i and ∑ _{i}w_{i} = c, then the vector v will also satisfy the conditions v_{i} ≥ 0 and ∑ _{i}v_{i} = c.
Proof: By Lemma 10.1.4, since each a_{ij} > 0, the eigenvalues are either 1 or have absolute value less than 1. Therefore, the claimed limit exists by Theorem 10.1.3. The assertion that the components are nonnegative and sum to c follows from Lemma 10.1.5. That Av = v follows from

Av = A lim_{k→∞}A^{k}w = lim_{k→∞}A^{k+1}w = lim_{k→∞}A^{k}w = v. ■
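The theorem suggests the obvious iteration: apply A repeatedly to w and watch the iterates settle. A sketch with a hypothetical positive Markov matrix whose steady state (with component sum 1) is (0.6, 0.4):

```python
def matvec(A, v):
    """Apply a square matrix (list of rows) to a vector."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.8, 0.3],
     [0.2, 0.7]]           # strictly positive Markov matrix
w = [1.0, 0.0]             # nonnegative, components sum to c = 1
for _ in range(100):       # iterate w -> Aw
    w = matvec(A, w)
print([round(x, 6) for x in w])   # -> [0.6, 0.4], the steady state
```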
It is not hard to generalize the conclusion of this theorem to regular Markov processes.
Corollary 10.1.7 Suppose A is a regular Markov matrix, one for which the entries of A^{k} are all positive for some k, and suppose w is a vector. Then for each i,

lim_{n→∞}( A^{n}w )_{i} = v_{i}

where Av = v. In words, A^{n}w always converges to a steady state. In addition, if the vector w satisfies w_{i} ≥ 0 for all i and ∑ _{i}w_{i} = c, then the vector v will also satisfy the conditions v_{i} ≥ 0 and ∑ _{i}v_{i} = c.
Proof: Let the entries of A^{k} be all positive for some k. Now suppose that a_{ij} ≥ 0 for all i, j. For m ≥ k, writing A^{m} = A^{k}A^{m−k},

a_{ij}^{m} = ∑ _{l}a_{il}^{k}a_{lj}^{m−k} > 0

because each a_{il}^{k} > 0 and, A^{m−k} being a Markov matrix by Lemma 10.1.2, ∑ _{l}a_{lj}^{m−k} = 1, so at least one a_{lj}^{m−k} is positive. Thus, from Lemma 10.1.4, A^{m} has an eigenvalue equal to 1 for all m sufficiently large, and all the other eigenvalues have absolute value strictly less than 1. The same must be true of A. If v ≠ 0 and Av = λv, then

A^{m}v = λ^{m}v

so λ^{m} is an eigenvalue of A^{m} for every m. If λ^{m} = 1 for all large m, then λ = λ^{m+1}∕λ^{m} = 1; otherwise |λ^{m}| < 1 for some large m, which forces |λ| < 1.
By Theorem 10.1.3, lim_{n→∞}A^{n}w exists. The rest follows as in Theorem 10.1.6. ■
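The corollary covers matrices with some zero entries, as long as some power is strictly positive. A sketch with a hypothetical regular Markov matrix (its square is strictly positive) whose steady state with component sum 1 is (1/3, 2/3):

```python
def matvec(A, v):
    """Apply a square matrix (list of rows) to a vector."""
    n = len(A)
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

A = [[0.0, 0.5],
     [1.0, 0.5]]           # one zero entry, but A^2 is strictly positive
w = [0.3, 0.7]             # nonnegative, components sum to c = 1
for _ in range(200):       # iterate w -> Aw
    w = matvec(A, w)
print([round(x, 6) for x in w])   # -> [0.333333, 0.666667]
```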