A.1 Use Of Matrix Calculator On Web
There is a really nice service on the web which will do all of these things very easily. It is www.bluebit.gr/matrix-calculator/. To get to it, you can use the address or google "matrix calculator".
When you go to this site, you enter a matrix row by row, placing a space between each
number. When you come to the end of a row, you press enter on the keyboard to start the next
row. After entering the matrix, you select what you want it to do. You will see that it also solves
systems of equations.
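If the website is unavailable, the same computations can be done locally. The following is a minimal sketch using Python with NumPy (my choice of tool, not part of the text): the matrix is entered row by row, just as on the site, and then a system is solved and eigenvalues are found.

```python
import numpy as np

# Enter the matrix row by row, as on the website.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 4.0])

x = np.linalg.solve(A, b)            # solves the system Ax = b
eigenvalues = np.linalg.eigvals(A)   # eigenvalues of A

print(x)   # -> [1. 1.]
print(eigenvalues)
```

For this A and b, the solution is x = (1, 1)ᵀ and the eigenvalues are (5 ± √5)/2.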
Earlier theorems about Markov matrices were presented. These were matrices in which all the
entries were nonnegative and either the columns or the rows added to 1. It turns out that many of
the theorems presented can be generalized to positive matrices. When this is done, the resulting
theory is mainly due to Perron and Frobenius. I will give an introduction to this theory here
following Karlin and Taylor.
Definition B.0.1 For A a matrix or vector, the notation, A >> 0 will mean every entry
of A is positive. By A > 0 is meant that every entry is nonnegative and at least one is
positive. By A ≥ 0 is meant that every entry is nonnegative. Thus the matrix or vector
consisting only of zeros is ≥ 0. An expression like A >> B will mean A − B >> 0 with
similar modifications for > and ≥.
For the sake of this section only, define the following for x = (x1, ⋯, xn)ᵀ, a vector:

|x| ≡ (|x1|, ⋯, |xn|)ᵀ.

Thus |x| is the vector which results by replacing each entry of x with its absolute value. Also define for x ∈ ℂⁿ,

||x||₁ ≡ Σj |xj|.
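Since three different order relations are in play, it may help to see them in code. This is a small illustrative sketch (the function names are mine, not standard):

```python
import numpy as np

def strictly_positive(A):   # A >> 0 : every entry is positive
    return bool(np.all(A > 0))

def positive(A):            # A > 0 : nonnegative with at least one positive entry
    return bool(np.all(A >= 0) and np.any(A > 0))

def nonnegative(A):         # A >= 0 : every entry is nonnegative
    return bool(np.all(A >= 0))

Z = np.zeros((2, 2))                       # Z >= 0 only
B = np.array([[0.0, 1.0], [0.0, 0.0]])     # B > 0 but not B >> 0
C = np.array([[1.0, 2.0], [3.0, 4.0]])     # C >> 0
```

Thus the zero matrix satisfies only the weakest relation, exactly as the definition states.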
Lemma B.0.2 Let A >> 0 and let x > 0. Then Ax >> 0.

Proof: (Ax)i = Σj Aij xj > 0 because all the Aij > 0 and at least one xj > 0. ■
Lemma B.0.3 Let A >> 0. Define

S ≡ {λ : Ax > λx for some x >> 0},

and let

K ≡ {x ≥ 0 : ||x||₁ = 1}.

Now define

S1 ≡ {λ : Ax > λx for some x ∈ K}.

Then sup(S) = sup(S1).
Proof: Let λ ∈ S. Then there exists x >> 0 such that Ax > λx. Consider y ≡ x∕||x||₁. Then ||y||₁ = 1 and Ay > λy. Therefore, λ ∈ S1 and so S ⊆ S1. Now let λ ∈ S1. Then there exists x ≥ 0 such that ||x||₁ = 1, so x > 0, and Ax > λx. Letting y ≡ Ax, it follows from Lemma B.0.2 applied to Ax − λx > 0 that Ay >> λy, and y >> 0 by the same lemma. Thus λ ∈ S and so S1 ⊆ S, which shows that sup(S) = sup(S1). ■
This lemma is significant because the set K ≡ {x ≥ 0 : ||x||₁ = 1} is a compact set in ℝⁿ. Define

λ0 ≡ sup(S) = sup(S1).    (2.1)
The following theorem is due to Perron.
Theorem B.0.4 Let A >> 0 be an n × n matrix and let λ0 be given in 2.1. Then
- λ0 > 0 and there exists x0 >> 0 such that Ax0 = λ0x0, so λ0 is an eigenvalue for A.
- If Ax = μx where x ≠ 0 and μ ≠ λ0, then |μ| < λ0.
- The eigenspace for λ0 has dimension 1.
Proof: To see λ0 > 0, consider the vector e ≡ (1, ⋯, 1)ᵀ. Then

(Ae)i = Σj Aij > 0

and so λ0 is at least as large as the smallest row sum of A, which is positive.

Next, let {λk} be an increasing sequence of numbers from S1 converging to λ0 and let xk ∈ K be the vector which occurs in the definition of S1, so Axk > λkxk. Since K is compact, these vectors are in a compact set. Therefore, there exists a subsequence, still denoted by xk, such that xk → x0 ∈ K and λk → λ0. Then passing to the limit,

Ax0 ≥ λ0x0,  x0 > 0.

If Ax0 > λ0x0, then letting y ≡ Ax0, it follows from Lemma B.0.2 that Ay >> λ0y and y >> 0. But this contradicts the definition of λ0 as the supremum of the elements of S because since Ay >> λ0y, it follows Ay >> (λ0 + ε)y for ε a small positive number, so λ0 + ε ∈ S. Therefore, Ax0 = λ0x0. It remains to verify that x0 >> 0. But this follows immediately from

0 < Σj Aij(x0)j = (Ax0)i = λ0(x0)i.

This proves 1.
Next suppose Ax = μx and x ≠ 0 and μ ≠ λ0. Then |Ax| = |μ||x|. But this implies, by the triangle inequality, A|x| ≥ |Ax| = |μ||x|. (See the above abominable definition of |x|.)

Case 1: |x| ≠ x and |x| ≠ −x.
In this case, A|x| > |Ax| = |μ||x|. Letting y ≡ A|x|, it follows from Lemma B.0.2 that y >> 0 and Ay >> |μ|y, which shows Ay >> (|μ| + ε)y for sufficiently small positive ε. Therefore, |μ| + ε ∈ S and so |μ| < λ0.

Case 2: |x| = x or |x| = −x.
In this case, the entries of x are all real and have the same sign. Therefore, A|x| = |Ax| = |μ||x|. Now let y ≡ |x|∕||x||₁. Then y ∈ K and Ay = |μ|y, so Ay > λy for every λ < |μ|, which shows |μ| ≤ sup(S1) = λ0. But also, the fact that the entries of x all have the same sign shows μ = |μ|. Since μ ≠ λ0, it must be that μ = |μ| < λ0.

This proves 2.
It remains to verify 3. Suppose then that Ay = λ0y and that for all scalars α, αx0 ≠ y. Since A and λ0 are real, taking real and imaginary parts gives

A Re y = λ0 Re y,  A Im y = λ0 Im y.

If Re y = α1x0 and Im y = α2x0 for real numbers αi, then y = (α1 + iα2)x0 and it is assumed this does not happen. Therefore, either

Re y ≠ αx0 for all α ∈ ℝ

or

Im y ≠ αx0 for all α ∈ ℝ.

Assume the first holds. Then varying t ∈ ℝ, there exists a value of t such that x0 + t Re y > 0 but it is not the case that x0 + t Re y >> 0. Then A(x0 + t Re y) >> 0 by Lemma B.0.2. But this implies λ0(x0 + t Re y) >> 0, which contradicts the fact that some entry of x0 + t Re y equals 0. Hence there exist real numbers α1 and α2 such that Re y = α1x0 and Im y = α2x0, showing that y = (α1 + iα2)x0 after all. This proves 3. ■
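Perron's theorem is easy to check numerically for a particular strictly positive matrix. The sketch below (an illustration, not part of the text) verifies that the largest-modulus eigenvalue is positive with a strictly positive eigenvector and strictly dominates the other eigenvalue in modulus:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])            # A >> 0

w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))              # position of the eigenvalue of largest modulus
lam0 = w[k].real
x0 = (V[:, k] / V[:, k].sum()).real   # rescaled so the Perron eigenvector is >> 0

others = np.delete(w, k)              # the remaining eigenvalues
```

Here lam0 = (5 + √33)/2 and the other eigenvalue is strictly smaller in modulus, as parts 1 and 2 of the theorem predict.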
It is possible to obtain a simple corollary to the above theorem.
Corollary B.0.5 If A > 0 and A^m >> 0 for some m ∈ ℕ, then all the conclusions of the above theorem hold.
Proof: By Theorem B.0.4 applied to A^m >> 0, there exists μ0 > 0 such that A^m y0 = μ0y0 for some y0 >> 0. Let λ0 be the positive number such that λ0^m = μ0. Then

(A − λ0I)(A^(m−1) + λ0A^(m−2) + ⋯ + λ0^(m−1)I)y0 = (A^m − λ0^m I)y0 = 0

and so, letting x0 ≡ (A^(m−1) + λ0A^(m−2) + ⋯ + λ0^(m−1)I)y0, it follows x0 >> 0 (the term λ0^(m−1)y0 alone is already >> 0) and Ax0 = λ0x0.

Suppose now that Ax = μx for x ≠ 0 and μ ≠ λ0, and suppose |μ| ≥ λ0. Multiplying both sides by A repeatedly, it follows A^m x = μ^m x and

|μ^m| = |μ|^m ≥ λ0^m = μ0.

Since μ^m is an eigenvalue of A^m, it follows from Theorem B.0.4 that μ^m = μ0. But by Theorem B.0.4 again, this implies x = cy0 for some scalar c and hence Ay0 = μy0. Since y0 >> 0 and A > 0, it follows μ ≥ 0, and so μ^m = μ0 = λ0^m forces μ = λ0, a contradiction. Therefore, |μ| < λ0.

Finally, if Ax = λ0x, then A^m x = λ0^m x and so x = cy0 for some scalar c. Consequently,

(A^(m−1) + λ0A^(m−2) + ⋯ + λ0^(m−1)I)x = cx0,

while the left side also equals mλ0^(m−1)x. Thus x is a multiple of x0, which shows the dimension of the eigenspace for λ0 is one. ■
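A standard example for Corollary B.0.5 (my choice of example) is a matrix with a zero entry whose square is strictly positive:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])    # A > 0 but not A >> 0
A2 = A @ A                    # A^2 >> 0, so the corollary applies with m = 2

w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))
lam0 = w[k].real
x0 = (V[:, k] / V[:, k].sum()).real   # Perron eigenvector, scaled to have positive entries
```

Here lam0 = (1 + √5)/2, the golden ratio, and the other eigenvalue (1 − √5)/2 is strictly smaller in modulus, as the corollary asserts.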
The following corollary is an extremely interesting convergence result involving the powers of A.
Corollary B.0.6 Let A > 0 and A^m >> 0 for some m ∈ ℕ. Then for λ0 given in 2.1, there exists a rank one matrix P such that

limm→∞ ||(A∕λ0)^m − P|| = 0.
Proof: Considering Aᵀ, and the fact that A and Aᵀ have the same eigenvalues, Corollary B.0.5 implies the existence of a vector v >> 0 such that

Aᵀv = λ0v.

Also let x0 denote the vector such that Ax0 = λ0x0 with x0 >> 0. First note that x0ᵀv > 0 because both these vectors have all entries positive. Therefore, v may be scaled such that

vᵀx0 = x0ᵀv = 1.    (2.2)

Define

P ≡ x0vᵀ.

Thanks to 2.2,

(A∕λ0)P = (A∕λ0)x0vᵀ = x0vᵀ = P,  P(A∕λ0) = x0vᵀ(A∕λ0) = x0vᵀ = P,  P² = x0vᵀx0vᵀ = x0vᵀ = P.    (2.3)

Therefore,

((A∕λ0) − P)² = (A∕λ0)² − (A∕λ0)P − P(A∕λ0) + P² = (A∕λ0)² − P.    (2.4)

Continuing this way, using 2.3 repeatedly, it follows

((A∕λ0) − P)^m = (A∕λ0)^m − P.    (2.5)

The eigenvalues of (A∕λ0) − P are of interest because it is powers of this matrix which determine the convergence of (A∕λ0)^m to P. Therefore, let μ be a nonzero eigenvalue of this matrix. Thus

((A∕λ0) − P)x = μx    (2.6)

for x ≠ 0, and μ ≠ 0. Applying P to both sides and using the second formula of 2.3 yields

0 = (P(A∕λ0) − P²)x = P((A∕λ0) − P)x = μPx.

But since Px = 0, it follows from 2.6 that

(A∕λ0)x = μx,

which implies λ0μ is an eigenvalue of A. Therefore, by Corollary B.0.5 it follows that either λ0|μ| < λ0, which implies |μ| < 1, or λ0μ = λ0, in which case μ = 1. But if μ = 1, then x is a multiple of x0, and 2.6 applied to x0 says (A∕λ0)x0 − Px0 = x0, which says x0 − x0vᵀx0 = x0 and so by 2.2, x0 = 0, contrary to the property that x0 >> 0. Therefore, |μ| < 1 and so this has shown that the absolute values of all eigenvalues of (A∕λ0) − P are less than 1. By Gelfand's theorem, Theorem 13.3.3, it follows that for some r < 1,

||((A∕λ0) − P)^m|| < r^m

whenever m is large enough. Now by 2.5 this yields

||(A∕λ0)^m − P|| < r^m

whenever m is large enough. It follows

limm→∞ ||(A∕λ0)^m − P|| = 0. ■
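The convergence of the powers of A∕λ0 to a rank one matrix can be watched directly. The sketch below (illustrative only) builds P = x0vᵀ with vᵀx0 = 1 and checks that (A∕λ0)^m is essentially P for moderate m:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])               # A >> 0

w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))
lam0 = w[k].real
x0 = (V[:, k] / V[:, k].sum()).real      # right Perron eigenvector, A x0 = lam0 x0

wt, Vt = np.linalg.eig(A.T)
kt = np.argmax(np.abs(wt))
v = Vt[:, kt].real
v = v / (v @ x0)                         # scale so that v^T x0 = 1

P = np.outer(x0, v)                      # the rank one matrix P = x0 v^T
M = np.linalg.matrix_power(A / lam0, 50) # (A / lam0)^50
```

The error max|M − P| decays like r^m with r = |μ2|∕λ0 ≈ 0.07 for this A, so m = 50 is far more than enough.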
What about the case when A > 0 but maybe it is not the case that A >> 0? As before, let K ≡ {x ≥ 0 : ||x||₁ = 1} and define

S1 ≡ {λ : Ax ≥ λx for some x ∈ K},  λ0 ≡ sup(S1).    (2.7)
Theorem B.0.7 Let A > 0 and let λ0 be defined in 2.7. Then there exists x0 > 0 such
that Ax0 = λ0x0.
Proof: Let E consist of the matrix which has a one in every entry. Then from Theorem B.0.4 it follows there exists xδ >> 0 with ||xδ||₁ = 1 such that

(A + δE)xδ = λ0δxδ

where

λ0δ ≡ sup{λ : (A + δE)x ≥ λx for some x ∈ K}.

Now if α < δ,

{λ : (A + αE)x ≥ λx for some x ∈ K} ⊆ {λ : (A + δE)x ≥ λx for some x ∈ K}

and so λ0δ ≥ λ0α because λ0δ is the sup of the second set and λ0α is the sup of the first. It follows the limit λ1 ≡ limδ→0+ λ0δ exists. Taking a subsequence and using the compactness of K, there exists a subsequence, still denoted by δ, such that as δ → 0, xδ → x ∈ K. Then passing to the limit in (A + δE)xδ = λ0δxδ,

Ax = λ1x

and so, in particular, Ax ≥ λ1x and so λ1 ≤ λ0. But also, if λ ∈ S1, so that Ax ≥ λx for some x ∈ K, then

λx ≤ Ax < (A + δE)x,

showing that λ0δ ≥ λ for all such λ. But then λ0δ ≥ λ0 also. Hence λ1 ≥ λ0, showing these two numbers are the same. Hence Ax = λ0x, and since ||x||₁ = 1, x > 0. ■
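For Theorem B.0.7, a permutation matrix (my example, not from the text) shows what changes when only A > 0 holds: there is still an eigenvector x0 > 0 for λ0, but now another eigenvalue can have modulus equal to λ0:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])              # A > 0, and no power of A is >> 0

w, V = np.linalg.eig(A)
k = np.argmax(w.real)                   # picks lam0 = 1
lam0 = w[k].real
x0 = (V[:, k] / V[:, k].sum()).real     # the eigenvector (1/2, 1/2), which is > 0

other = np.delete(w, k)[0]              # the eigenvalue -1, with |other| = lam0
```

The eigenvalue −1 lies on the circle |λ| = λ0, so the strict dominance from Theorem B.0.4 fails here, exactly as the discussion below describes.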
If A^m >> 0 for some m and A > 0, it follows that the dimension of the eigenspace for λ0 is one and that the absolute value of every other eigenvalue of A is less than λ0. If it is only assumed that A > 0, not necessarily >> 0, this is no longer true. However, there is still something very interesting which can be said. First, here is an interesting lemma.
Lemma B.0.8 Let M be a matrix of the form

M = [ A  0 ]        M = [ A  B ]
    [ B  C ]   or       [ 0  C ]

where A is an r × r matrix and C is an (n − r) × (n − r) matrix. Then det(M) = det(A)det(C) and σ(M) = σ(A) ∪ σ(C).

Proof: To verify the claim about the determinants, note that in the lower block triangular case,

[ A  0 ]   [ A  0 ] [ I  0 ]
[ B  C ] = [ B  I ] [ 0  C ].

But it is clear from the method of Laplace expansion that the first factor on the right has determinant det(A), and from the multilinear properties of the determinant and row operations that the second factor has determinant det(C). Therefore, det(M) = det(A)det(C). The case where M is upper block triangular is similar.

This immediately implies σ(M) = σ(A) ∪ σ(C), because det(λI − M) = det(λI − A)det(λI − C).
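Lemma B.0.8 can be confirmed numerically on a small block lower triangular matrix; this check (illustrative, using NumPy) compares det(M) with det(A)det(C) and σ(M) with σ(A) ∪ σ(C):

```python
import numpy as np

Ablk = np.array([[1.0, 2.0],
                 [0.0, 3.0]])        # the r x r block A, eigenvalues 1 and 3
Cblk = np.array([[5.0]])             # the (n - r) x (n - r) block C, eigenvalue 5
Bblk = np.array([[7.0, 8.0]])        # the arbitrary lower-left block B

M = np.block([[Ablk, np.zeros((2, 1))],
              [Bblk, Cblk]])

detM = np.linalg.det(M)              # should equal det(Ablk) * det(Cblk) = 15
spectrum = np.linalg.eigvals(M)      # should be {1, 3, 5}
```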
Theorem B.0.9 Let A > 0 and let λ0 be given in 2.7. If λ is an eigenvalue for A such that |λ| = λ0, then λ∕λ0 is a root of unity. Thus (λ∕λ0)^m = 1 for some m ∈ ℕ.
Proof: Applying Theorem B.0.7 to Aᵀ, there exists v > 0 such that Aᵀv = λ0v. In the first part of the argument it is assumed v >> 0. Now suppose Ax = λx, x ≠ 0, and that |λ| = λ0. Then

A|x| ≥ |λ||x| = λ0|x|

and it follows that if A|x| > λ0|x|, then since v >> 0,

λ0(v, |x|) = (Aᵀv, |x|) = (v, A|x|) > λ0(v, |x|),

a contradiction. Therefore,

A|x| = λ0|x|.    (2.8)

It follows that

|Σj Aij xj| = λ0|xi| = Σj Aij|xj|

and so the complex numbers Aij xj, Aik xk must have the same argument for every k, j because equality holds in the triangle inequality. Therefore, there exists a complex number μi with |μi| = 1 such that

Aij xj = μi Aij |xj|    (2.9)

and so, letting r ∈ ℕ,

Aij xj μi^r = μi^(r+1) Aij |xj|.

Summing on j yields

Σj Aij xj μi^r = μi^(r+1) Σj Aij |xj|.    (2.10)

Also, summing 2.9 on j and using that x is an eigenvector for λ, it follows from 2.8 that

λxi = Σj Aij xj = μi Σj Aij |xj| = μi λ0|xi|.    (2.11)
From 2.10 and 2.11,

Σj Aij xj μi^r = μi^r λxi.

Now from 2.10 with r replaced by r − 1 and 2.8,

Σj Aij xj μi^(r−1) = μi^r Σj Aij |xj| = μi^r λ0|xi|.

Continuing this way, these relations show that for each r ∈ ℕ the vector with entries μi^r |xi| satisfies the eigenvalue equation for λ0(λ∕λ0)^r, and eventually, this shows that λ0(λ∕λ0)^r is an eigenvalue of A for each r ∈ ℕ. Since A has only finitely many eigenvalues, (λ∕λ0)^r = (λ∕λ0)^s for some r > s, and so (λ∕λ0)^(r−s) = 1, which shows λ∕λ0 is a root of unity.
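The root-of-unity phenomenon of Theorem B.0.9 can be seen concretely on a cyclic permutation matrix (my example): every eigenvalue has modulus λ0 = 1, and each ratio λ∕λ0 is a cube root of unity:

```python
import numpy as np

# Cyclic 3 x 3 permutation matrix: A > 0 and lam0 = 1.
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

w = np.linalg.eigvals(A)   # the three cube roots of unity
ratios = w / 1.0           # lam / lam0 with lam0 = 1
```

Cubing each ratio returns 1, so (λ∕λ0)^m = 1 with m = 3 for every eigenvalue on the circle |λ| = λ0.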