It seems like most of the attention to the QR algorithm has to do with finding ways to get it to
“converge” faster. Great and marvelous are the clever tricks which have been proposed to do
this, but my intent is to present the basic ideas, not to go into the numerous refinements of
this algorithm. However, there is one thing which should be done. It involves reducing to the
case of an upper Hessenberg matrix, which is one that is zero below the subdiagonal.
The following shows that any square matrix is unitarily similar to such an upper Hessenberg
matrix.
Let A be an invertible n × n matrix. Let Q1′ be an n − 1 × n − 1 unitary matrix which maps the entries
of the first column of A lying below the top entry to a vector whose components past the first are all
zero, and let Q1 be the n × n unitary matrix having a 1 in the upper left corner, Q1′ in the lower right
block, and zeros elsewhere. Then Q1AQ1∗ is zero below the (2, 1) position in its first column; let A1
denote its lower right n − 1 × n − 1 block.
Now let Q2′ be the n − 2 × n − 2 matrix which does to the first column of A1 the same sort of thing
that the n − 1 × n − 1 matrix Q1′ did to the first column of A. Let
$$Q_2 \equiv \begin{pmatrix} I & 0 \\ 0 & Q_2' \end{pmatrix}$$
where I is the 2 × 2 identity. Then applying block multiplication, Q2Q1AQ1∗Q2∗ is zero below the
subdiagonal in its first two columns. Continuing this way for n − 2 steps yields a matrix which equals
zero below the subdiagonal. It is called an upper Hessenberg matrix.
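If you would rather see this reduction carried out numerically than by hand, Matlab's built-in hess command produces such a unitary similarity directly. A minimal check (the matrix A here is just a random example, not one from the text):

A = randn(5);        % any square matrix
[P,H] = hess(A);     % H is upper Hessenberg, P is unitary, and A = P*H*P'
norm(P*H*P' - A)     % essentially zero, confirming the unitary similarity
tril(H,-2)           % the part of H below the subdiagonal: all zeros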
It happens that in the QR algorithm, if $A_k$ is upper Hessenberg, so is $A_{k+1}$. To see this, note
that saying a matrix is upper Hessenberg means that its ij-th entry is zero whenever i − j ≥ 2. In the
algorithm,
$$A_{k+1} = R_k Q_k$$
where $A_k = Q_k R_k$. Therefore, as shown before,
$$A_{k+1} = R_k A_k R_k^{-1}.$$
Let the ij-th entry of $A_k$ be $a_{ij}^{k}$. Since $R_k$ is upper triangular, its entries $r_{ip}$ vanish
unless $p \geq i$, and since $R_k^{-1}$ is also upper triangular, its entries $r_{qj}^{-1}$ vanish unless
$q \leq j$. Therefore, if i − j ≥ 2,
$$a_{ij}^{k+1} = \sum_{p=i}^{n} \sum_{q=1}^{j} r_{ip}\, a_{pq}^{k}\, r_{qj}^{-1}.$$
It is given that $a_{pq}^{k} = 0$ whenever p − q ≥ 2. However, for every term in the above sum,
$$p - q \geq i - j \geq 2,$$
and so the sum equals 0.
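Here is a quick numerical illustration of this fact; the particular matrix is random and chosen only for the check:

H = triu(randn(6),-1);   % a random 6 x 6 upper Hessenberg matrix
[Q,R] = qr(H);           % one QR factorization
H1 = R*Q;                % one step of the QR algorithm
norm(tril(H1,-2))        % zero up to roundoff: H1 is again upper Hessenberg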
Since upper Hessenberg matrices stay that way under the algorithm, and since such a matrix is already closer
to being upper triangular, it is reasonable to expect the QR algorithm to yield good results more quickly for
the upper Hessenberg matrix than for the original matrix. This is especially true if the matrix is large.
The other important thing to observe is that, starting with an upper Hessenberg matrix, the blocks which occur
in the algorithm are restricted to 2 × 2 blocks, which are easy to deal with. These blocks
allow you to identify the eigenvalues, as illustrated below.
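For instance, if a 2 × 2 block, say B, remains on the diagonal of the iterates, its two eigenvalues are the roots of λ² − (trace B)λ + det B = 0, which may form a complex conjugate pair. A minimal Matlab sketch (the particular block B is only an illustration):

B = [2 -3; 1 2];                        % a 2 x 2 block as it might appear on the diagonal
lam = roots([1, -trace(B), det(B)])     % its eigenvalues, here a complex conjugate pair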
Whenever you begin with a symmetric matrix, the upper Hessenberg matrix obtained this way will also be symmetric,
hence tridiagonal. Now use the QR algorithm on such a matrix H. The syntax is as follows in Matlab.
H=[enter H here]
for k=1:100
[Q,R]=qr(H);   % factor the current iterate
H=R*Q;         % form the next iterate, which is similar to the previous one
end
Q              % display the last Q
R              % display the last R
H              % display the iterated matrix, similar to the original
You already have H, and Matlab knows about it, so you don't need to enter H again after the loop; typing H
displays the result. This yields a matrix, similar to the original one, whose diagonal entries (together with
any remaining 2 × 2 diagonal blocks) give the approximate eigenvalues.
You might want to check that the product of these approximate eigenvalues equals the determinant of the
matrix and that their sum equals the trace of the matrix. In fact, this works out very well.
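For example, if before running the loop you save a copy of the matrix you entered, say H0=H; (the name H0 is only for this illustration), then afterwards you can compare as follows; the comparison of products assumes the iterates have become nearly upper triangular.

prod(diag(H))    % product of the approximate eigenvalues read off the diagonal
det(H0)          % determinant of the matrix you started with (preserved by similarity)
sum(diag(H))     % sum of the approximate eigenvalues
trace(H0)        % trace of the matrix you started with (also preserved)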
To find eigenvectors, you could use the shifted inverse power method, sketched below. Note that the
eigenvectors of the Hessenberg matrix will be different from those of the original matrix A (the two are
related by the unitary similarity), so apply the method to the matrix whose eigenvectors you actually want.
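A minimal sketch of the shifted inverse power method in Matlab, where A stands for the matrix whose eigenvectors you want; the shift mu and the iteration count are placeholders you would choose from the approximate eigenvalue just found.

n = size(A,1);
mu = 2.1;                        % hypothetical shift: an approximate eigenvalue from the QR iterates
x = randn(n,1);                  % any starting vector
for k = 1:20
    x = (A - mu*eye(n)) \ x;     % solve the shifted system rather than forming an inverse
    x = x/norm(x);               % renormalize at each step
end
x                                % approximate eigenvector of A
x'*A*x                           % Rayleigh quotient: a refined eigenvalue estimate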