4.13 The Right Polar Decomposition
The right polar decomposition involves writing a matrix as a product of two other
matrices, one which preserves distances and the other which stretches and distorts. First
here are some lemmas.
Lemma 4.13.1 Let A be a Hermitian matrix such that all its eigenvalues are
nonnegative. Then there exists a Hermitian matrix A^(1/2) such that A^(1/2) has all
nonnegative eigenvalues and

A^(1/2) A^(1/2) = A.
Proof: Since A is Hermitian, there exists a diagonal matrix D having all real
nonnegative entries and a unitary matrix U such that A = U∗DU. Then denote by D^(1/2)
the matrix which is obtained by replacing each diagonal entry of D with its nonnegative
square root. Thus D^(1/2)D^(1/2) = D. Then define

A^(1/2) ≡ U∗D^(1/2)U.

Then

A^(1/2)A^(1/2) = U∗D^(1/2)UU∗D^(1/2)U = U∗D^(1/2)D^(1/2)U = U∗DU = A.

The eigenvalues of A^(1/2) are the diagonal entries of D^(1/2), which are nonnegative.
Since D^(1/2) is real,

(U∗D^(1/2)U)∗ = U∗(D^(1/2))∗(U∗)∗ = U∗D^(1/2)U,

so A^(1/2) is Hermitian. This proves the lemma.
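The construction in the proof translates directly into a short numerical sketch. The following NumPy code (not part of the text, and one of several reasonable implementations) computes the Hermitian square root via the eigendecomposition, exactly as in Lemma 4.13.1:

```python
import numpy as np

def hermitian_sqrt(A):
    """Square root of a Hermitian matrix with nonnegative eigenvalues,
    following Lemma 4.13.1: write A = U*DU and return A^(1/2) = U*D^(1/2)U."""
    # eigh returns A = V diag(d) V*, so V plays the role of U* in the proof.
    d, V = np.linalg.eigh(A)
    # Clip tiny negative eigenvalues produced by floating-point rounding.
    d = np.clip(d, 0.0, None)
    return V @ np.diag(np.sqrt(d)) @ V.conj().T

# Example: a Hermitian matrix with eigenvalues 1 and 3.
B = np.array([[2.0, 1.0j], [-1.0j, 2.0]])
S = hermitian_sqrt(B)

print(np.allclose(S @ S, B))        # S^2 recovers B
print(np.allclose(S, S.conj().T))   # S is Hermitian
```

The eigenvalue clipping is a numerical safeguard, not part of the lemma; exact arithmetic never needs it.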
There is also a useful observation about orthonormal sets of vectors which is stated in
the next lemma.
Lemma 4.13.2 Suppose {w_1, ⋯, w_r} is an orthonormal set of vectors. Then if
c_1, ⋯, c_r are scalars,

|∑_{k=1}^r c_k w_k|^2 = ∑_{k=1}^r |c_k|^2.

Proof: This follows from the definition. From the properties of the dot product and
using the fact that the given set of vectors is orthonormal,

|∑_{k=1}^r c_k w_k|^2 = (∑_{k=1}^r c_k w_k, ∑_{j=1}^r c_j w_j) = ∑_{k,j} c_k c̄_j (w_k, w_j) = ∑_{k=1}^r |c_k|^2.

This proves the lemma.
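This Pythagorean identity is easy to check numerically. A minimal sketch, using the columns of a QR factor as the orthonormal set (an arbitrary choice for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# An orthonormal set: the columns of Q from a QR factorization.
Q, _ = np.linalg.qr(rng.standard_normal((5, 3)))
c = np.array([2.0, -1.0, 0.5])        # scalars c_1, ..., c_r

combo = Q @ c                          # the combination  sum_k c_k w_k
lhs = np.linalg.norm(combo) ** 2       # |sum_k c_k w_k|^2
rhs = np.sum(np.abs(c) ** 2)           # sum_k |c_k|^2

print(np.isclose(lhs, rhs))            # True: both equal 5.25 here
```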
Next it is helpful to recall the Gram Schmidt algorithm and observe a certain
property stated in the next lemma.
Lemma 4.13.3 Suppose {w_1, ⋯, w_r, v_{r+1}, ⋯, v_p} is a linearly independent set
of vectors such that {w_1, ⋯, w_r} is an orthonormal set of vectors. Then when the
Gram Schmidt process is applied to the vectors in the given order, it will not change
any of the w_1, ⋯, w_r.

Proof: Let {u_1, ⋯, u_p} be the orthonormal set delivered by the Gram
Schmidt process. Then u_1 = w_1 because by definition, u_1 ≡ w_1∕|w_1| = w_1. Now
suppose u_j = w_j for all j ≤ k ≤ r. Then if k < r, consider the definition of u_{k+1}:

u_{k+1} ≡ (w_{k+1} − ∑_{j=1}^k (w_{k+1}, u_j) u_j) ∕ |w_{k+1} − ∑_{j=1}^k (w_{k+1}, u_j) u_j|.

By induction, u_j = w_j and so (w_{k+1}, u_j) = (w_{k+1}, w_j) = 0. Thus the above
reduces to w_{k+1}∕|w_{k+1}| = w_{k+1}. This proves the lemma.
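The invariance described in the lemma can be observed directly. Below is a minimal classical Gram Schmidt implementation (a sketch, not the book's code) applied to a list whose first two vectors are already orthonormal:

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram Schmidt applied to the vectors in the given order."""
    basis = []
    for v in vectors:
        # Subtract the projections onto the vectors already produced.
        w = v - sum(np.vdot(u, v) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

rng = np.random.default_rng(1)
w1 = np.array([1.0, 0.0, 0.0, 0.0])    # already orthonormal pair
w2 = np.array([0.0, 1.0, 0.0, 0.0])
v3 = rng.standard_normal(4)            # generic independent vectors
v4 = rng.standard_normal(4)

u = gram_schmidt([w1, w2, v3, v4])
print(np.allclose(u[0], w1), np.allclose(u[1], w2))   # both True
```

As the lemma predicts, the orthonormal prefix passes through the process unchanged, while v3 and v4 are orthogonalized against it.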
This lemma immediately implies the following lemma.
Lemma 4.13.4 Let V be a subspace of dimension p and let {w_1, ⋯, w_r} be an
orthonormal set of vectors in V. Then this orthonormal set of vectors may be extended to
an orthonormal basis for V,

{w_1, ⋯, w_r, y_{r+1}, ⋯, y_p}.

Proof: First extend the given linearly independent set {w_1, ⋯, w_r} to a basis for V,

{w_1, ⋯, w_r, v_{r+1}, ⋯, v_p},

and then apply the Gram Schmidt process to the resulting basis. Since {w_1, ⋯, w_r}
is orthonormal, it follows from Lemma 4.13.3 that the result is of the
desired form, an orthonormal basis extending {w_1, ⋯, w_r}. This proves the lemma.
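The two-step recipe of the proof, extend to a basis, then orthogonalize, can be sketched numerically for real vectors. Here the standard basis vectors serve as the extension candidates (a convenient assumption; any spanning set would do), and candidates that turn out dependent are discarded:

```python
import numpy as np

def extend_to_orthonormal_basis(W, p):
    """Extend the orthonormal rows of W to an orthonormal basis of R^p,
    mirroring Lemma 4.13.4: append candidates and apply Gram Schmidt,
    skipping candidates that are (numerically) dependent."""
    basis = [w for w in W]
    for e in np.eye(p):                    # candidate extensions
        w = e - sum(np.dot(u, e) * u for u in basis)
        if np.linalg.norm(w) > 1e-10:      # keep only independent ones
            basis.append(w / np.linalg.norm(w))
        if len(basis) == p:
            break
    return np.array(basis)

W = np.array([[0.6, 0.8, 0.0]])            # a single unit vector in R^3
B = extend_to_orthonormal_basis(W, 3)
print(np.allclose(B @ B.T, np.eye(3)))     # rows are an orthonormal basis
print(np.allclose(B[0], W[0]))             # the original vector is untouched
```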
Here is another lemma about preserving distance.
Lemma 4.13.5 Suppose R is an m × n matrix with m > n and R preserves
distances. Then R∗R = I.
Proof: Since R preserves distances, |Rx| = |x| for every x. Therefore from the
axioms of the dot product,

|x|^2 + |y|^2 + (x, y) + (y, x) = |x + y|^2 = (R(x + y), R(x + y))
= (Rx, Rx) + (Ry, Ry) + (Rx, Ry) + (Ry, Rx)
= |x|^2 + |y|^2 + (R∗Rx, y) + (y, R∗Rx)

and so for all x, y,

(R∗Rx − x, y) + (y, R∗Rx − x) = 0.

Hence for all x, y,

Re (R∗Rx − x, y) = 0.

Now for x, y given, choose α ∈ ℂ such that

α (R∗Rx − x, y) = |(R∗Rx − x, y)|.

Then

0 = Re (R∗Rx − x, ᾱy) = Re α (R∗Rx − x, y) = |(R∗Rx − x, y)|.

Thus (R∗Rx − x, y) = 0 for all y because the given x, y were arbitrary. Let
y = R∗Rx − x to conclude that for all x,

R∗Rx − x = 0,

which says R∗R = I since x is arbitrary. This proves the lemma.
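Conversely, any matrix with orthonormal columns preserves distances, which gives an easy numerical illustration of the lemma. A sketch (real case, so R∗ is just the transpose):

```python
import numpy as np

rng = np.random.default_rng(2)

# A 5 x 3 matrix with orthonormal columns: such a matrix preserves distances.
R, _ = np.linalg.qr(rng.standard_normal((5, 3)))

x = rng.standard_normal(3)
print(np.isclose(np.linalg.norm(R @ x), np.linalg.norm(x)))  # |Rx| = |x|
print(np.allclose(R.T @ R, np.eye(3)))                       # R*R = I
```

Note that R @ R.T is a 5 × 5 projection, not the identity; only R∗R = I holds when m > n.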
With this preparation, here is the big theorem about the right polar decomposition.
Theorem 4.13.6 Let F be an m × n matrix where m ≥ n. Then there exists a
Hermitian n × n matrix U which has all nonnegative eigenvalues and an m × n matrix
R which preserves distances and satisfies R∗R = I such that

F = RU.
Proof: Consider F∗F. This is a Hermitian matrix because

(F∗F)∗ = F∗(F∗)∗ = F∗F.

Also the eigenvalues of the n × n matrix F∗F are all nonnegative. This is because if x is
an eigenvector corresponding to the eigenvalue λ, then

λ|x|^2 = (λx, x) = (F∗Fx, x) = (Fx, Fx) ≥ 0.

Therefore, by Lemma 4.13.1, there exists an n × n Hermitian matrix U having all
nonnegative eigenvalues such that

U^2 = F∗F.

Consider the subspace U(𝔽^n). Let {Ux_1, ⋯, Ux_r} be an orthonormal basis for
U(𝔽^n). Note that U(𝔽^n) might not be all of 𝔽^n. Using Lemma 4.13.4, extend to an
orthonormal basis for all of 𝔽^n,

{Ux_1, ⋯, Ux_r, y_{r+1}, ⋯, y_n}.

Next observe that {Fx_1, ⋯, Fx_r} is also an orthonormal set of vectors in 𝔽^m. This is
because

(Fx_k, Fx_j) = (F∗Fx_k, x_j) = (U^2 x_k, x_j) = (Ux_k, U∗x_j) = (Ux_k, Ux_j) = δ_{jk}.

Therefore, from Lemma 4.13.4 again, this orthonormal set of vectors can be extended to
an orthonormal basis for 𝔽^m,

{Fx_1, ⋯, Fx_r, z_{r+1}, ⋯, z_m}.

Thus there are at least as many z_k as there are y_j. Now for x ∈ 𝔽^n, since
{Ux_1, ⋯, Ux_r, y_{r+1}, ⋯, y_n} is an orthonormal basis for 𝔽^n, there exist unique
scalars c_1, ⋯, c_r, d_{r+1}, ⋯, d_n such that

x = ∑_{k=1}^r c_k Ux_k + ∑_{j=r+1}^n d_j y_j.

Define

Rx ≡ ∑_{k=1}^r c_k Fx_k + ∑_{j=r+1}^n d_j z_j.                (4.13.42)

Then also there exist scalars b_k such that

Ux = ∑_{k=1}^r b_k Ux_k

and so from 4.13.42, applied to Ux in place of x,

RUx = ∑_{k=1}^r b_k Fx_k.

Is this equal to Fx? Using U^2 = F∗F,

|∑_{k=1}^r b_k Fx_k − Fx|^2 = (F(∑_{k=1}^r b_k x_k − x), F(∑_{k=1}^r b_k x_k − x))
= (F∗F(∑_{k=1}^r b_k x_k − x), ∑_{k=1}^r b_k x_k − x)
= (U(∑_{k=1}^r b_k x_k − x), U(∑_{k=1}^r b_k x_k − x))
= |∑_{k=1}^r b_k Ux_k − Ux|^2 = 0.

Therefore, RUx = ∑_{k=1}^r b_k Fx_k = Fx and this shows RU = F.
From 4.13.42 and Lemma 4.13.2 R preserves distances. Therefore, by Lemma 4.13.5
R∗R = I. This proves the theorem.
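In practice the factors of the right polar decomposition are usually computed from the singular value decomposition rather than from the bases constructed in the proof. The sketch below (an assumption of this note, not the theorem's own construction) uses F = W S V∗, which gives U = V S V∗ and R = W V∗:

```python
import numpy as np

def right_polar(F):
    """Right polar decomposition F = R U, computed via the SVD.
    F = W S V*  gives  U = V S V*  (Hermitian, nonnegative eigenvalues)
    and R = W V*  (distance preserving, R*R = I)."""
    W, s, Vh = np.linalg.svd(F, full_matrices=False)
    U = Vh.conj().T @ np.diag(s) @ Vh
    R = W @ Vh
    return R, U

F = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
R, U = right_polar(F)

print(np.allclose(R @ U, F))                     # F = R U
print(np.allclose(R.T @ R, np.eye(2)))           # R preserves distances
print(np.allclose(U, U.T))                       # U is Hermitian
print(np.all(np.linalg.eigvalsh(U) >= -1e-10))   # nonnegative eigenvalues
```

Note that U = V S V∗ squares to V S^2 V∗ = F∗F, so it agrees with the Hermitian square root used in the proof.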