17.11 Square Roots
In this section, H will be a Hilbert space, real or complex, and T will denote an operator
which satisfies the following definition. A useful theorem about the existence of square
roots of certain operators is presented. This proof is very elementary. I found it in
Definition 17.11.1 Let T ∈ ℒ(H,H) satisfy T = T∗ (Hermitian) and, for all x ∈ H,

    (Tx, x) ≥ 0.    (17.11.70)

Such an operator is referred to as positive and self adjoint. It is probably better to refer to such an operator as “nonnegative” since the possibility that Tx = 0 for some x ≠ 0 is not being excluded. Instead of “self adjoint” you can also use the term Hermitian. To save on notation, write

    T ≥ 0

to mean T is positive, satisfying 17.11.70.
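In finite dimensions, matrices of the form M Mᵀ give ready examples of positive self adjoint operators, since (M Mᵀx, x) = ∥Mᵀx∥². The following sketch (using NumPy; the random matrix M is an arbitrary choice, not from the text) checks the definition numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
T = M @ M.T                      # T = T*, a positive self adjoint matrix

# T is Hermitian (here: symmetric real).
print(np.allclose(T, T.T))       # True

# (Tx, x) = ||M^T x||^2 >= 0 for every x; spot-check random vectors.
for _ in range(100):
    x = rng.standard_normal(4)
    assert x @ T @ x >= 0
```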
With the above definition, here is a fundamental result about positive self adjoint operators.
Proposition 17.11.2 Let S,T be positive and self adjoint such that ST = TS.
Then ST is also positive and self adjoint.
Proof: It is obvious that ST is self adjoint, since (ST)∗ = T∗S∗ = TS = ST. The only problem is to show that ST is positive. To show this, first suppose S ≤ I. The idea is to write

    S = Σ_{k=0}^{n} S_k² + S_{n+1}    (17.11.71)

where S_0 = S and the operators S_k are self adjoint. This is a useful idea because the squares S_k² are obviously positive. If we want such a representation as above, then it follows that S_0 ≡ S and

    S_{n+1} = S_n − S_n².

Thus it is obvious that the S_k are all self adjoint, being polynomials in S. Also, the following claim holds.

Claim: I ≥ S_n ≥ 0.

Proof of the claim: This is true if n = 0. Assume true for n. Then from the definition of S_{n+1},

    S_{n+1} = S_n − S_n² = S_n²(I − S_n) + S_n(I − S_n)²

and it is obvious from the definition that the sum of positive operators is positive. Therefore, it suffices to show the two terms in the above are both positive. This follows from the induction hypothesis because

    (S_n²(I − S_n)x, x) = ((I − S_n)S_n x, S_n x) ≥ 0,    (S_n(I − S_n)²x, x) = (S_n(I − S_n)x, (I − S_n)x) ≥ 0.

Also, I − S_{n+1} = (I − S_n) + S_n² ≥ 0, so S_{n+1} ≤ I. It is clear from the definition that each S_n is Hermitian (self adjoint) because they are just polynomials in S. Also each must commute with T for the same reason. This proves the claim.

Now each S_k commutes with T because this is true of S_0 and succeeding S_k are polynomials in terms of S_0. Therefore,

    (STx, x) = Σ_{k=0}^{n} (TS_k x, S_k x) + (TS_{n+1}x, x).

From the claim,

    Σ_{k=0}^{n} ∥S_k x∥² = Σ_{k=0}^{n} (S_k²x, x) = ((S − S_{n+1})x, x) ≤ (Sx, x)

and so lim_{n→∞} S_n x = 0. Hence from 17.11.71,

    (STx, x) = Σ_{k=0}^{∞} (TS_k x, S_k x) ≥ 0.
All this was based on the assumption that S ≤ I. The next task is to remove this assumption. Let ST = TS where T and S are positive self adjoint operators, and assume S ≠ 0 (if S = 0 there is nothing to prove). By the Cauchy Schwarz inequality, ((S∕∥S∥)x, x) ≤ ∥x∥², so S∕∥S∥ ≤ I. This is still a positive self adjoint operator and it commutes with T because S does. Therefore, from the first part,

    ((S∕∥S∥)Tx, x) ≥ 0

and multiplying by ∥S∥ shows (STx, x) ≥ 0. This proves the proposition.
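The series decomposition used in the proof can be watched numerically in finite dimensions. The sketch below (NumPy; the random matrix and the commuting partner T = S² + 0.5I are arbitrary illustrative choices) iterates S_{k+1} = S_k − S_k², accumulates the squares, and confirms that the remainder S_n dies out and that the product ST is positive:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))
S = M @ M.T
S /= 2 * np.linalg.norm(S, 2)            # 0 <= S <= I (eigenvalues <= 1/2)
T = S @ S + 0.5 * np.eye(4)              # positive, and commutes with S

# S = sum_{k=0}^{n} S_k^2 + S_{n+1}, with S_0 = S and S_{k+1} = S_k - S_k^2.
Sk = S.copy()
total = np.zeros_like(S)
for _ in range(1000):
    total += Sk @ Sk
    Sk = Sk - Sk @ Sk

print(np.linalg.norm(S - (total + Sk)))  # ~0: the identity 17.11.71 holds
print(np.linalg.norm(Sk))                # small: S_n -> 0 (slowly, like 1/n)

# ST is symmetric (since ST = TS) with nonnegative eigenvalues.
print(np.linalg.eigvalsh(S @ T).min() >= -1e-10)   # True
```

Note the slow, roughly 1/n decay of S_n: the decomposition is a proof device, not an efficient algorithm.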
The proposition is like the familiar statement about real numbers which says that
when you multiply two nonnegative real numbers the result is a nonnegative real
number. The next lemma is a generalization of the familiar fact that if you have an
increasing sequence of real numbers which is bounded above, then the sequence converges.
Lemma 17.11.3 Let {T_n} be a sequence of self adjoint operators on a Hilbert space, H, and let T_n ≤ T_{n+1} for all n. Also suppose there exists K, a self adjoint operator, such that for all n, T_n ≤ K. Suppose also that each operator commutes with all the others and that K commutes with all the T_n. Then there exists a self adjoint continuous operator, T, such that for all x ∈ H,

    Tx = lim_{n→∞} T_n x,

T ≤ K, and T commutes with all the T_n and with K.
Proof: Consider K − T_n ≡ S_n. Then the S_n are decreasing, that is, for each x ∈ H,

    {(S_n x, x)}

is a decreasing sequence and, from the hypotheses, S_n ≥ 0 so the above sequence is bounded below by 0. Therefore, lim_{n→∞} (S_n x, x) exists. For n > m,

    S_m − S_n = T_n − T_m ≥ 0.

Therefore, since S_n is self adjoint,

    ∥S_m x − S_n x∥⁴ = ((S_m − S_n)x, (S_m − S_n)x)²
        ≤ ((S_m − S_n)x, x)((S_m − S_n)(S_m − S_n)x, (S_m − S_n)x)
        ≤ ((S_m x, x) − (S_n x, x))∥S_m − S_n∥³∥x∥².

The last step follows from an application of the Cauchy Schwarz inequality along with the fact S_m − S_n ≥ 0. Note also that ∥S_m − S_n∥ ≤ ∥K − T_0∥ since 0 ≤ S_m − S_n ≤ S_0. The last expression converges to 0 because lim_{n→∞}(S_n x, x) exists. Therefore {S_n x}, and hence {T_n x}, is a Cauchy sequence. Let Tx be the thing to which it converges. Then T is obviously linear and

    (TKx, y) = lim_{n→∞}(T_n Kx, y) = lim_{n→∞}(KT_n x, y) = (KTx, y)

and so TK = KT. Similarly, T commutes with all T_n. Passing to the limit in (T_n x, y) = (x, T_n y) and in (T_n x, x) ≤ (Kx, x) shows T is self adjoint and T ≤ K.

In order to show T is continuous, apply the uniform boundedness principle, Theorem 15.1.8. The convergence of {T_n x} for each x implies there exists a uniform bound on the norms, ∥T_n∥ ≤ C. Now take the limit as n → ∞ in ∥T_n x∥ ≤ C∥x∥ to conclude ∥Tx∥ ≤ C∥x∥. This proves the lemma.
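A finite dimensional picture of the lemma (NumPy; the choice T_n = I − Sⁿ⁺¹ for a fixed positive S with ∥S∥ < 1 is an arbitrary illustration) shows an increasing, commuting, bounded sequence of self adjoint matrices converging strongly, here to the identity:

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
S = M @ M.T
S /= 1.1 * np.linalg.norm(S, 2)          # 0 <= S and ||S|| < 1

# T_n = I - S^(n+1): self adjoint, mutually commuting, increasing in n,
# and bounded above by K = I.
I4 = np.eye(4)
prev = I4 - S                            # T_0
for n in range(1, 80):
    cur = I4 - np.linalg.matrix_power(S, n + 1)
    # monotonicity: cur - prev = S^n - S^(n+1) = S^n (I - S) >= 0
    assert np.linalg.eigvalsh(cur - prev).min() >= -1e-10
    prev = cur

# Since ||S|| < 1, S^n -> 0, so the limit operator T is I itself here.
print(np.linalg.norm(prev - I4))         # small
```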
With this preparation, here is the theorem about square roots.
Theorem 17.11.4 Let T ∈ ℒ(H,H) be a positive self adjoint linear operator. Then there exists a unique square root, A, with the following properties: A² = T, A is positive and self adjoint, and A commutes with every operator which commutes with T.
Proof: First suppose T ≤ I. Then define A_0 ≡ 0 and

    A_{n+1} ≡ A_n + (1∕2)(T − A_n²).

From this it follows that every A_n is a polynomial in T. Therefore, A_n commutes with T and with every operator which commutes with T.
Claim 1: A_n ≤ I.
Proof of Claim 1: This is true if n = 0. Suppose it is true for n. Then by the assumption that T ≤ I,

    I − A_{n+1} = I − A_n − (1∕2)(T − A_n²) = (1∕2)(I − A_n)² + (1∕2)(I − T) ≥ 0.
Claim 2: A_n ≤ A_{n+1}.
Proof of Claim 2: From the definition of A_n, this is true if n = 0 because A_1 − A_0 = (1∕2)T ≥ 0. Suppose true for n. Then from Claim 1,

    T − A_{n+1}² = (T − A_n²) − (A_{n+1}² − A_n²) = 2(A_{n+1} − A_n) − (A_{n+1} − A_n)(A_{n+1} + A_n)
        = (A_{n+1} − A_n)(2I − A_{n+1} − A_n) ≥ 0

by Proposition 17.11.2, since both factors are positive, commuting operators (using the induction hypothesis and Claim 1). Hence A_{n+2} − A_{n+1} = (1∕2)(T − A_{n+1}²) ≥ 0.
Claim 3: A_n ≥ 0.
Proof of Claim 3: This is true if n = 0. Suppose it is true for n. Then

    A_{n+1} = A_n + (1∕2)(T − A_n²) = (1∕2)(A_n − A_n²) + (1∕2)A_n + (1∕2)T ≥ 0

because A_n − A_n² = A_n(I − A_n) ≥ 0 by Proposition 17.11.2 (both factors are positive by the induction hypothesis and Claim 1, and they commute), while A_n ≥ 0 and T ≥ 0.
Thus {A_n} is a sequence of positive self adjoint operators which are bounded above by I such that each of these operators commutes with every operator which commutes with T. By Lemma 17.11.3, there exists a bounded linear operator, A, such that for all x ∈ H,

    Ax = lim_{n→∞} A_n x.

Then A commutes with every operator which commutes with T because each A_n has this property. Also A is a positive operator because each A_n is. Since the A_n are uniformly bounded, A_n²x → A²x, so passing to the limit in the definition of A_n,

    Ax = Ax + (1∕2)(Tx − A²x)

and so Tx = A²x. This proves the theorem in the case that T ≤ I.
In the general case, consider T∕∥T∥ (if T = 0, let A = 0). By the Cauchy Schwarz inequality, ((T∕∥T∥)x, x) ≤ ∥x∥² for all x ∈ H, and so T∕∥T∥ ≤ I. Therefore, it has a square root, B. Then A ≡ ∥T∥^(1∕2)B has the right properties and A² = ∥T∥B² = T. This proves the existence part of the theorem.
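The recursion A_{n+1} = A_n + (1∕2)(T − A_n²), combined with the scaling trick from the general case, translates directly into a numerical sketch (NumPy; the random matrix and the iteration count are arbitrary choices, and convergence is only linear):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
T = M @ M.T                              # positive self adjoint; T <= I not assumed

# General case of the proof: scale so that T/||T|| <= I.
c = np.linalg.norm(T, 2)
Ts = T / c

A = np.zeros_like(Ts)                    # A_0 = 0
for _ in range(1000):
    A = A + 0.5 * (Ts - A @ A)           # A_{n+1} = A_n + (1/2)(T - A_n^2)

A = np.sqrt(c) * A                       # undo the scaling: (sqrt(c) B)^2 = c B^2 = T

print(np.linalg.norm(A @ A - T))         # small: A^2 approximates T
print(np.linalg.eigvalsh(A).min() >= -1e-8)   # True: A is positive
```

Since every iterate is a polynomial in T, the computed A automatically commutes with T, mirroring the commutation property in the theorem.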
Next suppose both A and B are square roots of T having all the properties stated in the theorem. Then AB = BA because both A and B commute with every operator which commutes with T. For x ∈ H, let y ≡ (A − B)x. Since A, B ≥ 0,

    (Ay, y) ≥ 0,    (By, y) ≥ 0.    (17.11.72)

Therefore, on adding these,

    (Ay, y) + (By, y) = ((A + B)(A − B)x, y) = ((A² − B²)x, y) = 0

because A² = B² = T and AB = BA. It follows both expressions in 17.11.72 equal 0 since both are nonnegative and when they are added the result is 0. Now applying the existence part of the theorem to A, there exists a positive square root of A, denoted √A, which is self adjoint. Then

    0 = (Ay, y) = (√A y, √A y) = ∥√A y∥²

and so √A y = 0, which implies Ay = √A(√A y) = 0. Similarly By = 0. Subtracting these and taking the inner product with x,

    0 = (Ay − By, x) = ((A − B)²x, x) = ∥(A − B)x∥²

and so Ax = Bx, which shows A = B since x was arbitrary. This proves the theorem.