24.6 Square Roots
In this section, H will be a Hilbert space, real or complex, and T will denote an operator
which satisfies the following definition. A useful theorem about the existence of square
roots of certain operators is presented. This proof is very elementary; I found it in the literature.
Definition 24.6.1 Let T ∈ ℒ(H,H) satisfy T = T∗ (Hermitian) and for all x ∈ H,

(Tx, x) ≥ 0.  (24.30)

Such an operator is referred to as positive and self adjoint. It is probably better to refer to
such an operator as “nonnegative” since the possibility that Tx = 0 for some x ≠ 0 is not being
excluded. Instead of “self adjoint” you can also use the term, Hermitian. To save on
notation, write T ≥ 0 to mean T is positive, satisfying 24.30.
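The distinction behind the word “nonnegative” is visible already in finite dimensions. The following Python sketch (an illustration only; the matrix T is chosen for the example) exhibits a symmetric T on R² with (Tx, x) ≥ 0 for every x even though Tx = 0 for a nonzero x:

```python
# Illustration (not from the text): a "nonnegative" operator on H = R^2.
# T is symmetric and (Tx, x) >= 0 for every x, yet Tx = 0 for the nonzero
# vector x = (0, 1), so T is positive in the sense of 24.30 but not invertible.
import random

T = [[1.0, 0.0], [0.0, 0.0]]

def apply(T, x):
    """Compute Tx for a 2x2 matrix T and a length-2 vector x."""
    return [T[0][0]*x[0] + T[0][1]*x[1], T[1][0]*x[0] + T[1][1]*x[1]]

def quad_form(T, x):
    """The quadratic form (Tx, x)."""
    Tx = apply(T, x)
    return Tx[0]*x[0] + Tx[1]*x[1]

samples = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(1000)]
min_form = min(quad_form(T, x) for x in samples)
kernel_image = apply(T, [0.0, 1.0])    # T applied to the nonzero vector (0, 1)

print(min_form >= 0)                   # (Tx, x) >= 0 on all samples
print(kernel_image == [0.0, 0.0])      # Tx = 0 although x != 0
```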
With the above definition, here is a fundamental result about positive self adjoint operators.
Proposition 24.6.2 Let S,T be positive and self adjoint such that ST = TS. Then
ST is also positive and self adjoint.
Proof: It is obvious that ST is self adjoint. The only problem is to show that ST is
positive. To show this, first suppose S ≤ I. The idea is to write

S = ∑ₖ₌₀ⁿ Sk² + Sn+1

where S0 = S and the operators Sk are self adjoint. This is a useful idea because it is then
obvious that the sum is positive. If we want such a representation as above, then it follows
that S0 ≡ S and

Sn+1 ≡ Sn − Sn².
Thus it is obvious that the Sk are all self adjoint. Also, the following claim holds.
Claim: I ≥ Sn ≥ 0.
Proof of the claim: This is true if n = 0. Assume true for n. Then from the definition,

Sn+1 = Sn − Sn² = Sn²(I − Sn) + Sn(I − Sn)²

and it is obvious from the definition that the sum of positive operators is positive. Therefore,
it suffices to show the two terms in the above are both positive. It is clear from the definition
that each Sn is Hermitian (self adjoint) because they are just polynomials in S. Also each
must commute with T for the same reason. Therefore, for any x ∈ H,

(Sn²(I − Sn)x, x) = ((I − Sn)Snx, Snx) ≥ 0,  (Sn(I − Sn)²x, x) = (Sn(I − Sn)x, (I − Sn)x) ≥ 0,

so Sn+1 ≥ 0. Also, I − Sn+1 = (I − Sn) + Sn² ≥ 0 and so Sn+1 ≤ I.
This proves the claim.
Now each Sk commutes with T because this is true of S0 and succeeding Sk are
polynomials in terms of S0. Therefore,

(STx, x) = ∑ₖ₌₀ⁿ (Sk²Tx, x) + (Sn+1Tx, x) = ∑ₖ₌₀ⁿ (TSkx, Skx) + (Sn+1Tx, x).  (24.31)

From the claim,

∑ₖ₌₀ⁿ ‖Skx‖² = ∑ₖ₌₀ⁿ (Sk²x, x) = ((S − Sn+1)x, x) ≤ (Sx, x)

and so limn→∞ Snx = 0. Hence from 24.31, since (Sn+1Tx, x) = (Tx, Sn+1x) → 0,

(STx, x) = limn→∞ ∑ₖ₌₀ⁿ (TSkx, Skx) = ∑ₖ₌₀∞ (TSkx, Skx) ≥ 0.
All this was based on the assumption that S ≤ I. The next task is to remove this
assumption. Let ST = TS where T and S are positive self adjoint operators, and suppose
S ≠ 0 since otherwise there is nothing to show. Then consider S∕‖S‖.
This is still a positive self adjoint operator and it commutes with T
just like S. Moreover, S∕‖S‖ ≤ I because by the Cauchy Schwarz inequality,

(Sx, x) ≤ ‖Sx‖‖x‖ ≤ ‖S‖‖x‖².

Therefore, from the first part,

(STx, x) = ‖S‖((S∕‖S‖)Tx, x) ≥ 0. ■
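The proposition can be checked concretely in finite dimensions. In the following Python sketch (an illustration only; the matrices S and T are chosen for the example), S and T are symmetric positive definite and commute, and the product ST is again a positive operator:

```python
# Illustration (not from the text): Proposition 24.6.2 for 2x2 matrices.
# S and T below are symmetric positive definite and commute (each is a
# polynomial in E = [[0,1],[1,0]]), so ST is symmetric with (STx, x) >= 0.
import random

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

S = [[2.0, 1.0], [1.0, 2.0]]   # eigenvalues 1 and 3
T = [[3.0, 1.0], [1.0, 3.0]]   # eigenvalues 2 and 4

ST = mat_mul(S, T)
TS = mat_mul(T, S)
commutes = ST == TS            # ST = TS holds for these matrices

def quad_form(A, x):
    """The quadratic form (Ax, x)."""
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    return Ax[0]*x[0] + Ax[1]*x[1]

min_form = min(quad_form(ST, [random.uniform(-1, 1), random.uniform(-1, 1)])
               for _ in range(1000))

print(commutes)        # S and T commute
print(min_form >= 0)   # (STx, x) >= 0 on random samples
```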
The proposition is like the familiar statement about real numbers which says that
when you multiply two nonnegative real numbers the result is a nonnegative real
number. The next lemma is a generalization of the familiar fact that if you have an
increasing sequence of real numbers which is bounded above, then the sequence converges.
Lemma 24.6.3 Let {Tn} be a sequence of self adjoint operators on a Hilbert space, H,
and let Tn ≤ Tn+1 for all n. Also suppose there exists K, a self adjoint operator, such that for
all n, Tn ≤ K. Suppose also that each operator commutes with all the others and that K
commutes with all the Tn. Then there exists a self adjoint continuous operator, T, such that
for all x ∈ H,

limn→∞ Tnx = Tx,

T ≤ K, and T commutes with all the Tn and with K.
Proof: Consider K − Tn ≡ Sn. Then the Sn are decreasing, that is, for each x ∈ H,

(Snx, x)

is a decreasing sequence and from the hypotheses, Sn ≥ 0 so the above sequence is
bounded below by 0. Therefore, limn→∞ (Snx, x) exists. For n > m,

Sm − Sn = Tn − Tm ≥ 0

and Sm − Sn commutes with ‖Sm − Sn‖I − (Sm − Sn), which is also positive by the Cauchy
Schwarz inequality. By Proposition 24.6.2,

(Sm − Sn)(‖Sm − Sn‖I − (Sm − Sn)) ≥ 0,

that is, (Sm − Sn)² ≤ ‖Sm − Sn‖(Sm − Sn). Therefore, since Sn is self adjoint,

‖Smx − Snx‖² = ((Sm − Sn)²x, x) ≤ ‖Sm − Sn‖((Sm − Sn)x, x) ≤ 2‖S0‖((Smx, x) − (Snx, x)).

The last step follows from an application of the Cauchy Schwarz inequality along with the
fact Sm − Sn ≥ 0; since 0 ≤ Sn ≤ S0 and the norm of a positive self adjoint operator equals
the supremum of (Snx, x) over ‖x‖ = 1, it follows that ‖Sm − Sn‖ ≤ ‖Sm‖ + ‖Sn‖ ≤ 2‖S0‖.
The last expression converges to 0 because limn→∞ (Snx, x) exists for each
x ∈ H. Therefore, {Tnx} is a Cauchy sequence. Let Tx
be the thing to which it converges. T is
obviously linear and self adjoint because

(Tx, y) = limn→∞ (Tnx, y) = limn→∞ (x, Tny) = (x, Ty).

Also, since TnK = KTn,

TKx = limn→∞ TnKx = limn→∞ KTnx = KTx

and so TK = KT. Similarly, T commutes with all Tn.
In order to show T is continuous, apply the uniform boundedness principle, Theorem
23.1.8. The convergence of {Tnx} for each x
implies there exists a uniform bound on the norms, ‖Tn‖ ≤ C for all n. Therefore,

‖Tx‖ = limn→∞ ‖Tnx‖ ≤ C‖x‖.

Finally, for each n, (Tnx, x) ≤ (Kx, x). Now take the limit as n → ∞ to conclude
(Tx, x) ≤ (Kx, x), that is, T ≤ K. ■
With this preparation, here is the theorem about square roots.
Theorem 24.6.4 Let T ∈ ℒ(H,H) be a positive self adjoint linear operator.
Then there exists a unique square root, A, with the following properties: A² = T, A is
positive and self adjoint, and A commutes with every operator which commutes with T.
Proof: First suppose T ≤ I. Then define

A0 ≡ 0,  An+1 ≡ An + (T − An²)∕2.
From this it follows that every An is a polynomial in T. Therefore, An commutes with T and
with every operator which commutes with T.
Claim 1: An ≤ I.
Proof of Claim 1: This is true if n = 0. Suppose it is true for n. Then by the assumption
that T ≤ I,

I − An+1 = I − An − (T − An²)∕2 = (1∕2)((I − An)² + (I − T)) ≥ 0

and so An+1 ≤ I.
Claim 2: An ≤ An+1.
Proof of Claim 2: From the definition of An, this is true if n = 0 because

A1 − A0 = T∕2 − 0 ≥ 0.

Suppose true for n. Then from Claim 1,

An+2 − An+1 = An+1 − An + (An² − An+1²)∕2 = (An+1 − An)(I − (An+1 + An)∕2) ≥ 0.

Here An+1 − An ≥ 0 by induction, I − (An+1 + An)∕2 ≥ 0 by Claim 1, these operators
commute because both are polynomials in T, and so the product is positive by
Proposition 24.6.2.
Claim 3: An ≥ 0
Proof of Claim 3: This is true if n = 0. Suppose it is true for n. Then

An+1 = An + (T − An²)∕2 = (An − An²) + (An² + T)∕2 ≥ 0

because An − An² = An(I − An) ≥ 0 by Proposition 24.6.2 and the remaining terms are
positive as well.
Thus {An} is a sequence of positive self adjoint operators which are bounded above by
I such that each of these operators commutes with every operator which commutes with
T. By Lemma 24.6.3, there exists a bounded, self adjoint linear operator, A,
such that for all x ∈ H,

limn→∞ Anx = Ax.

Then A commutes with every operator which commutes with T because each An has this
property. Also A is a positive operator because each An is. From passing to the limit in the
definition of An,

Ax = Ax + (Tx − A²x)∕2

and so Tx = A²x. This proves the theorem in the case that T ≤ I.
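The iteration used in this part of the proof can be run numerically. The following Python sketch (an illustration only; the matrix T is chosen for the example, with eigenvalues 0.5 and 0.9 so that 0 ≤ T ≤ I) carries out A0 = 0, An+1 = An + (T − An²)∕2 and checks that the limit squares back to T:

```python
# Numerical sketch (not from the text): the square root iteration from the
# proof, A0 = 0, A_{n+1} = A_n + (T - A_n^2)/2, applied to a symmetric 2x2
# matrix T with 0 <= T <= I. Plain nested lists keep it dependency-free.

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

T = [[0.7, 0.2], [0.2, 0.7]]               # eigenvalues 0.5 and 0.9

A = [[0.0, 0.0], [0.0, 0.0]]               # A0 = 0
for _ in range(80):
    A2 = mat_mul(A, A)
    # A_{n+1} = A_n + (T - A_n^2)/2
    A = [[A[i][j] + (T[i][j] - A2[i][j]) / 2.0
          for j in range(2)] for i in range(2)]

err = max(abs(mat_mul(A, A)[i][j] - T[i][j]) for i in range(2) for j in range(2))
symmetric = abs(A[0][1] - A[1][0]) < 1e-12

print(err < 1e-9)     # A^2 reproduces T
print(symmetric)      # each iterate, hence A, is symmetric
```

The iterates are polynomials in T, so they are automatically symmetric and commute with T, mirroring the argument above.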
In the general case, if T = 0, take A = 0. If T ≠ 0, consider T∕‖T‖. By the Cauchy
Schwarz inequality,

(Tx, x) ≤ ‖Tx‖‖x‖ ≤ ‖T‖‖x‖²

and so T∕‖T‖ ≤ I. Therefore, it has a square root, B. Then A ≡ ‖T‖^(1∕2)B has all the
right properties and A² = ‖T‖B² = T. This proves the existence part of the theorem.
Next suppose both A and B are square roots of T having all the properties stated in the
theorem. Then AB = BA because both A and B commute with every operator which
commutes with T. For any x ∈ H, the two expressions

((A − B)A(A − B)x, x),  ((A − B)B(A − B)x, x)  (24.32)

are both nonnegative because A, B are positive and A − B is self adjoint.
Therefore, on adding these,

((A − B)(A + B)(A − B)x, x) = ((A² − B²)(A − B)x, x) = 0

since A² = B² = T. It follows both expressions in 24.32
equal 0 since both are nonnegative and when they are
added the result is 0. Now applying the existence part of the theorem to A,
there exists a
positive square root of A, √A,
which is self adjoint. Thus

‖√A(A − B)x‖² = (A(A − B)x, (A − B)x) = ((A − B)A(A − B)x, x) = 0

which implies A(A − B)x = √A(√A(A − B)x) = 0. Similarly, B(A − B)x = 0. Subtracting
these and taking the inner product with x,

0 = ((A(A − B) − B(A − B))x, x) = ((A − B)²x, x) = ‖(A − B)x‖²

and so Ax = Bx which shows A = B since x was arbitrary. ■
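The rescaling step of the existence proof can also be checked numerically. In the following Python sketch (an illustration only; the matrix T, with eigenvalues 3 and 7, is chosen for the example), T is not below I, so the iteration is run on T∕‖T‖ and the result is multiplied by ‖T‖^(1∕2):

```python
# Numerical sketch (not from the text): existence in Theorem 24.6.4 for a T
# that is NOT below I. Following the proof, rescale to T/||T||, take its
# square root B by the iteration, and set A = ||T||^(1/2) B. For a symmetric
# 2x2 matrix the operator norm is the largest |eigenvalue|, computed here
# in closed form.
import math

def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def sym_norm(M):
    """Operator norm of a symmetric 2x2 matrix: largest |eigenvalue|."""
    mean = (M[0][0] + M[1][1]) / 2.0
    rad = math.hypot((M[0][0] - M[1][1]) / 2.0, M[0][1])
    return max(abs(mean + rad), abs(mean - rad))

T = [[5.0, 2.0], [2.0, 5.0]]           # eigenvalues 3 and 7, so T <= I fails
nrm = sym_norm(T)                      # ||T|| = 7
T1 = [[T[i][j] / nrm for j in range(2)] for i in range(2)]   # T/||T|| <= I

B = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(200):                   # square root iteration on T/||T||
    B2 = mat_mul(B, B)
    B = [[B[i][j] + (T1[i][j] - B2[i][j]) / 2.0
          for j in range(2)] for i in range(2)]

s = math.sqrt(nrm)                     # A = ||T||^(1/2) B
A = [[s * B[i][j] for j in range(2)] for i in range(2)]

err = max(abs(mat_mul(A, A)[i][j] - T[i][j]) for i in range(2) for j in range(2))
print(err < 1e-6)                      # A^2 reproduces T
```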