
where the columns of $M$ are as follows from left to right,
\[
\left(
\begin{array}{c}
\sin 4 \\
\frac{1}{2}\sin 4-\frac{1}{2}\sin 2 \\
0 \\
-\frac{1}{2}\sin 4+\frac{1}{2}\sin 2
\end{array}
\right),\quad
\left(
\begin{array}{c}
\sin 4-\sin 2-\cos 2 \\
\frac{1}{2}\sin 4+\frac{3}{2}\sin 2-2\cos 2 \\
-\cos 2 \\
-\frac{1}{2}\sin 4-\frac{1}{2}\sin 2+3\cos 2
\end{array}
\right),
\]
\[
\left(
\begin{array}{c}
-\cos 2 \\
\sin 2 \\
\sin 2-\cos 2 \\
\cos 2-\sin 2
\end{array}
\right),\quad
\left(
\begin{array}{c}
\sin 4-\sin 2-\cos 2 \\
\frac{1}{2}\sin 4+\frac{1}{2}\sin 2-2\cos 2 \\
-\cos 2 \\
-\frac{1}{2}\sin 4+\frac{1}{2}\sin 2+3\cos 2
\end{array}
\right).
\]

Perhaps this isn’t the first thing you would think of. Of course the ability to get this nice closed form description of $\sin(A)$ was dependent on being able to find the Jordan form along with a similarity transformation which will yield the Jordan form.
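To make the role of the similarity transformation concrete, here is a minimal numerical sketch restricted to the diagonalizable case, where the Jordan form is simply a diagonal matrix; the matrix used is hypothetical and is not the one from the example above, and for genuine Jordan blocks $f(J)$ would also involve derivatives of $f$ along the superdiagonals.

import math
import numpy as np

# Hypothetical diagonalizable matrix, used only to illustrate
# sin(A) = S sin(J) S^{-1} when J is diagonal.
A = np.array([[0.0, 2.0],
              [2.0, 0.0]])

lam, S = np.linalg.eig(A)                      # A = S diag(lam) S^{-1}
sin_A = S @ np.diag(np.sin(lam)) @ np.linalg.inv(S)

# Sanity check against a truncated power series for sin.
series = sum((-1) ** k * np.linalg.matrix_power(A, 2 * k + 1)
             / math.factorial(2 * k + 1) for k in range(20))
print(np.allclose(sin_A, series))              # True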

The following corollary is known as the spectral mapping theorem.

Corollary C.0.3 Let $A$ be an $n\times n$ matrix and let $\rho(A) < R$ where for $|\lambda| < R$,
\[
f(\lambda) = \sum_{n=0}^{\infty} a_n \lambda^n .
\]
Then $f(A)$ is also an $n\times n$ matrix and furthermore, $\sigma(f(A)) = f(\sigma(A))$. Thus the eigenvalues of $f(A)$ are exactly the numbers $f(\lambda)$ where $\lambda$ is an eigenvalue of $A$. Furthermore, the algebraic multiplicity of $f(\lambda)$ coincides with the algebraic multiplicity of $\lambda$.
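As a quick numerical illustration of the corollary, the following sketch checks $\sigma(f(A)) = f(\sigma(A))$ for a polynomial $f$ (a power series with only finitely many nonzero terms) and a hypothetical $3\times 3$ matrix chosen purely for this purpose.

import numpy as np

# Hypothetical matrix, upper triangular so its eigenvalues 2, 2, -1 are visible.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [0.0, 0.0, -1.0]])

def f_scalar(lam):
    # f(lambda) = 1 + 3 lambda + lambda^2
    return 1.0 + 3.0 * lam + lam ** 2

f_A = np.eye(3) + 3.0 * A + A @ A                    # f applied to the matrix A

sigma_f_A = np.sort(np.linalg.eigvals(f_A))          # sigma(f(A))
f_sigma_A = np.sort(f_scalar(np.linalg.eigvals(A)))  # f(sigma(A))

print(sigma_f_A)   # [-1. 11. 11.]
print(f_sigma_A)   # agrees, algebraic multiplicities included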

All of these things can be generalized to linear transformations defined on infinite dimensional spaces and when this is done the main tool is the Dunford integral along with the methods of complex analysis. It is good to see it done for finite dimensional situations first because it gives an idea of what is possible. Actually, some of the most interesting functions in applications do not come in the above form as a power series expanded about 0. One example of this situation has already been encountered in the proof of the right polar decomposition with the square root of an Hermitian transformation which had all nonnegative eigenvalues. Another example is that of taking the positive part of an Hermitian matrix. This is important in some physical models where something may depend on the positive part of the strain which is a symmetric real matrix. Obviously there is no way to consider this as a power series expanded about 0 because the function $f(r) = r^{+}$ is not even differentiable at 0. Therefore, a totally different approach must be considered. First the notion of a positive part is defined.

Definition C.0.4 Let $A$ be an Hermitian matrix. Thus it suffices to consider $A$ as an element of $\mathcal{L}(\mathbb{F}^n,\mathbb{F}^n)$ according to the usual notion of matrix multiplication. Then there exists an orthonormal basis of eigenvectors, $\{u_1, \cdots, u_n\}$ such that
\[
A = \sum_{j=1}^{n} \lambda_j\, u_j \otimes u_j ,
\]
for $\lambda_j$ the eigenvalues of $A$, all real. Define

\[
A^{+} \equiv \sum_{j=1}^{n} \lambda_j^{+}\, u_j \otimes u_j
\]
where $\lambda^{+} \equiv \frac{|\lambda|+\lambda}{2}$.
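A minimal computational sketch of this definition, assuming a hypothetical real symmetric matrix purely for illustration: the eigendecomposition returned by numpy.linalg.eigh supplies the orthonormal basis, and $\lambda^{+}$ is applied entrywise to the eigenvalues.

import numpy as np

def positive_part(A):
    # A^+ = sum_j lambda_j^+ u_j u_j^*, for A Hermitian.
    lam, U = np.linalg.eigh(A)            # real eigenvalues, orthonormal eigenvectors
    lam_plus = (np.abs(lam) + lam) / 2    # lambda^+ = (|lambda| + lambda)/2
    return U @ np.diag(lam_plus) @ U.conj().T

# Hypothetical symmetric "strain"-like matrix, for illustration only.
A = np.array([[1.0, 2.0],
              [2.0, -2.0]])
print(positive_part(A))                   # only the nonnegative spectral part survives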