Then this is a random variable whose covariance matrix is just $\Sigma_{ij} = (f_i, f_j)_H$ and whose characteristic function is $e^{-\frac{1}{2} t^* \Sigma t}$, so this verifies that
$$\left( W(f_1), W(f_2), \cdots, W(f_n) \right)$$
is normally distributed with covariance $\Sigma$. If you have two of them, $W(g), W(h)$, then $E(W(h)W(g)) = (h,g)_H$. This follows from what was just shown, that $(W(f), W(g))$ is normally distributed, and so the covariance will be
$$\begin{pmatrix} |f|^2 & (f,g) \\ (f,g) & |g|^2 \end{pmatrix} = \begin{pmatrix} E\left( W(f)^2 \right) & E\left( W(f)W(g) \right) \\ E\left( W(f)W(g) \right) & E\left( W(g)^2 \right) \end{pmatrix}$$
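As a quick consistency check, the off-diagonal entry can also be recovered directly from the characteristic function above by differentiating twice at $t = 0$:
$$E\left( W(f)W(g) \right) = -\left. \frac{\partial^2}{\partial t_1 \partial t_2} e^{-\frac{1}{2} t^* \Sigma t} \right|_{t=0} = \Sigma_{12} = (f,g)_H ,$$
in agreement with the matrix identity just written.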
Finally consider the claim about independence. Any finite subset of $\{W(e_i)\}$ is generalized normal with the covariance matrix being diagonal. Therefore, writing in terms of the distribution measures, this diagonal matrix allows the iterated integrals to be split apart and it follows that
$$E\left( \exp\left( i \sum_{k=1}^{m} t_k W(e_k) \right) \right) = \prod_{k=1}^{m} E\left( \exp\left( i t_k W(e_k) \right) \right)$$
and so independence follows from Proposition 59.11.1. Note that in this case, the covariance matrix will not have zero determinant.
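To see the splitting concretely, take $m = 2$ and suppose for simplicity that $e_1, e_2$ are orthonormal, so the covariance matrix is the identity; then
$$E\left( \exp\left( i(t_1 W(e_1) + t_2 W(e_2)) \right) \right) = e^{-\frac{1}{2}(t_1^2 + t_2^2)} = e^{-\frac{1}{2} t_1^2} \, e^{-\frac{1}{2} t_2^2} = E\left( e^{i t_1 W(e_1)} \right) E\left( e^{i t_2 W(e_2)} \right),$$
which is exactly the factorization of the characteristic function needed for independence.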
59.21 Positive Definite Functions, Bochner's Theorem

First, here is a nice little lemma about matrices.
Lemma 59.21.1 Suppose $M$ is an $n \times n$ matrix. Suppose also that
$$\alpha^* M \alpha = 0$$
for all $\alpha \in \mathbb{C}^n$. Then $M = 0$.
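The assumption that $\alpha$ ranges over $\mathbb{C}^n$ rather than $\mathbb{R}^n$ matters here: for instance, the real matrix $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$ satisfies $\alpha^T M \alpha = 0$ for every real $\alpha$, yet it is not zero.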
Proof: Suppose λ is an eigenvalue for M and let α be an associated eigenvector.
$$0 = \alpha^* M \alpha = \alpha^* \lambda \alpha = \lambda \alpha^* \alpha = \lambda |\alpha|^2$$
and so all the eigenvalues of $M$ equal zero. By Schur's theorem there is a unitary matrix $U$ such that
$$M = U \begin{pmatrix} 0 & & *_1 \\ & \ddots & \\ 0 & & 0 \end{pmatrix} U^* \qquad (59.21.51)$$
where the matrix in the middle has zeros down the main diagonal and zeros below the main diagonal. Thus
$$M^* = U \begin{pmatrix} 0 & & 0 \\ & \ddots & \\ *_2 & & 0 \end{pmatrix} U^*$$