2016 CHAPTER 61. PROBABILITY IN INFINITE DIMENSIONS
are independent if
$$E\left(\left(h_i \circ X'\right)\left(g_j \circ Y'\right)\right) = 0$$
for all $i, j$. This is what I will show next.
$$E\left(\left(h_i \circ X'\right)\left(g_j \circ Y'\right)\right) = \frac{1}{4}E\left(\left(h_i(X)+h_i(Y)\right)\left(g_j(X)-g_j(Y)\right)\right)$$
$$= \frac{1}{4}E\left(h_i(X)g_j(X)\right) - \frac{1}{4}E\left(h_i(X)g_j(Y)\right) + \frac{1}{4}E\left(h_i(Y)g_j(X)\right) - \frac{1}{4}E\left(h_i(Y)g_j(Y)\right) \tag{61.7.25}$$
Now from the above observation after the definition of Gaussian measure, $h_i(X)g_j(X)$ and $h_i(Y)g_j(Y)$ are both in $L^1$ because each term in each product is normally distributed. Therefore, by Lemma 59.15.2,
$$E\left(h_i(X)g_j(X)\right) = \int_\Omega h_i(X)g_j(X)\,dP = \int_E h_i(y)g_j(y)\,d\mu = \int_\Omega h_i(Y)g_j(Y)\,dP = E\left(h_i(Y)g_j(Y)\right)$$
and so 61.7.25 reduces to
$$\frac{1}{4}\left(E\left(h_i(Y)g_j(X) - h_i(X)g_j(Y)\right)\right) = 0$$
because $h_i(X)$ and $g_j(Y)$ are independent due to the assumption that $X$ and $Y$ are independent. Thus
$$E\left(h_i(X)g_j(Y)\right) = E\left(h_i(X)\right)E\left(g_j(Y)\right) = 0$$
due to the assumption that $\mu$ is symmetric, which implies the mean of these random variables equals 0. The other term works out similarly. This has proved the independence of the random variables $X'$ and $Y'$.
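The vanishing of the cross terms above can be checked numerically. This is a minimal sketch, not part of the text: it assumes the rotation $X' = (X+Y)/\sqrt{2}$, $Y' = (X-Y)/\sqrt{2}$ (the precise normalization is fixed earlier in the chapter and does not affect whether the cross moments vanish), takes $E = \mathbb{R}^2$ with an arbitrary illustrative covariance, and uses the coordinate functionals in place of the $h_i$, $g_j$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X, Y: independent samples from the same symmetric (mean-zero) Gaussian
# law on R^2; this covariance matrix is an arbitrary illustrative choice.
cov = np.array([[2.0, 0.5],
                [0.5, 1.0]])
L = np.linalg.cholesky(cov)
X = rng.standard_normal((n, 2)) @ L.T
Y = rng.standard_normal((n, 2)) @ L.T

# The rotation; the 1/sqrt(2) normalization is an assumption here (it is
# set earlier in the chapter, and any constant scales out of the check).
Xp = (X + Y) / np.sqrt(2)
Yp = (X - Y) / np.sqrt(2)

# With h_i, g_j taken as coordinate functionals, every empirical
# cross moment E(h_i(X') g_j(Y')) should be near zero.
cross = Xp.T @ Yp / n
assert np.abs(cross).max() < 0.03

# X' also has (empirically) the same covariance as X, consistent with
# X' having the same Gaussian law, which is shown next in the text.
assert np.allclose(Xp.T @ Xp / n, cov, atol=0.05)
```

The second assertion anticipates the equal-laws claim proved below: for mean-zero Gaussians, matching covariances determine the law.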
Next consider the claim that they have the same law and that it equals $\mu$. To do this, I will use Theorem 59.12.9 on Page 1897. Thus I need to show
$$E\left(e^{ih(X')}\right) = E\left(e^{ih(Y')}\right) = E\left(e^{ih(X)}\right) \tag{61.7.26}$$
for all $h \in E'$. Pick such an $h$. Then $h \circ X$ is normally distributed and has mean 0. Therefore, for some $\sigma$,
$$E\left(e^{ith \circ X}\right) = e^{-\frac{1}{2}t^2\sigma^2}.$$