2016 CHAPTER 61. PROBABILITY IN INFINITE DIMENSIONS

are independent if
\[
E\big((h_i\circ X')(g_j\circ Y')\big)=0
\]
for all $i,j$. This is what I will show next.

\begin{align*}
E\big((h_i\circ X')(g_j\circ Y')\big)
&=\tfrac{1}{4}E\big((h_i(X)+h_i(Y))(g_j(X)-g_j(Y))\big)\\
&=\tfrac{1}{4}E(h_i(X)g_j(X))-\tfrac{1}{4}E(h_i(X)g_j(Y))\\
&\quad+\tfrac{1}{4}E(h_i(Y)g_j(X))-\tfrac{1}{4}E(h_i(Y)g_j(Y))\tag{61.7.25}
\end{align*}

Now from the observation made above, after the definition of Gaussian measure, $h_i(X)g_j(X)$ and $h_i(Y)g_j(Y)$ are both in $L^1$ because each factor in each product is normally distributed. Therefore, by Lemma 59.15.2,

\begin{align*}
E(h_i(X)g_j(X))&=\int h_i(X)g_j(X)\,dP
=\int_E h_i(y)g_j(y)\,d\mu\\
&=\int h_i(Y)g_j(Y)\,dP
=E(h_i(Y)g_j(Y))
\end{align*}
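The chain of equalities above only uses the fact that $X$ and $Y$ share the law $\mu$, so both expectations equal the same integral against $\mu$. As a minimal numerical sanity check (not part of the proof), one can take $\mu$ to be the standard Gaussian measure on $\mathbb{R}$ and two concrete stand-in test functions for $h_i$ and $g_j$; the function choices here are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Sanity check: if X and Y have the same law mu, then
# E(h(X)g(X)) = E(h(Y)g(Y)), since both equal int_E h(y)g(y) d(mu).
# Here mu = N(0,1); h and g are arbitrary illustrative test functions.
rng = np.random.default_rng(2)
n = 1_000_000
X = rng.standard_normal(n)  # X ~ mu
Y = rng.standard_normal(n)  # Y ~ mu, independent of X

h = np.tanh                 # stand-in for h_i
g = lambda t: t ** 2        # stand-in for g_j

lhs = np.mean(h(X) * g(X))  # Monte Carlo estimate of E(h(X)g(X))
rhs = np.mean(h(Y) * g(Y))  # Monte Carlo estimate of E(h(Y)g(Y))
# The two estimates agree up to Monte Carlo error.
```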

and so 61.7.25 reduces to
\[
\tfrac{1}{4}\,E\big(h_i(Y)g_j(X)-h_i(X)g_j(Y)\big)=0
\]

because $h_i(X)$ and $g_j(Y)$ are independent due to the assumption that $X$ and $Y$ are independent. Thus
\[
E(h_i(X)g_j(Y))=E(h_i(X))\,E(g_j(Y))=0
\]
due to the assumption that $\mu$ is symmetric, which implies the mean of these random variables equals 0. The other term works out similarly. This proves the independence of the random variables $X'$ and $Y'$.
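The independence just established can be illustrated numerically in the simplest concrete case: for $X,Y$ independent standard Gaussians, the rotated variables $X'=(X+Y)/\sqrt{2}$ and $Y'=(X-Y)/\sqrt{2}$ should be uncorrelated, and mixed moments such as $E(X'\,(Y')^3)$, which play the role of $E\big((h_i\circ X')(g_j\circ Y')\big)$ above, should vanish. This is a hedged Monte Carlo sketch, not part of the text's argument.

```python
import numpy as np

# X, Y independent with symmetric Gaussian law mu = N(0,1).
rng = np.random.default_rng(0)
n = 1_000_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)

# The rotated pair from the text's construction.
Xp = (X + Y) / np.sqrt(2)   # X'
Yp = (X - Y) / np.sqrt(2)   # Y'

# Correlation of X' and Y' should vanish.
corr = np.corrcoef(Xp, Yp)[0, 1]

# A mixed moment E(h(X')g(Y')) with h(t)=t, g(t)=t^3 should also
# vanish, mirroring E((h_i o X')(g_j o Y')) = 0 in the text.
cross = np.mean(Xp * Yp ** 3)
```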

Next consider the claim that they have the same law and that it equals $\mu$. To do this, I will use Theorem 59.12.9 on Page 1897. Thus I need to show

\[
E\big(e^{ih(X')}\big)=E\big(e^{ih(Y')}\big)=E\big(e^{ih(X)}\big)\tag{61.7.26}
\]

for all $h\in E'$. Pick such an $h$. Then $h\circ X$ is normally distributed and has mean 0. Therefore, for some $\sigma$,

\[
E\big(e^{it\,h\circ X}\big)=e^{-\frac{1}{2}t^2\sigma^2}.
\]
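This characteristic-function identity for a mean-zero normal random variable is easy to check numerically: the sample average of $e^{itZ}$ for $Z\sim N(0,\sigma^2)$ converges to $e^{-\frac{1}{2}t^2\sigma^2}$. The value $\sigma=1.5$ below is an arbitrary choice for illustration.

```python
import numpy as np

# Monte Carlo check of E(e^{itZ}) = exp(-t^2 sigma^2 / 2)
# for Z ~ N(0, sigma^2), with sigma chosen arbitrarily.
rng = np.random.default_rng(1)
sigma = 1.5
Z = sigma * rng.standard_normal(500_000)

errors = []
for t in (0.5, 1.0, 2.0):
    empirical = np.mean(np.exp(1j * t * Z))       # sample characteristic function
    exact = np.exp(-0.5 * t ** 2 * sigma ** 2)    # closed-form value
    errors.append(abs(empirical - exact))
# Each error is small, of Monte Carlo order 1/sqrt(n).
```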
