61.7.2 Fernique's Theorem

The following is an interesting lemma.
Lemma 61.7.4 Suppose $\mu$ is a symmetric Gaussian measure on the real separable Banach space $E$. Then there exists a probability space $(\Omega,\mathscr{F},P)$ and independent random variables $X$ and $Y$ mapping $\Omega$ to $E$ such that $\mathscr{L}(X)=\mathscr{L}(Y)=\mu$. Also, the two random variables
$$\frac{1}{\sqrt{2}}\left(X-Y\right),\qquad\frac{1}{\sqrt{2}}\left(X+Y\right)$$
are independent and
$$\mathscr{L}\left(\frac{1}{\sqrt{2}}\left(X-Y\right)\right)=\mathscr{L}\left(\frac{1}{\sqrt{2}}\left(X+Y\right)\right)=\mu.$$
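As a quick illustration of what the lemma asserts, consider the one dimensional case (an orienting example, not part of the proof): if $X$ and $Y$ are independent real random variables, each with law $N(0,\sigma^2)$, then $\frac{1}{\sqrt{2}}(X+Y)$ and $\frac{1}{\sqrt{2}}(X-Y)$ are jointly normal with
$$\operatorname{Var}\left(\frac{X+Y}{\sqrt{2}}\right)=\operatorname{Var}\left(\frac{X-Y}{\sqrt{2}}\right)=\frac{\sigma^{2}+\sigma^{2}}{2}=\sigma^{2},\qquad\operatorname{Cov}\left(\frac{X+Y}{\sqrt{2}},\frac{X-Y}{\sqrt{2}}\right)=\frac{\sigma^{2}-\sigma^{2}}{2}=0,$$
so both combinations again have law $N(0,\sigma^2)$ and are independent. The lemma extends this rotation invariance to symmetric Gaussian measures on $E$.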
Proof: Letting $X'\equiv\frac{1}{\sqrt{2}}(X+Y)$ and $Y'\equiv\frac{1}{\sqrt{2}}(X-Y)$, it follows from Theorem 59.13.2 on Page 1898 that $X'$ and $Y'$ are independent if, whenever $h_1,\cdots,h_m\in E'$ and $g_1,\cdots,g_k\in E'$, the two random vectors
$$\left(h_1\circ X',\cdots,h_m\circ X'\right)\ \text{and}\ \left(g_1\circ Y',\cdots,g_k\circ Y'\right)$$
are independent. Now consider linear combinations
$$\sum_{j=1}^{m}t_jh_j\circ X'+\sum_{i=1}^{k}s_ig_i\circ Y'.$$
This equals
$$\frac{1}{\sqrt{2}}\sum_{j=1}^{m}t_jh_j\left(X\right)+\frac{1}{\sqrt{2}}\sum_{j=1}^{m}t_jh_j\left(Y\right)+\frac{1}{\sqrt{2}}\sum_{i=1}^{k}s_ig_i\left(X\right)-\frac{1}{\sqrt{2}}\sum_{i=1}^{k}s_ig_i\left(Y\right)$$
$$=\frac{1}{\sqrt{2}}\left(\sum_{j=1}^{m}t_jh_j+\sum_{i=1}^{k}s_ig_i\right)\left(X\right)+\frac{1}{\sqrt{2}}\left(\sum_{j=1}^{m}t_jh_j-\sum_{i=1}^{k}s_ig_i\right)\left(Y\right)$$
and this is the sum of two independent normally distributed random variables, so it is also normally distributed. Therefore, by Theorem 61.6.4,
$$\left(h_1\circ X',\cdots,h_m\circ X',g_1\circ Y',\cdots,g_k\circ Y'\right)$$
is a random vector with a multivariate normal distribution, and by Theorem 61.6.9 the two random vectors
$$\left(h_1\circ X',\cdots,h_m\circ X'\right)\ \text{and}\ \left(g_1\circ Y',\cdots,g_k\circ Y'\right)$$
are independent.
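The applicability of Theorem 61.6.9 comes down to the cross covariances between the two blocks being zero. A sketch of that verification, assuming Theorem 61.6.9 is the standard criterion that a jointly normal vector splits into independent blocks when these cross covariances vanish: since $\mu$ is symmetric, every $h(X)$ and $g(Y)$ with $h,g\in E'$ has mean zero, and
$$E\left[h\left(X'\right)g\left(Y'\right)\right]=\frac{1}{2}E\left[\left(h(X)+h(Y)\right)\left(g(X)-g(Y)\right)\right]=\frac{1}{2}\left(E\left[h(X)g(X)\right]-E\left[h(Y)g(Y)\right]\right)=0,$$
where the cross terms drop out by the independence of $X$ and $Y$ together with the zero means, and the remaining two terms cancel because $X$ and $Y$ have the same law $\mu$.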