Proof: Consider $E(e^{it\cdot X})$ for $X \sim N_p(m,\Sigma)$.
$$E\big(e^{it\cdot X}\big) \equiv \frac{1}{(2\pi)^{p/2}(\det\Sigma)^{1/2}}\int_{\mathbb{R}^p} e^{it\cdot x}e^{-\frac{1}{2}(x-m)^*\Sigma^{-1}(x-m)}\,dx.$$
Let $R$ be an orthogonal transformation such that
$$R\Sigma R^* = D = \operatorname{diag}\big(\sigma_1^2,\cdots,\sigma_p^2\big).$$
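Such an $R$ exists by the spectral theorem, since $\Sigma$ is symmetric and positive definite; the $\sigma_i^2$ are the (positive) eigenvalues of $\Sigma$. In particular,
$$\det\Sigma = \det\big(R^*DR\big) = \det D = \prod_{i=1}^p \sigma_i^2,$$
which is where the constant $\prod_{i=1}^p \sigma_i$ in the displays below comes from.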
Let $R(x-m) = y$. Then
$$E\big(e^{it\cdot X}\big) = \frac{1}{(2\pi)^{p/2}\prod_{i=1}^p \sigma_i}\int_{\mathbb{R}^p} e^{it\cdot(R^*y+m)}e^{-\frac{1}{2}y^*D^{-1}y}\,dy.$$
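To spell out this change of variables: $x = R^*y + m$, so $dx = |\det R^*|\,dy = dy$, and
$$(x-m)^*\Sigma^{-1}(x-m) = y^*R\Sigma^{-1}R^*y = y^*\big(R\Sigma R^*\big)^{-1}y = y^*D^{-1}y,$$
where $R\Sigma^{-1}R^* = (R\Sigma R^*)^{-1}$ because $R^* = R^{-1}$.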
Therefore,
$$E\big(e^{it\cdot X}\big) = \frac{1}{(2\pi)^{p/2}\prod_{i=1}^p \sigma_i}\int_{\mathbb{R}^p} e^{is\cdot(y+Rm)}e^{-\frac{1}{2}y^*D^{-1}y}\,dy,$$
where $s = Rt$. This equals
$$e^{it\cdot m}\prod_{i=1}^p \frac{1}{\sqrt{2\pi}\,\sigma_i}\int_{\mathbb{R}} e^{is_iy_i}e^{-\frac{1}{2\sigma_i^2}y_i^2}\,dy_i = e^{it\cdot m}\prod_{i=1}^p \frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{is_i\sigma_iu}e^{-\frac{1}{2}u^2}\,du$$
(letting $y_i = \sigma_i u$)
$$= e^{it\cdot m}\prod_{i=1}^p e^{-\frac{1}{2}s_i^2\sigma_i^2}\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}} e^{-\frac{1}{2}(u-is_i\sigma_i)^2}\,du,$$
the last step by completing the square in the exponent. By Lemma 28.0.1, this equals $e^{it\cdot m}e^{-\frac{1}{2}\sum_{i=1}^p s_i^2\sigma_i^2} = e^{it\cdot m}e^{-\frac{1}{2}t^*\Sigma t}$. This proves 28.2.
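To see the final identity $\sum_{i=1}^p s_i^2\sigma_i^2 = t^*\Sigma t$ explicitly: since $s = Rt$ and $\Sigma = R^*DR$,
$$\sum_{i=1}^p s_i^2\sigma_i^2 = s^*Ds = (Rt)^*D(Rt) = t^*R^*DRt = t^*\Sigma t.$$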
Since $X_1$ and $X_2$ are independent, $e^{it\cdot X_1}$ and $e^{it\cdot X_2}$ are also independent. Hence
$$E\big(e^{it\cdot(X_1+X_2)}\big) = E\big(e^{it\cdot X_1}\big)E\big(e^{it\cdot X_2}\big) = e^{it\cdot m_1}e^{-\frac{1}{2}t^*\Sigma_1t}e^{it\cdot m_2}e^{-\frac{1}{2}t^*\Sigma_2t} = e^{it\cdot(m_1+m_2)}e^{-\frac{1}{2}t^*(\Sigma_1+\Sigma_2)t},$$
which, as shown above, is the characteristic function of a random vector distributed as $N_p(m_1+m_2,\Sigma_1+\Sigma_2)$. It now follows from Theorem 27.1.4 that $X_1+X_2 \sim N_p(m_1+m_2,\Sigma_1+\Sigma_2)$. This proves 28.1.
The assertion about $-X$ is also easy to see because
$$E\big(e^{it\cdot(-X)}\big) = E\big(e^{i(-t)\cdot X}\big) = \frac{1}{(2\pi)^{p/2}(\det\Sigma)^{1/2}}\int_{\mathbb{R}^p} e^{i(-t)\cdot x}e^{-\frac{1}{2}(x-m)^*\Sigma^{-1}(x-m)}\,dx$$
$$= \frac{1}{(2\pi)^{p/2}(\det\Sigma)^{1/2}}\int_{\mathbb{R}^p} e^{it\cdot x}e^{-\frac{1}{2}(x+m)^*\Sigma^{-1}(x+m)}\,dx.$$
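The last equality is the substitution $x \to -x$, whose Jacobian has absolute value $1$. The resulting expression is the characteristic function of an $N_p(-m,\Sigma)$ random vector, so $-X \sim N_p(-m,\Sigma)$ by Theorem 27.1.4.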