and $(X_2, \cdots, X_p)$ is also normally distributed with mean $m'$ and covariance $\Sigma_{p-1}$. Now from 59.16.37, 59.16.36 follows. In case the covariance matrix is diagonal, the above reasoning extends in an obvious way to prove the random variables $\{X_1, \cdots, X_p\}$ are independent.
However, another way to prove this is to use Proposition 59.11.1 on Page 1891 and consider the characteristic function. Let $E(X_j) = m_j$ and
\[
P = \sum_{j=1}^{p} t_j X_j .
\]
Then, since $X$ is normally distributed and the covariance is the diagonal matrix
\[
D \equiv \left( \begin{array}{ccc} \sigma_1^2 & & 0 \\ & \ddots & \\ 0 & & \sigma_p^2 \end{array} \right),
\]
\[
E\left(e^{iP}\right) = E\left(e^{it\cdot X}\right) = e^{it\cdot m}\, e^{-\frac{1}{2} t^{*}\Sigma t}
= \exp\left( \sum_{j=1}^{p} \left( i t_j m_j - \frac{1}{2} t_j^{2}\sigma_j^{2} \right) \right) \tag{59.16.39}
\]
\[
= \prod_{j=1}^{p} \exp\left( i t_j m_j - \frac{1}{2} t_j^{2}\sigma_j^{2} \right).
\]
Also,
\[
E\left(e^{i t_j X_j}\right) = E\left( \exp\left( i t_j X_j + \sum_{k\neq j} i0\, X_k \right) \right) = \exp\left( i t_j m_j - \frac{1}{2} t_j^{2}\sigma_j^{2} \right).
\]
With 59.16.39, this shows
\[
E\left(e^{iP}\right) = \prod_{j=1}^{p} E\left(e^{i t_j X_j}\right),
\]
and so, by Proposition 59.11.1, the random variables $\{X_1, \cdots, X_p\}$ are independent.
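
For instance, specializing 59.16.39 to the case $p = 2$ makes the factorization explicit:
\[
E\left(e^{i(t_1 X_1 + t_2 X_2)}\right) = \exp\left( i t_1 m_1 - \frac{1}{2} t_1^{2}\sigma_1^{2} \right) \exp\left( i t_2 m_2 - \frac{1}{2} t_2^{2}\sigma_2^{2} \right) = E\left(e^{i t_1 X_1}\right) E\left(e^{i t_2 X_2}\right),
\]
so the joint characteristic function of $(X_1, X_2)$ is the product of the two marginal characteristic functions, which is precisely the factorization Proposition 59.11.1 requires for independence.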
59.17 Use Of Characteristic Functions To Find Moments

Let $X$ be a random variable with characteristic function
\[
\phi_X(t) \equiv E\left(\exp(itX)\right).
\]
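As a simple illustration of the technique (using the normal characteristic function computed above, and assuming differentiation under the expectation is justified): if $X$ is normal with mean $0$ and variance $1$, then $\phi_X(t) = e^{-t^{2}/2}$, and
\[
\phi_X''(t) = \left(t^{2} - 1\right) e^{-t^{2}/2}, \qquad \phi_X''(0) = -1 = i^{2} E\left(X^{2}\right),
\]
so $E\left(X^{2}\right) = 1$, as expected for a standard normal random variable.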