Corollary 59.16.5 Let $\mathbf{X} = (X_1, \cdots, X_p)$, $\mathbf{Y} = (Y_1, \cdots, Y_p)$ where each $X_i, Y_i$ is a real valued random variable. Suppose also that for every $\mathbf{a} \in \mathbb{R}^p$, $\mathbf{a} \cdot \mathbf{X}$ and $\mathbf{a} \cdot \mathbf{Y}$ are both normally distributed with the same mean and variance. Then $\mathbf{X}$ and $\mathbf{Y}$ are both multivariate normal random vectors with the same mean and covariance.
Proof: The argument in the proof of Theorem 59.16.4 shows that the characteristic functions of $\mathbf{a} \cdot \mathbf{X}$ and $\mathbf{a} \cdot \mathbf{Y}$ are both of the form
\[
e^{itm} e^{-\frac{1}{2}\sigma^{2}t^{2}}.
\]
Then as in the proof of that theorem, it must be the case that
\[
m = \sum_{j=1}^{p} a_j m_j,
\]
where $E(X_i) = m_i = E(Y_i)$, and
\[
\sigma^{2} = \mathbf{a}^{\ast} E\left( (\mathbf{X}-\mathbf{m})(\mathbf{X}-\mathbf{m})^{\ast} \right) \mathbf{a}
= \mathbf{a}^{\ast} E\left( (\mathbf{Y}-\mathbf{m})(\mathbf{Y}-\mathbf{m})^{\ast} \right) \mathbf{a},
\]
and this last equation must hold for every $\mathbf{a} \in \mathbb{R}^p$. Since a real symmetric matrix is determined by its quadratic form, it follows that
\[
E\left( (\mathbf{X}-\mathbf{m})(\mathbf{X}-\mathbf{m})^{\ast} \right) = E\left( (\mathbf{Y}-\mathbf{m})(\mathbf{Y}-\mathbf{m})^{\ast} \right) \equiv \Sigma,
\]
and so the characteristic function of both $\mathbf{X}$ and $\mathbf{Y}$ is $e^{i\mathbf{s}\cdot\mathbf{m}} e^{-\frac{1}{2}\mathbf{s}^{\ast}\Sigma\mathbf{s}}$, as in the proof of Theorem 59.16.4.
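To see concretely what this gives, consider the case $p = 2$ with $\mathbf{a} = (a_1, a_2)$, writing $\Sigma_{jk}$ for the entries of $\Sigma$ (a small illustration; the notation $\Sigma_{jk}$ is introduced only for this example). The mean and variance of $\mathbf{a} \cdot \mathbf{X}$ are
\[
E(\mathbf{a} \cdot \mathbf{X}) = a_1 m_1 + a_2 m_2,
\qquad
\mathbf{a}^{\ast} \Sigma\, \mathbf{a} = a_1^{2}\Sigma_{11} + 2 a_1 a_2 \Sigma_{12} + a_2^{2}\Sigma_{22}.
\]
Taking $\mathbf{a} = \mathbf{e}_1$ and $\mathbf{a} = \mathbf{e}_2$ recovers $m_1, m_2$ and the diagonal entries of $\Sigma$, and then $\mathbf{a} = (1,1)$ recovers $\Sigma_{12}$, so agreement of these quantities for $\mathbf{X}$ and $\mathbf{Y}$ for every $\mathbf{a}$ pins down the common mean and covariance.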
Theorem 59.16.6 Suppose $\mathbf{X} = (X_1, \cdots, X_p)$ is normally distributed with mean $\mathbf{m}$ and covariance $\Sigma$. Then if $X_1$ is uncorrelated with each of the other $X_j$, meaning
\[
E\left( (X_1 - m_1)(X_j - m_j) \right) = 0 \text{ for } j > 1,
\]
then $X_1$ and $(X_2, \cdots, X_p)$ are both normally distributed and the two random vectors are independent. Here $m_j \equiv E(X_j)$. More generally, if the covariance matrix is a diagonal matrix, the random variables $\{X_1, \cdots, X_p\}$ are independent.
Proof: From Theorem 59.16.2
\[
\Sigma = E\left( (\mathbf{X}-\mathbf{m})(\mathbf{X}-\mathbf{m})^{\ast} \right).
\]
Then by assumption,
\[
\Sigma = \begin{pmatrix} \sigma_1^{2} & 0 \\ 0 & \Sigma_{p-1} \end{pmatrix}. \tag{59.16.35}
\]
I need to verify that if $E \in \sigma(X_1)$ and $F \in \sigma(X_2, \cdots, X_p)$, then
\[
P(E \cap F) = P(E) P(F).
\]
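A sketch of why the block diagonal form of $\Sigma$ should yield this (a summary of the standard characteristic function argument, not necessarily the exact route taken below): writing $\mathbf{t} = (t_1, \mathbf{t}')$ and $\mathbf{m} = (m_1, \mathbf{m}')$, the form 59.16.35 gives $\mathbf{t}^{\ast}\Sigma\mathbf{t} = \sigma_1^{2} t_1^{2} + \mathbf{t}'^{\ast}\Sigma_{p-1}\mathbf{t}'$, so
\[
E\left( e^{i\mathbf{t}\cdot\mathbf{X}} \right)
= e^{i\mathbf{t}\cdot\mathbf{m}} e^{-\frac{1}{2}\mathbf{t}^{\ast}\Sigma\mathbf{t}}
= \left( e^{i t_1 m_1} e^{-\frac{1}{2}\sigma_1^{2} t_1^{2}} \right)
\left( e^{i\mathbf{t}'\cdot\mathbf{m}'} e^{-\frac{1}{2}\mathbf{t}'^{\ast}\Sigma_{p-1}\mathbf{t}'} \right),
\]
a product of the characteristic functions of $X_1$ and $(X_2, \cdots, X_p)$, which is what the claimed independence amounts to.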