59.16. THE MULTIVARIATE NORMAL DISTRIBUTION 1915

Corollary 59.16.5 Let $X = (X_1, \cdots, X_p)$, $Y = (Y_1, \cdots, Y_p)$ where each $X_i, Y_i$ is a real valued random variable. Suppose also that for every $a \in \mathbb{R}^p$, $a \cdot X$ and $a \cdot Y$ are both normally distributed with the same mean and variance. Then $X$ and $Y$ are both multivariate normal random vectors with the same mean and covariance.

Proof: The proof of Theorem 59.16.4 shows that the characteristic functions of $a \cdot X$ and $a \cdot Y$ are both of the form

$$e^{itm} e^{-\frac{1}{2}\sigma^2 t^2}.$$

Then as in the proof of that theorem, it must be the case that

$$m = \sum_{j=1}^{p} a_j m_j$$

where $E(X_i) = m_i = E(Y_i)$ and

$$\sigma^2 = a^* E\left( (X-m)(X-m)^* \right) a = a^* E\left( (Y-m)(Y-m)^* \right) a$$

and this last equation must hold for every $a$. Since both expectations are Hermitian matrices and their quadratic forms agree for every $a$, the matrices themselves must be equal. Therefore,

$$E\left( (X-m)(X-m)^* \right) = E\left( (Y-m)(Y-m)^* \right) \equiv \Sigma$$

and so the characteristic function of both $X$ and $Y$ is $e^{i s \cdot m} e^{-\frac{1}{2} s^* \Sigma s}$ as in the proof of Theorem 59.16.4.
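The step from equality of the quadratic forms to equality of the matrices is a standard polarization argument, sketched here for completeness (it is not spelled out in the text): let $A$ denote the difference of the two covariance matrices, so $A$ is Hermitian and $a^* A a = 0$ for every $a$. Testing against standard basis vectors $e_j$ gives

```latex
\begin{align*}
a &= e_j:         & A_{jj} &= 0, \\
a &= e_j + e_k:   & A_{jk} + A_{kj} &= 0, \\
a &= e_j + i e_k: & i\left( A_{jk} - A_{kj} \right) &= 0,
\end{align*}
```

and the last two lines together force $A_{jk} = 0$ for all $j, k$, hence $A = 0$.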

Theorem 59.16.6 Suppose $X = (X_1, \cdots, X_p)$ is normally distributed with mean $m$ and covariance $\Sigma$. Then if $X_1$ is uncorrelated with each of the other $X_j$, meaning

$$E\left( (X_1 - m_1)(X_j - m_j) \right) = 0 \text{ for } j > 1,$$

then $X_1$ and $(X_2, \cdots, X_p)$ are both normally distributed and the two random vectors are independent. Here $m_j \equiv E(X_j)$. More generally, if the covariance matrix is a diagonal matrix, the random variables $\{X_1, \cdots, X_p\}$ are independent.

Proof: From Theorem 59.16.2,

$$\Sigma = E\left( (X-m)(X-m)^* \right).$$

Then by assumption,

$$\Sigma = \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \Sigma_{p-1} \end{pmatrix}. \tag{59.16.35}$$

I need to verify that if $E \in \sigma(X_1)$ and $F \in \sigma(X_2, \cdots, X_p)$, then

$$P(E \cap F) = P(E) P(F).$$
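A sketch of why the block-diagonal form 59.16.35 yields this independence: writing $s = (s_1, \tilde{s})$ with the hypothetical shorthand $\tilde{s} \equiv (s_2, \cdots, s_p)$ and $\tilde{m} \equiv (m_2, \cdots, m_p)$ (notation not used in the text above), the quadratic form $s^* \Sigma s$ splits into $\sigma_1^2 s_1^2 + \tilde{s}^* \Sigma_{p-1} \tilde{s}$, so the characteristic function of $X$ factors:

```latex
E\, e^{i s \cdot X}
= e^{i s \cdot m} e^{-\frac{1}{2} s^* \Sigma s}
= \left( e^{i s_1 m_1 - \frac{1}{2} \sigma_1^2 s_1^2} \right)
  \left( e^{i \tilde{s} \cdot \tilde{m} - \frac{1}{2} \tilde{s}^* \Sigma_{p-1} \tilde{s}} \right),
```

the product of the characteristic functions of $X_1$ and of $(X_2, \cdots, X_p)$, which is exactly the condition that forces $P(E \cap F) = P(E) P(F)$ for $E \in \sigma(X_1)$ and $F \in \sigma(X_2, \cdots, X_p)$.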
