28.2. LINEAR COMBINATIONS 763

Conversely, suppose every linear combination is normally distributed. In particular, $\sum_{j=1}^{p} a_j X_j = \mathbf{a} \cdot \mathbf{X}$ is normally distributed with some mean $\mu$ and variance $\sigma^2$, so that its characteristic function is given as $e^{it\mu} e^{-\frac{1}{2} t^2 \sigma^2}$. I will now relate $\mu$ and $\sigma^2$ to various quantities involving the $X_j$. Letting $m_j = E(X_j)$ and $\mathbf{m} = (m_1, \cdots, m_p)$,

\[
\mu = \sum_{j=1}^{p} a_j E(X_j) = \sum_{j=1}^{p} a_j m_j, \qquad
\sigma^2 = E\left( \sum_{j=1}^{p} a_j X_j - \sum_{j=1}^{p} a_j m_j \right)^2
= E\left( \sum_{j=1}^{p} a_j (X_j - m_j) \right)^2
= \sum_{j,k} a_j a_k E\left( (X_j - m_j)(X_k - m_k) \right).
\]

It follows that the mean of the random variable $\mathbf{a} \cdot \mathbf{X}$ is $\mu = \sum_j a_j m_j = \mathbf{a} \cdot \mathbf{m}$ and its variance is
\[
\sigma^2 = \mathbf{a}^* E\left( (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^* \right) \mathbf{a}.
\]
Therefore,
\[
E\left( e^{it \mathbf{a} \cdot \mathbf{X}} \right) = e^{it\mu} e^{-\frac{1}{2} t^2 \sigma^2} = e^{it \mathbf{a} \cdot \mathbf{m}} e^{-\frac{1}{2} t^2 \mathbf{a}^* E\left( (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^* \right) \mathbf{a}}.
\]
Letting $\mathbf{s} = t\mathbf{a}$, this shows
\[
E\left( e^{i \mathbf{s} \cdot \mathbf{X}} \right) = e^{i \mathbf{s} \cdot \mathbf{m}} e^{-\frac{1}{2} \mathbf{s}^* E\left( (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^* \right) \mathbf{s}} = e^{i \mathbf{s} \cdot \mathbf{m}} e^{-\frac{1}{2} \mathbf{s}^* \Sigma \mathbf{s}},
\]
which is the characteristic function of a normally distributed random variable with $\mathbf{m}$ given above and $\Sigma$ given by
\[
\Sigma_{jk} = E\left( (X_j - m_j)(X_k - m_k) \right).
\]

By assumption, $\mathbf{a}$ is completely arbitrary and so it follows that $\mathbf{s}$ is also. Hence, $\mathbf{X}$ is normally distributed as claimed. ■
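As a quick check of the variance identity above, take $p = 2$ and $\mathbf{a} = (1,1)$. Then
\[
\sigma^2 = \sum_{j,k} a_j a_k \Sigma_{jk} = \Sigma_{11} + \Sigma_{12} + \Sigma_{21} + \Sigma_{22} = \operatorname{Var}(X_1) + 2\operatorname{Cov}(X_1, X_2) + \operatorname{Var}(X_2),
\]
which is the familiar formula for the variance of a sum of two random variables.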

Corollary 28.2.2 Let $\mathbf{X} = (X_1, \cdots, X_p)$, $\mathbf{Y} = (Y_1, \cdots, Y_p)$ where each $X_i, Y_i$ is a real valued random variable. Suppose also that for every $\mathbf{a} \in \mathbb{R}^p$, $\mathbf{a} \cdot \mathbf{X}$ and $\mathbf{a} \cdot \mathbf{Y}$ are both normally distributed with the same mean and variance. Then $\mathbf{X}$ and $\mathbf{Y}$ are both multivariate normal random vectors with the same mean and covariance.

Proof: The proof of Theorem 28.2.1 shows that the characteristic functions of $\mathbf{a} \cdot \mathbf{X}$ and $\mathbf{a} \cdot \mathbf{Y}$ are both of the form $e^{it\mu} e^{-\frac{1}{2} \sigma^2 t^2}$. Then as in the proof of that theorem, it must be the case that $\mu = \sum_{j=1}^{p} a_j m_j$ where $E(X_i) = m_i = E(Y_i)$ and

\[
\sigma^2 = \mathbf{a}^* E\left( (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^* \right) \mathbf{a} = \mathbf{a}^* E\left( (\mathbf{Y} - \mathbf{m})(\mathbf{Y} - \mathbf{m})^* \right) \mathbf{a},
\]
and this last equation must hold for every $\mathbf{a}$. Therefore,
\[
E\left( (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^* \right) = E\left( (\mathbf{Y} - \mathbf{m})(\mathbf{Y} - \mathbf{m})^* \right) \equiv \Sigma,
\]

and so the characteristic function of both $\mathbf{X}$ and $\mathbf{Y}$ is $e^{i \mathbf{s} \cdot \mathbf{m}} e^{-\frac{1}{2} \mathbf{s}^* \Sigma \mathbf{s}}$ as in the proof of Theorem 28.2.1. ■

Theorem 28.2.3 Suppose $\mathbf{X} = (X_1, \cdots, X_p)$ is normally distributed with mean $\mathbf{m}$ and covariance $\Sigma$. Then if $X_1$ is uncorrelated with each of the other $X_j$, meaning
\[
E\left( (X_1 - m_1)(X_j - m_j) \right) = 0 \text{ for } j > 1,
\]
then $X_1$ and $(X_2, \cdots, X_p)$ are both normally distributed and the two random vectors are independent. Here $m_j \equiv E(X_j)$. More generally, if the covariance matrix is a diagonal matrix, the random variables $\{X_1, \cdots, X_p\}$ are independent.
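To see why a diagonal covariance forces independence, consider the case $p = 2$ with $\Sigma = \operatorname{diag}(\sigma_1^2, \sigma_2^2)$. Then $\mathbf{s}^* \Sigma \mathbf{s} = \sigma_1^2 s_1^2 + \sigma_2^2 s_2^2$, so the characteristic function of $\mathbf{X}$ factors:
\[
e^{i \mathbf{s} \cdot \mathbf{m}} e^{-\frac{1}{2} \mathbf{s}^* \Sigma \mathbf{s}} = \left( e^{i s_1 m_1} e^{-\frac{1}{2} \sigma_1^2 s_1^2} \right) \left( e^{i s_2 m_2} e^{-\frac{1}{2} \sigma_2^2 s_2^2} \right),
\]
the product of the characteristic functions of $X_1$ and $X_2$, which is exactly the characteristic function of a pair of independent normal random variables.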
