
Proof: From Theorem 28.1.2, $\Sigma = E\left( \left( X-m\right)\left( X-m\right)^{\ast}\right)$. Then by assumption,
\[
\Sigma = \begin{pmatrix} \sigma_1^2 & 0 \\ 0 & \Sigma_{p-1} \end{pmatrix}. \tag{28.3}
\]
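In terms of the entries of $\Sigma$, the block form 28.3 is just the statement that $\sigma_1^2$ is the variance of $X_1$, that $\Sigma_{p-1}$ is the covariance matrix of $\left( X_2,\cdots,X_p\right)$, and that $X_1$ is uncorrelated with each of the remaining coordinates:
\[
\sigma_1^2 = E\left( \left( X_1-m_1\right)^2\right),\qquad
\left( \Sigma_{p-1}\right)_{jk} = E\left( \left( X_{j+1}-m_{j+1}\right)\left( X_{k+1}-m_{k+1}\right)\right),\qquad
E\left( \left( X_1-m_1\right)\left( X_j-m_j\right)\right) = 0\ \text{for } j\geq 2.
\]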

I need to verify that if $E \in \sigma\left( X_1\right)$ and $F \in \sigma\left( X_2,\cdots,X_p\right)$, then $P\left( E\cap F\right) = P\left( E\right) P\left( F\right)$.

Let $E = X_1^{-1}\left( A\right)$ and $F = \left( X_2,\cdots,X_p\right)^{-1}\left( B\right)$ where $A$ and $B$ are Borel sets in $\mathbb{R}$ and $\mathbb{R}^{p-1}$ respectively. Thus I need to verify that
\[
P\left( \left[ \left( X_1,\left( X_2,\cdots,X_p\right)\right) \in \left( A,B\right)\right]\right)
= \mu_{\left( X_1,\left( X_2,\cdots,X_p\right)\right)}\left( A\times B\right)
= \mu_{X_1}\left( A\right) \mu_{\left( X_2,\cdots,X_p\right)}\left( B\right). \tag{28.4}
\]
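The computation which follows uses tacitly two elementary consequences of the block form 28.3: both the determinant and the inverse respect the blocks, so
\[
\det\left( \Sigma\right) = \sigma_1^2 \det\left( \Sigma_{p-1}\right),\qquad
\Sigma^{-1} = \begin{pmatrix} \sigma_1^{-2} & 0 \\ 0 & \Sigma_{p-1}^{-1} \end{pmatrix},
\]
and therefore, in the notation $x' = \left( x_2,\cdots,x_p\right)$, $m' = \left( m_2,\cdots,m_p\right)$ used below,
\[
\left( x-m\right)^{\ast}\Sigma^{-1}\left( x-m\right)
= \frac{\left( x_1-m_1\right)^2}{\sigma_1^2} + \left( x'-m'\right)^{\ast}\Sigma_{p-1}^{-1}\left( x'-m'\right).
\]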

Using 28.3, Fubini's theorem, and definitions,
\[
\mu_{\left( X_1,\left( X_2,\cdots,X_p\right)\right)}\left( A\times B\right)
= \int_{\mathbb{R}^p} \mathcal{X}_{A\times B}\left( x\right)
\frac{1}{\left( 2\pi\right)^{p/2}\det\left( \Sigma\right)^{1/2}}
e^{-\frac{1}{2}\left( x-m\right)^{\ast}\Sigma^{-1}\left( x-m\right)}\,dx
\]
\[
= \int_{\mathbb{R}} \mathcal{X}_{A}\left( x_1\right) \int_{\mathbb{R}^{p-1}} \mathcal{X}_{B}\left( x_2,\cdots,x_p\right) \cdot
\frac{1}{\left( 2\pi\right)^{\left( p-1\right)/2}\sqrt{2\pi}\left( \sigma_1^2\right)^{1/2}\det\left( \Sigma_{p-1}\right)^{1/2}}
\, e^{-\frac{\left( x_1-m_1\right)^2}{2\sigma_1^2}} \cdot
e^{-\frac{1}{2}\left( x'-m'\right)^{\ast}\Sigma_{p-1}^{-1}\left( x'-m'\right)}\,dx'\,dx_1
\]
where $x' = \left( x_2,\cdots,x_p\right)$ and $m' = \left( m_2,\cdots,m_p\right)$. Now this equals
\[
\int_{\mathbb{R}} \mathcal{X}_{A}\left( x_1\right) \frac{1}{\sqrt{2\pi\sigma_1^2}}
e^{-\frac{\left( x_1-m_1\right)^2}{2\sigma_1^2}} \cdot
\int_{B} \frac{1}{\left( 2\pi\right)^{\left( p-1\right)/2}\det\left( \Sigma_{p-1}\right)^{1/2}}
e^{-\frac{1}{2}\left( x'-m'\right)^{\ast}\Sigma_{p-1}^{-1}\left( x'-m'\right)}\,dx'\,dx_1. \tag{28.5}
\]

In case $B = \mathbb{R}^{p-1}$, the inside integral equals 1 and
\[
\mu_{X_1}\left( A\right) = \mu_{\left( X_1,\left( X_2,\cdots,X_p\right)\right)}\left( A\times\mathbb{R}^{p-1}\right)
= \int_{\mathbb{R}} \mathcal{X}_{A}\left( x_1\right) \frac{1}{\sqrt{2\pi\sigma_1^2}}
e^{-\frac{\left( x_1-m_1\right)^2}{2\sigma_1^2}}\,dx_1
\]

which shows $X_1$ is normally distributed as claimed. Similarly, letting $A = \mathbb{R}$,
\[
\mu_{\left( X_2,\cdots,X_p\right)}\left( B\right) = \mu_{\left( X_1,\left( X_2,\cdots,X_p\right)\right)}\left( \mathbb{R}\times B\right)
= \int_{B} \frac{1}{\left( 2\pi\right)^{\left( p-1\right)/2}\det\left( \Sigma_{p-1}\right)^{1/2}}
e^{-\frac{1}{2}\left( x'-m'\right)^{\ast}\Sigma_{p-1}^{-1}\left( x'-m'\right)}\,dx'
\]

and $\left( X_2,\cdots,X_p\right)$ is also normally distributed with mean $m'$ and covariance $\Sigma_{p-1}$. Now from 28.5, 28.4 follows. In case the covariance matrix is diagonal, the above reasoning extends in an obvious way to prove the random variables $\left\{ X_1,\cdots,X_p\right\}$ are independent.
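To spell out the diagonal case: if $\Sigma = \operatorname{diag}\left( \sigma_1^2,\cdots,\sigma_p^2\right)$, the same block computations give $\det\left( \Sigma\right) = \prod_{j=1}^{p}\sigma_j^2$ and $\left( x-m\right)^{\ast}\Sigma^{-1}\left( x-m\right) = \sum_{j=1}^{p}\left( x_j-m_j\right)^2/\sigma_j^2$, so the density factors completely,
\[
\frac{1}{\left( 2\pi\right)^{p/2}\det\left( \Sigma\right)^{1/2}}
e^{-\frac{1}{2}\left( x-m\right)^{\ast}\Sigma^{-1}\left( x-m\right)}
= \prod_{j=1}^{p} \frac{1}{\sqrt{2\pi\sigma_j^2}}\, e^{-\frac{\left( x_j-m_j\right)^2}{2\sigma_j^2}},
\]
and Fubini's theorem applied to a product of Borel sets gives $\mu_{\left( X_1,\cdots,X_p\right)}\left( A_1\times\cdots\times A_p\right) = \prod_{j=1}^{p}\mu_{X_j}\left( A_j\right)$, which is exactly the claimed independence of $\left\{ X_1,\cdots,X_p\right\}$.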
