
Also, if $X\sim N_p\left(m,\Sigma\right)$ then $-X\sim N_p\left(-m,\Sigma\right)$. Furthermore, if $X\sim N_p\left(m,\Sigma\right)$ then
\[
E\left(e^{it\cdot X}\right)=e^{it\cdot m}e^{-\frac{1}{2}t^{\ast}\Sigma t} \tag{59.16.32}
\]
Also, if $a$ is a constant and $X\sim N_p\left(m,\Sigma\right)$ then $aX\sim N_p\left(am,a^{2}\Sigma\right)$.
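For instance, once 59.16.32 is established (it is proved below), the scaling claim follows by computing the characteristic function of $aX$ for a real constant $a$:
\[
E\left(e^{it\cdot\left(aX\right)}\right)=E\left(e^{i\left(at\right)\cdot X}\right)=e^{i\left(at\right)\cdot m}e^{-\frac{1}{2}\left(at\right)^{\ast}\Sigma\left(at\right)}=e^{it\cdot\left(am\right)}e^{-\frac{1}{2}t^{\ast}\left(a^{2}\Sigma\right)t},
\]
which is the characteristic function of $N_p\left(am,a^{2}\Sigma\right)$.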

Proof: Consider $E\left(e^{it\cdot X}\right)$ for $X\sim N_p\left(m,\Sigma\right)$.
\[
E\left(e^{it\cdot X}\right)\equiv\frac{1}{\left(2\pi\right)^{p/2}\left(\det\Sigma\right)^{1/2}}\int_{\mathbb{R}^{p}}e^{it\cdot x}e^{-\frac{1}{2}\left(x-m\right)^{\ast}\Sigma^{-1}\left(x-m\right)}\,dx.
\]

Let $R$ be an orthogonal transformation such that
\[
R\Sigma R^{\ast}=D=\operatorname{diag}\left(\sigma_{1}^{2},\cdots,\sigma_{p}^{2}\right).
\]

Then let $R\left(x-m\right)=y$. Then
\[
E\left(e^{it\cdot X}\right)=\frac{1}{\left(2\pi\right)^{p/2}\prod_{i=1}^{p}\sigma_{i}}\int_{\mathbb{R}^{p}}e^{it\cdot\left(R^{\ast}y+m\right)}e^{-\frac{1}{2}y^{\ast}D^{-1}y}\,dy.
\]

Therefore,
\[
E\left(e^{it\cdot X}\right)=\frac{1}{\left(2\pi\right)^{p/2}\prod_{i=1}^{p}\sigma_{i}}\int_{\mathbb{R}^{p}}e^{is\cdot\left(y+Rm\right)}e^{-\frac{1}{2}y^{\ast}D^{-1}y}\,dy,
\]
where $s=Rt$. This equals

\begin{align*}
&e^{it\cdot m}\prod_{i=1}^{p}\left(\int_{\mathbb{R}}e^{is_{i}y_{i}}e^{-\frac{1}{2\sigma_{i}^{2}}y_{i}^{2}}\,dy_{i}\right)\frac{1}{\sqrt{2\pi}\,\sigma_{i}}=e^{it\cdot m}\prod_{i=1}^{p}\left(\int_{\mathbb{R}}e^{is_{i}\sigma_{i}u}e^{-\frac{1}{2}u^{2}}\,du\right)\frac{1}{\sqrt{2\pi}}\\
&=e^{it\cdot m}\prod_{i=1}^{p}e^{-\frac{1}{2}s_{i}^{2}\sigma_{i}^{2}}\frac{1}{\sqrt{2\pi}}\int_{\mathbb{R}}e^{-\frac{1}{2}\left(u-is_{i}\sigma_{i}\right)^{2}}\,du\\
&=e^{it\cdot m}e^{-\frac{1}{2}\sum_{i=1}^{p}s_{i}^{2}\sigma_{i}^{2}}=e^{it\cdot m}e^{-\frac{1}{2}t^{\ast}\Sigma t}.
\end{align*}

(For the last equality, note that $s=Rt$ and $\Sigma=R^{\ast}DR$, so $\sum_{i=1}^{p}s_{i}^{2}\sigma_{i}^{2}=s^{\ast}Ds=t^{\ast}R^{\ast}DRt=t^{\ast}\Sigma t$.) This proves 59.16.32.
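As a quick numerical sanity check of 59.16.32, the following minimal sketch (not from the text) compares a Monte Carlo estimate of $E\left(e^{it\cdot X}\right)$ with the closed form; the particular $m$, $\Sigma$, and $t$ are hypothetical values chosen only for illustration.
\begin{verbatim}
# Minimal sketch: Monte Carlo check of 59.16.32.
# m, Sigma, and t below are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(0)
m = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
t = np.array([0.3, -0.7])

# Estimate E(exp(i t.X)) from samples of X ~ N_p(m, Sigma).
X = rng.multivariate_normal(m, Sigma, size=200_000)
empirical = np.mean(np.exp(1j * (X @ t)))

# Closed form e^{i t.m} e^{-t* Sigma t / 2} from 59.16.32.
closed_form = np.exp(1j * (t @ m) - 0.5 * (t @ Sigma @ t))

print(empirical, closed_form)  # agree up to Monte Carlo error
\end{verbatim}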

Since $X_{1}$ and $X_{2}$ are independent, $e^{it\cdot X_{1}}$ and $e^{it\cdot X_{2}}$ are also independent. Hence
\[
E\left(e^{it\cdot\left(X_{1}+X_{2}\right)}\right)=E\left(e^{it\cdot X_{1}}\right)E\left(e^{it\cdot X_{2}}\right).
\]
Thus,
\begin{align*}
E\left(e^{it\cdot\left(X_{1}+X_{2}\right)}\right)&=E\left(e^{it\cdot X_{1}}\right)E\left(e^{it\cdot X_{2}}\right)=e^{it\cdot m_{1}}e^{-\frac{1}{2}t^{\ast}\Sigma_{1}t}e^{it\cdot m_{2}}e^{-\frac{1}{2}t^{\ast}\Sigma_{2}t}\\
&=e^{it\cdot\left(m_{1}+m_{2}\right)}e^{-\frac{1}{2}t^{\ast}\left(\Sigma_{1}+\Sigma_{2}\right)t}.
\end{align*}
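The last expression is the characteristic function of $N_p\left(m_{1}+m_{2},\Sigma_{1}+\Sigma_{2}\right)$, so $X_{1}+X_{2}\sim N_p\left(m_{1}+m_{2},\Sigma_{1}+\Sigma_{2}\right)$ by uniqueness of characteristic functions. The following minimal sketch (not from the text, with hypothetical parameters) checks this numerically by comparing a Monte Carlo estimate of $E\left(e^{it\cdot\left(X_{1}+X_{2}\right)}\right)$ against $e^{it\cdot\left(m_{1}+m_{2}\right)}e^{-\frac{1}{2}t^{\ast}\left(\Sigma_{1}+\Sigma_{2}\right)t}$.
\begin{verbatim}
# Minimal sketch: check the sum of independent multivariate normals.
# All parameters below are hypothetical, chosen only for illustration.
import numpy as np

rng = np.random.default_rng(1)
m1, m2 = np.array([1.0, 0.0]), np.array([-1.0, 2.0])
S1 = np.array([[1.0, 0.2],
               [0.2, 1.0]])
S2 = np.array([[0.5, 0.0],
               [0.0, 2.0]])
t = np.array([0.4, -0.3])

# Independent samples of X1 ~ N_p(m1, S1) and X2 ~ N_p(m2, S2).
X1 = rng.multivariate_normal(m1, S1, size=200_000)
X2 = rng.multivariate_normal(m2, S2, size=200_000)
X_sum = X1 + X2

empirical = np.mean(np.exp(1j * (X_sum @ t)))
closed_form = np.exp(1j * (t @ (m1 + m2)) - 0.5 * (t @ (S1 + S2) @ t))

print(empirical, closed_form)  # agree up to Monte Carlo error
\end{verbatim}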
