776 CHAPTER 28. THE NORMAL DISTRIBUTION

these measures $\mu_{h_1\cdots h_m}$. To determine whether this is so, take the characteristic function of $\nu$. Let $\Sigma_1$ be the $n\times n$ matrix which comes from the $\{k_1,\cdots,k_n\}$ and let $\Sigma_2$ be the one which comes from the $\{h_1,\cdots,h_m\}$. Then
$$\int_{\mathbb{R}^m} e^{it\cdot x}\,d\nu(x) \equiv \int_{\mathbb{R}^{n-m}}\int_{\mathbb{R}^m} e^{i(t,0)\cdot(x,y)}\,d\mu_{k_1\cdots k_n}(x,y) = e^{-\frac{1}{2}(t^*,0^*)\Sigma_1(t,0)} = e^{-\frac{1}{2}t^*\Sigma_2 t}$$

which is the characteristic function for $\mu_{h_1\cdots h_m}$. Therefore, these two measures are the same and the Kolmogorov consistency condition holds. It follows from the Kolmogorov extension theorem, Theorem 20.3.3, that there exists a measure $\mu$ defined on the Borel sets of $\prod_{h\in H}\mathbb{R}$ which extends all of these measures. This argument also shows that if a random vector $X$ has characteristic function $e^{-\frac{1}{2}t^*\Sigma t}$ and $X_k$ is one of its components, then the characteristic function of $X_k$ is $e^{-\frac{1}{2}t^2|h_k|^2}$, so this scalar valued random variable has mean zero and variance $|h_k|^2$. Now for $\omega\in\prod_{h\in H}\mathbb{R}$, define $W(h)(\omega)\equiv\pi_h(\omega)$, where $\pi_h$ denotes the projection onto position $h$ in this product. Also define
$$(W(f_1),W(f_2),\cdots,W(f_n))\equiv\pi_{f_1\cdots f_n}(\omega).$$
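The marginalization step behind the consistency argument (the marginal of a centered Gaussian with covariance $\Sigma_1$ onto a subset of coordinates is the centered Gaussian whose covariance is the corresponding submatrix $\Sigma_2$) can be sanity-checked numerically. The following sketch uses a hypothetical finite-dimensional stand-in with $n=4$ and $m=2$; it estimates the characteristic function of the first $m$ coordinates by Monte Carlo and compares it with $e^{-\frac{1}{2}t^*\Sigma_2 t}$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-dimensional stand-in: n = 4 coordinates, marginal onto m = 2.
n, m = 4, 2
A = rng.standard_normal((n, n))
Sigma1 = A @ A.T              # a positive definite n x n covariance matrix
Sigma2 = Sigma1[:m, :m]       # submatrix corresponding to the kept coordinates

# Sample X ~ N(0, Sigma1) and estimate the characteristic function of the
# first m coordinates at a fixed t by Monte Carlo.
X = rng.multivariate_normal(np.zeros(n), Sigma1, size=200_000)
t = np.array([0.3, -0.5])
empirical = np.mean(np.exp(1j * X[:, :m] @ t))

# The claim: this matches the N(0, Sigma2) characteristic function.
exact = np.exp(-0.5 * t @ Sigma2 @ t)
print(abs(empirical - exact))   # small Monte Carlo error
```

The agreement up to Monte Carlo error reflects exactly the identity $e^{-\frac{1}{2}(t^*,0^*)\Sigma_1(t,0)} = e^{-\frac{1}{2}t^*\Sigma_2 t}$ used in the proof.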

Then this is a random vector whose covariance matrix is just $\Sigma_{ij}=(f_i,f_j)_H$ and whose characteristic function is $e^{-\frac{1}{2}t^*\Sigma t}$, so this verifies that
$$(W(f_1),W(f_2),\cdots,W(f_n))$$
is normally distributed with covariance $\Sigma$. If you have two of them, $W(g),W(h)$, then $E(W(h)W(g))=(h,g)_H$. This follows from what was just shown: $(W(f),W(g))$ is normally distributed, and so its covariance matrix is
$$\begin{pmatrix} |f|^2 & (f,g)\\ (f,g) & |g|^2 \end{pmatrix} = \begin{pmatrix} E\left(W(f)^2\right) & E(W(f)W(g))\\ E(W(f)W(g)) & E\left(W(g)^2\right) \end{pmatrix}.$$
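The identity $E(W(f)W(g))=(f,g)_H$ can also be checked empirically in a finite-dimensional model. In the sketch below, $\mathbb{R}^3$ with its usual inner product stands in for $H$ (an illustrative assumption, not the construction itself), and $W(h)=h\cdot Z$ with $Z$ a standard Gaussian vector realizes the map:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for H: R^3 with the usual inner product.
f = np.array([1.0, -2.0, 0.5])
g = np.array([0.0, 1.0, 3.0])

# A concrete realization of W on R^3: W(h) = h . Z with Z iid standard normal,
# so that E(W(f) W(g)) = f . g by a direct computation.
N = 500_000
Z = rng.standard_normal((N, 3))
Wf, Wg = Z @ f, Z @ g

# Empirical second moments should match the Gram matrix of (f, g).
emp = np.cov(np.vstack([Wf, Wg]))
gram = np.array([[f @ f, f @ g],
                 [f @ g, g @ g]])
print(np.round(emp, 2))
print(np.round(gram, 2))
```

The empirical $2\times 2$ covariance matches the displayed matrix of inner products up to sampling error.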

Finally consider the claim about independence. Any finite subset of $\{W(e_i)\}$ is generalized normal with diagonal covariance matrix. Therefore,
$$(W(e_{i_1}),\cdots,W(e_{i_n}))$$
is normally distributed with diagonal covariance, so by Theorem 28.2.3 the random variables $\{W(e_i)\}$ are independent. ■
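The independence conclusion can be illustrated numerically as well. In the finite-dimensional stand-in below (again an illustrative assumption), $W(e_1)$ and $W(e_2)$ come from orthonormal vectors, so their joint covariance is diagonal; for jointly normal variables this forces independence, which shows up as a product rule for probabilities:

```python
import numpy as np

rng = np.random.default_rng(2)

# Orthonormal vectors in R^3 standing in for the e_i.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])

N = 500_000
Z = rng.standard_normal((N, 3))
W1, W2 = Z @ e1, Z @ e2   # jointly Gaussian with diagonal (identity) covariance

# Uncorrelated jointly normal variables are independent; check the correlation
# and a product-of-probabilities identity for two events.
corr = np.corrcoef(W1, W2)[0, 1]
pA, pB = np.mean(W1 > 1), np.mean(W2 > 1)
pAB = np.mean((W1 > 1) & (W2 > 1))
print(corr, pAB, pA * pB)
```

Both the correlation and the gap $|P(A\cap B)-P(A)P(B)|$ are zero up to sampling error, as the diagonal covariance predicts.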

28.5 The Central Limit Theorem

The central limit theorem is one of the most marvelous theorems in mathematics. It can be proved through the use of characteristic functions. Recall for $x\in\mathbb{R}^p$,
$$\|x\|_\infty \equiv \max\left\{|x_j|,\ j=1,\cdots,p\right\}.$$
Also recall the definition of the distribution function for a random vector $X$:
$$F_X(x)\equiv P(X_j\le x_j,\ j=1,\cdots,p).$$
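The characteristic-function mechanism behind the theorem can be previewed numerically: if $\varphi$ is the characteristic function of a mean-zero, variance-one distribution, then $\varphi(t/\sqrt{n})^n$, the characteristic function of the normalized sum of $n$ i.i.d. copies, approaches $e^{-t^2/2}$. A sketch using the uniform distribution on $[-\sqrt{3},\sqrt{3}]$ (chosen so the variance is $1$):

```python
import numpy as np

# Characteristic function of Uniform[-sqrt(3), sqrt(3)], which has mean 0 and
# variance 1.  numpy's sinc(x) = sin(pi x)/(pi x), so phi(t) = sin(a t)/(a t).
a = np.sqrt(3.0)
phi = lambda t: np.sinc(a * t / np.pi)

t = 1.3
n = 2000
# Characteristic function of (X_1 + ... + X_n)/sqrt(n):
cf_sum = phi(t / np.sqrt(n)) ** n
cf_normal = np.exp(-0.5 * t ** 2)   # standard normal characteristic function
print(cf_sum, cf_normal)
```

The two values agree to several decimal places already for this moderate $n$, which is the convergence the characteristic-function proof makes precise.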

How can you tell if a sequence of random vectors with values in $\mathbb{R}^p$ is tight? The next lemma gives a way to do this. It is Lemma 28.4.3. I am stating it here for convenience.
