1876 CHAPTER 59. BASIC PROBABILITY

This is the same as
\[
\left\{ \omega : \Bigl\| \sum_{j=k+1}^{l} X_j(\omega) \Bigr\| < 1/n,\ k \ge m \right\} \in \sigma\Bigl( \cup_{j=p-1}^{\infty} \sigma(X_j) \Bigr)
\]
Thus
\[
\cup_{m=p}^{\infty} \cap_{l,k \ge m} \left\{ \omega : \| S_k(\omega) - S_l(\omega) \| < 1/n \right\} \in \sigma\Bigl( \cup_{j=p-1}^{\infty} \sigma(X_j) \Bigr)
\]

and so the intersection for all p of these is a tail event. Then the intersection over all n of these tail events is a tail event.

From this it can be concluded that if you have a sequence of independent random variables, {X_k}, the set where it converges has probability either 1 or 0. A similar conclusion holds for the set where the infinite sum of these random variables converges. This incredible assertion is the next corollary.

Corollary 59.6.6 Let {X_k} be a sequence of independent random variables having values in a Banach space. Then
\[
\lim_{n\to\infty} X_n(\omega)
\]
either exists for a.e. ω or the convergence fails to take place for a.e. ω. Also if
\[
A \equiv \Bigl\{ \omega : \sum_{k=1}^{\infty} X_k(\omega) \text{ converges} \Bigr\},
\]
then P(A) = 0 or 1.
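The all-or-nothing behavior in the corollary can be seen numerically. The sketch below (our illustration, not part of the text) simulates two series of independent random variables with Rademacher signs: ∑ ±1/k², which converges absolutely on every path, and the random walk ∑ ±1, which converges on no path. Since convergence cannot be tested directly in finite time, the oscillation of the tail partial sums is used as a proxy; the cutoffs and sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
trials, n = 100, 20_000
signs = rng.choice([-1.0, 1.0], size=(trials, n))  # independent Rademacher signs
k = np.arange(1, n + 1)

# Series 1: X_k = sign_k / k^2 converges absolutely, so every sample
# path settles down -- the convergence set has probability 1.
conv_paths = np.cumsum(signs / k**2, axis=1)
tail = conv_paths[:, n // 2:]
frac_conv = np.mean(tail.max(axis=1) - tail.min(axis=1) < 1e-3)

# Series 2: X_k = sign_k is a random walk, which never settles -- the
# convergence set has probability 0.
walk_paths = np.cumsum(signs, axis=1)
wtail = walk_paths[:, n // 2:]
frac_walk = np.mean(wtail.max(axis=1) - wtail.min(axis=1) < 1e-3)

print(frac_conv, frac_walk)  # 1.0 0.0: all or nothing, as the corollary predicts
```

In each case the fraction of sample paths on which the series "converges" is 0 or 1, never something in between.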

59.7 Kolmogorov’s Inequality

Kolmogorov’s inequality is a very interesting inequality which depends on independence of a set of random vectors. The random vectors have values in R^n or more generally some real separable Hilbert space.

Lemma 59.7.1 If Y, X are independent random variables having values in a real separable Hilbert space, H with E(|X|²), E(|Y|²) < ∞, then
\[
\int_\Omega (X,Y)\, dP = \Bigl( \int_\Omega X\, dP,\ \int_\Omega Y\, dP \Bigr).
\]
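The lemma says that for independent X and Y the expectation of the inner product is the inner product of the expectations. A quick Monte Carlo check (our sketch: R³ stands in for the Hilbert space H, and the Gaussian means and scales are arbitrary choices) compares the two sides on simulated data.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200_000, 3
# Independent Gaussian vectors in R^3 standing in for H; nonzero means
# are chosen so the right-hand side is not trivially computed.
X = rng.normal(loc=[1.0, -2.0, 0.5], scale=1.0, size=(n, d))
Y = rng.normal(loc=[0.5, 1.0, 3.0], scale=2.0, size=(n, d))

lhs = np.mean(np.sum(X * Y, axis=1))   # estimates the integral of (X, Y)
rhs = X.mean(axis=0) @ Y.mean(axis=0)  # estimates (E X, E Y)
print(abs(lhs - rhs) < 0.1)            # True up to Monte Carlo error
```

The two estimates agree to within sampling error; dropping independence (e.g. taking Y = X) breaks the identity.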

Proof: Let {e_k} be a complete orthonormal basis. Thus
\[
\int_\Omega (X,Y)\, dP = \int_\Omega \sum_{k=1}^{\infty} (X,e_k)(Y,e_k)\, dP
\]
Now
\[
\int_\Omega \sum_{k=1}^{\infty} |(X,e_k)(Y,e_k)|\, dP \le \int_\Omega \Bigl( \sum_k |(X,e_k)|^2 \Bigr)^{1/2} \Bigl( \sum_k |(Y,e_k)|^2 \Bigr)^{1/2} dP
\]
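The pointwise estimate inside this integral is the Cauchy–Schwarz inequality in ℓ², applied to the coefficient sequences (X, e_k) and (Y, e_k). A small numerical sanity check of that inequality on arbitrary coefficient sequences (truncated to 1000 terms for illustration):

```python
import math
import random

random.seed(0)
# Two arbitrary square-summable coefficient sequences, truncated at 1000 terms.
a = [random.gauss(0, 1) for _ in range(1000)]
b = [random.gauss(0, 1) for _ in range(1000)]

lhs = sum(abs(x * y) for x, y in zip(a, b))
rhs = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
print(lhs <= rhs)  # True: the Cauchy-Schwarz bound used in the proof
```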
