
Proof: First let $E$ be a Borel set in $\mathbb{R}^{k}$. From the definition,
\[
\lambda_{g(\mathbf{X}_{1},\cdots,\mathbf{X}_{n})}\left(E\right)=P\left(g\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)\in E\right)=P\left(\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)\in g^{-1}\left(E\right)\right)=\lambda_{\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)}\left(g^{-1}\left(E\right)\right)
\]
Hence
\[
\int_{\mathbb{R}^{k}}\mathscr{X}_{E}\,d\lambda_{g(\mathbf{X}_{1},\cdots,\mathbf{X}_{n})}=\int_{\mathbb{R}^{p_{1}}\times\cdots\times\mathbb{R}^{p_{n}}}\mathscr{X}_{g^{-1}(E)}\,d\lambda_{\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)}=\int_{\mathbb{R}^{p_{1}}\times\cdots\times\mathbb{R}^{p_{n}}}\mathscr{X}_{E}\left(g\left(\mathbf{x}_{1},\cdots,\mathbf{x}_{n}\right)\right)d\lambda_{\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)}.
\]

This proves 27.3 in the case when $h$ is $\mathscr{X}_{E}$. To prove it in the general case, approximate the nonnegative Borel measurable function with simple functions for which the formula is true, and use the monotone convergence theorem.
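As a concrete illustration of 27.3 (the particular choices of $g$ and $h$ here are not in the original), take $k=1$, $p_{1}=p_{2}=1$, $g\left(x_{1},x_{2}\right)=x_{1}+x_{2}$ and $h\left(y\right)=y^{2}$. The formula then reads
\[
\int_{\mathbb{R}}y^{2}\,d\lambda_{X_{1}+X_{2}}\left(y\right)=\int_{\mathbb{R}\times\mathbb{R}}\left(x_{1}+x_{2}\right)^{2}d\lambda_{\left(X_{1},X_{2}\right)}\left(x_{1},x_{2}\right),
\]
so the expectation of $\left(X_{1}+X_{2}\right)^{2}$ may be computed either from the distribution of the sum or from the joint distribution.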

It remains to prove the last assertion that functions of independent random vectors are also independent random vectors. Let $E$ be a Borel set in $\mathbb{R}^{k_{1}}\times\cdots\times\mathbb{R}^{k_{n}}$. Then for $\pi_{i}\left(\mathbf{x}_{1},\cdots,\mathbf{x}_{n}\right)\equiv\mathbf{x}_{i}$, using independence of the $\mathbf{X}_{i}$ so that $\lambda_{\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)}$ is the product of the $\lambda_{\mathbf{X}_{i}}$, and then the first part of the proof in each variable,
\begin{align*}
\int_{\mathbb{R}^{k_{1}}\times\cdots\times\mathbb{R}^{k_{n}}}\mathscr{X}_{E}\,d\lambda_{\left(g_{1}\left(\mathbf{X}_{1}\right),\cdots,g_{n}\left(\mathbf{X}_{n}\right)\right)} & \equiv\int_{\mathbb{R}^{p_{1}}\times\cdots\times\mathbb{R}^{p_{n}}}\mathscr{X}_{E}\circ\left(g_{1}\circ\pi_{1},\cdots,g_{n}\circ\pi_{n}\right)d\lambda_{\left(\mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right)}\\
 & =\int_{\mathbb{R}^{p_{1}}}\cdots\int_{\mathbb{R}^{p_{n}}}\mathscr{X}_{E}\circ\left(g_{1}\circ\pi_{1},\cdots,g_{n}\circ\pi_{n}\right)d\lambda_{\mathbf{X}_{n}}\cdots d\lambda_{\mathbf{X}_{1}}\\
 & =\int_{\mathbb{R}^{k_{1}}}\cdots\int_{\mathbb{R}^{k_{n}}}\mathscr{X}_{E}\,d\lambda_{g_{n}\left(\mathbf{X}_{n}\right)}\cdots d\lambda_{g_{1}\left(\mathbf{X}_{1}\right)}\ \blacksquare
\end{align*}
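To see how this identity gives independence, one can, as an illustration not in the original, specialize to $n=2$ and a measurable rectangle $E=F_{1}\times F_{2}$ with $F_{i}$ Borel in $\mathbb{R}^{k_{i}}$. Since $\mathscr{X}_{F_{1}\times F_{2}}\left(\mathbf{y}_{1},\mathbf{y}_{2}\right)=\mathscr{X}_{F_{1}}\left(\mathbf{y}_{1}\right)\mathscr{X}_{F_{2}}\left(\mathbf{y}_{2}\right)$, the iterated integral factors:
\[
\lambda_{\left(g_{1}\left(\mathbf{X}_{1}\right),g_{2}\left(\mathbf{X}_{2}\right)\right)}\left(F_{1}\times F_{2}\right)=\int_{\mathbb{R}^{k_{1}}}\mathscr{X}_{F_{1}}\int_{\mathbb{R}^{k_{2}}}\mathscr{X}_{F_{2}}\,d\lambda_{g_{2}\left(\mathbf{X}_{2}\right)}d\lambda_{g_{1}\left(\mathbf{X}_{1}\right)}=\lambda_{g_{1}\left(\mathbf{X}_{1}\right)}\left(F_{1}\right)\lambda_{g_{2}\left(\mathbf{X}_{2}\right)}\left(F_{2}\right),
\]
so the joint distribution factors over measurable rectangles, which is equivalent to the independence of $g_{1}\left(\mathbf{X}_{1}\right)$ and $g_{2}\left(\mathbf{X}_{2}\right)$.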

Of course if $\mathbf{X}_{i}$, $i=1,2,\ldots,n$ are independent, this means the $\sigma$ algebras $\sigma\left(\mathbf{X}_{i}\right)$ are independent. Now $\sigma\left(g_{i}\circ\mathbf{X}_{i}\right)\subseteq\sigma\left(\mathbf{X}_{i}\right)$ because
\[
\left(g_{i}\circ\mathbf{X}_{i}\right)^{-1}\left(\text{Borel set}\right)=\mathbf{X}_{i}^{-1}\left(g_{i}^{-1}\left(\text{Borel set}\right)\right)=\mathbf{X}_{i}^{-1}\left(\text{Borel set}\right)\in\sigma\left(\mathbf{X}_{i}\right)
\]
and so the variables $g_{i}\circ\mathbf{X}_{i}$, $i=1,2,\ldots,n$ are independent. I think this is a more direct way of seeing this second claim.
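For instance (an illustration not in the original), if $X$ and $Y$ are independent real valued random variables, then since
\[
\sigma\left(X^{2}\right)\subseteq\sigma\left(X\right),\qquad\sigma\left(e^{Y}\right)\subseteq\sigma\left(Y\right),
\]
the above observation shows that $X^{2}$ and $e^{Y}$ are independent.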

Proposition 27.2.5 Let $\nu_{1},\cdots,\nu_{n}$ be Radon probability measures defined on $\mathbb{R}^{p}$. Then there exists a probability space and independent random vectors $\left\{ \mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right\}$ defined on this probability space such that $\lambda_{\mathbf{X}_{i}}=\nu_{i}$.

Proof: Let $\left(\Omega,\mathscr{S},P\right)\equiv\left(\left(\mathbb{R}^{p}\right)^{n},\mathscr{S}_{1}\times\cdots\times\mathscr{S}_{n},\nu_{1}\times\cdots\times\nu_{n}\right)$ where this is just the product $\sigma$ algebra and product measure which satisfies the following for measurable rectangles:
\[
\left(\nu_{1}\times\cdots\times\nu_{n}\right)\left(\prod_{i=1}^{n}E_{i}\right)=\prod_{i=1}^{n}\nu_{i}\left(E_{i}\right).
\]
Now let $\mathbf{X}_{i}\left(\mathbf{x}_{1},\cdots,\mathbf{x}_{i},\cdots,\mathbf{x}_{n}\right)=\mathbf{x}_{i}$.
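A sketch of why this construction works, using only the rectangle formula displayed above: for $E$ Borel in $\mathbb{R}^{p}$,
\[
\lambda_{\mathbf{X}_{i}}\left(E\right)=P\left(\mathbf{X}_{i}\in E\right)=\left(\nu_{1}\times\cdots\times\nu_{n}\right)\left(\mathbb{R}^{p}\times\cdots\times E\times\cdots\times\mathbb{R}^{p}\right)=\nu_{i}\left(E\right)
\]
because $\nu_{j}\left(\mathbb{R}^{p}\right)=1$ for $j\neq i$, and for Borel sets $E_{1},\cdots,E_{n}$,
\[
P\left(\mathbf{X}_{1}\in E_{1},\cdots,\mathbf{X}_{n}\in E_{n}\right)=\left(\nu_{1}\times\cdots\times\nu_{n}\right)\left(\prod_{i=1}^{n}E_{i}\right)=\prod_{i=1}^{n}\nu_{i}\left(E_{i}\right)=\prod_{i=1}^{n}P\left(\mathbf{X}_{i}\in E_{i}\right),
\]
so the $\mathbf{X}_{i}$ are independent and $\lambda_{\mathbf{X}_{i}}=\nu_{i}$.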
