63.4 An Example Of Martingales, Independent Increments
Here is an interesting lemma.
Lemma 63.4.1 Let (W(t), ℱ_t) be a stochastic process which has independent increments having values in E, a real separable Banach space. Let
\[
A \in \mathcal{F}_s \equiv \sigma\big(W(u) - W(r) : 0 \le r < u \le s\big).
\]
Suppose g(W(t) − W(s)) ∈ L¹(Ω; E). Then the following formula holds.
\[
\int_\Omega \mathcal{X}_A\, g(W(t) - W(s))\, dP = P(A) \int_\Omega g(W(t) - W(s))\, dP \tag{63.4.12}
\]
Proof: Let G denote the set of all A ∈ ℱ_s such that 63.4.12 holds. Then it is obvious G is closed with respect to complements and countable disjoint unions. Let K denote those sets which are finite intersections of the form A = ∩_{i=1}^m A_i, where each A_i is in σ(W(u_i) − W(r_i)) for some 0 ≤ r_i < u_i ≤ s. For such A, it follows
\[
A \in \sigma\big(W(u_i) - W(r_i),\ i = 1, \cdots, m\big).
\]
Now consider the random vector having values in E^{m+1},
\[
\big(W(u_1) - W(r_1), \cdots, W(u_m) - W(r_m),\ g(W(t) - W(s))\big).
\]
Let t^* ∈ (E′)^m and s^* ∈ E′. Then t^* · (W(u_1) − W(r_1), ···, W(u_m) − W(r_m)) can be written in the form g^* · (W(τ_1) − W(η_1), ···, W(τ_l) − W(η_l)) where the intervals (η_j, τ_j) are disjoint and each τ_j ≤ s. For example, suppose you have
\[
a(W(2) - W(1)) + b(W(2) - W(0)) + c(W(3) - W(1)),
\]
where obviously the increments are not disjoint. Then you would write the above expression as
\[
a(W(2) - W(1)) + b(W(2) - W(1)) + b(W(1) - W(0)) + c(W(3) - W(2)) + c(W(2) - W(1))
\]
and then you would collect the terms to obtain
\[
b(W(1) - W(0)) + (a + b + c)(W(2) - W(1)) + c(W(3) - W(2))
\]
and now these increments are disjoint.
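The rewriting step above is purely mechanical: split every increment at all the partition points that occur, then collect coefficients over each elementary subinterval. As a sketch (the function name and representation are illustrative, not from the text):

```python
# Sketch: rewrite a linear combination of overlapping increments
# sum_i coeff_i * (W(u_i) - W(r_i)) as a combination over disjoint
# subintervals, as in the a, b, c example above.
def collect_disjoint(terms):
    """terms: list of (coeff, (r, u)) with r < u.
    Returns {(eta, tau): total coefficient} over disjoint intervals."""
    points = sorted({p for _, (r, u) in terms for p in (r, u)})
    out = {}
    for coeff, (r, u) in terms:
        # split (r, u) at every partition point lying inside it
        cuts = [p for p in points if r <= p <= u]
        for eta, tau in zip(cuts, cuts[1:]):
            out[(eta, tau)] = out.get((eta, tau), 0) + coeff
    # drop intervals whose collected coefficient cancels to zero
    return {iv: c for iv, c in out.items() if c != 0}

# The example from the text with a = 1, b = 2, c = 3:
result = collect_disjoint([(1, (1, 2)), (2, (0, 2)), (3, (1, 3))])
# result maps (0,1) -> b, (1,2) -> a+b+c, (2,3) -> c
```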
Therefore, by independence of the increments,
\[
E\Big(\exp\, i\big(t^* \cdot (W(u_1) - W(r_1), \cdots, W(u_m) - W(r_m)) + s^*(g(W(t) - W(s)))\big)\Big)
\]
\[
= E\Big(\exp\, i\big(g^* \cdot (W(\tau_1) - W(\eta_1), \cdots, W(\tau_l) - W(\eta_l)) + s^*(g(W(t) - W(s)))\big)\Big)
\]
\[
= \prod_{j=1}^{l} E\Big(\exp\big(i g_j^* (W(\tau_j) - W(\eta_j))\big)\Big)\, E\Big(\exp\big(i s^*(g(W(t) - W(s)))\big)\Big)
\]
\[
= E\Big(\exp\big(i\, t^* \cdot (W(u_1) - W(r_1), \cdots, W(u_m) - W(r_m))\big)\Big) \cdot E\Big(\exp\big(i s^*(g(W(t) - W(s)))\big)\Big).
\]
Now by Theorem 58.13.3, 63.4.15 follows. Next pass to the limit in both sides of 63.4.15 as ε → 0. One can do this because of 63.4.13, which implies the functions in the integrands are uniformly integrable, together with Vitali's convergence theorem, Theorem 19.5.7. This yields 63.4.14.
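Formula 63.4.12 can be checked numerically in a simple scalar case (a Monte Carlo sketch, not from the text): take a real-valued Wiener process, s = 1, t = 2, A = [W(1) − W(0) > 0] ∈ ℱ_s, and g = |·|. Then ∫_Ω 𝒳_A g(W(t) − W(s)) dP should match P(A) ∫_Ω g(W(t) − W(s)) dP.

```python
import random

random.seed(0)

# Monte Carlo check of 63.4.12 with s = 1, t = 2, A = [W(1) - W(0) > 0],
# and g = absolute value, using independent Gaussian increments.
N = 200_000
lhs_sum = 0.0   # accumulates X_A * g(W(t) - W(s))
count_A = 0     # counts occurrences of the event A
g_sum = 0.0     # accumulates g(W(t) - W(s))
for _ in range(N):
    inc_s = random.gauss(0.0, 1.0)   # W(1) - W(0), measurable w.r.t. F_s
    inc_t = random.gauss(0.0, 1.0)   # W(2) - W(1), independent of F_s
    g_val = abs(inc_t)
    g_sum += g_val
    if inc_s > 0.0:
        count_A += 1
        lhs_sum += g_val

lhs = lhs_sum / N                   # approximates int_Omega X_A g dP
rhs = (count_A / N) * (g_sum / N)   # approximates P(A) int_Omega g dP
```

With N this large the two estimates agree to a couple of decimal places, since E|W(2) − W(1)| = √(2/π) and P(A) = 1/2.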
Now consider the part about the stochastic process being a martingale. Let g be the identity map. If A ∈ ℱ_s, the above implies
\[
\int_A E(W(t)|\mathcal{F}_s)\, dP = \int_A W(t)\, dP = \int_A (W(t) - W(s))\, dP + \int_A W(s)\, dP
\]
\[
= P(A) \int_\Omega (W(t) - W(s))\, dP + \int_A W(s)\, dP = \int_A W(s)\, dP,
\]
the last equality holding because the increments have mean zero. Since A is arbitrary, E(W(t)|ℱ_s) = W(s). ■
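The martingale identity ∫_A W(t) dP = ∫_A W(s) dP can also be checked by simulation (a sketch with illustrative choices, not from the text): take s = 1, t = 2, and A = [W(1) > 0.5] ∈ ℱ_s.

```python
import random

random.seed(1)

# Monte Carlo check of int_A W(t) dP = int_A W(s) dP for A in F_s,
# with s = 1, t = 2, A = [W(1) > 0.5].
N = 200_000
int_A_Wt = 0.0
int_A_Ws = 0.0
for _ in range(N):
    w_s = random.gauss(0.0, 1.0)          # W(1) ~ N(0, 1)
    w_t = w_s + random.gauss(0.0, 1.0)    # W(2) = W(1) + independent N(0, 1)
    if w_s > 0.5:                         # the event A, decided by time s
        int_A_Wt += w_t
        int_A_Ws += w_s

diff = abs(int_A_Wt - int_A_Ws) / N       # should be near 0
```

The difference is small because, conditional on ℱ_s, the increment W(2) − W(1) averages to zero on A.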
Note this implies immediately, from Lemma 62.1.5, that the Wiener process is not of bounded variation on any interval. This is because this lemma implies that if it were of bounded variation, then it would be constant, which is not the case due to
\[
\mathcal{L}(W(t) - W(s)) = \mathcal{L}(W(t - s)) = \mathcal{L}\big(\sqrt{t - s}\, W(1)\big).
\]
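The scaling law makes the unbounded variation visible numerically (a sketch, not from the text): over a partition of [0, 1] into n pieces, each increment has law N(0, 1/n), so the expected sum of absolute increments is n · √(2/(πn)) = √(2n/π), which blows up as the partition is refined.

```python
import math
import random

random.seed(2)

def sampled_variation(n):
    # Sum of |W(t_{k+1}) - W(t_k)| over a uniform partition of [0, 1]
    # into n pieces; by the scaling law each increment is N(0, 1/n).
    step_sd = math.sqrt(1.0 / n)
    return sum(abs(random.gauss(0.0, step_sd)) for _ in range(n))

# Refining the partition makes the sampled variation grow like sqrt(2 n / pi).
variations = [sampled_variation(4 ** k) for k in range(1, 7)]
```

For n = 4096 the expected value is about 51, versus about 1.6 for n = 4, so the simulated totals increase sharply along the list.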
Here is an interesting theorem about approximation.
Theorem 63.4.3 Let {W(t)} be a Wiener process having values in a separable Banach space as described in Theorem 63.3.4. There exists a set of measure 0, N, such that for ω
Recall Lemma 58.15.6, stated below for convenience.
Lemma 63.4.4 Let {ζ_k} be a sequence of random variables having values in a separable real Banach space E whose distributions are symmetric. Letting S_k ≡ ∑_{i=1}^k ζ_i, suppose {S_{n_k}} converges a.e. Also suppose that for every m > n_k,
\[
P\big(\big[\|S_m - S_{n_k}\|_E > 2^{-k}\big]\big) < 2^{-k}. \tag{63.4.16}
\]
Then in fact,
\[
S_k(\omega) \to S(\omega) \text{ a.e. } \omega. \tag{63.4.17}
\]
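A heuristic for why the lemma should hold (a sketch using Lévy's maximal inequality for symmetric sums, not the proof of Lemma 58.15.6 itself):

```latex
% By Levy's maximal inequality for sums of independent symmetric terms,
% the pointwise bound (63.4.16) at m = n_{k+1} controls the whole block:
\[
  P\Big(\max_{n_k < m \le n_{k+1}} \|S_m - S_{n_k}\|_E > 2^{-k}\Big)
  \;\le\; 2\, P\big(\|S_{n_{k+1}} - S_{n_k}\|_E > 2^{-k}\big) \;<\; 2^{1-k}.
\]
% Since \sum_k 2^{1-k} < \infty, Borel--Cantelli gives, for a.e. \omega,
% a K(\omega) such that
% \sup_{n_k < m \le n_{k+1}} \|S_m - S_{n_k}\|_E \le 2^{-k}
% for all k \ge K(\omega); hence the full sequence S_m converges to the
% same limit as the a.e. convergent subsequence S_{n_k}.
```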
Apply this lemma to the situation in which the Banach space E is C([0, T]; E) and ζ_k = ψ_k e_k. Then you can conclude uniform convergence of the partial sums,
\[
\sum_{k=1}^{m} \psi_k(t)\, e_k.
\]
This proves the theorem.
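The functions ψ_k and vectors e_k are not pinned down in this excerpt. One classical scalar instance of such a uniformly convergent series (an assumption here, with E = ℝ and e_k = 1) is the Karhunen–Loève sine expansion of Brownian motion on [0, 1]:

```python
import math
import random

random.seed(3)

# Assumed concrete instance (the text's psi_k, e_k are unspecified here):
#   W(t) = sum_{k >= 0} xi_k * sqrt(2) sin((k + 1/2) pi t) / ((k + 1/2) pi)
# with xi_k i.i.d. standard normal.
def psi(k, t):
    return math.sqrt(2.0) * math.sin((k + 0.5) * math.pi * t) / ((k + 0.5) * math.pi)

xi = [random.gauss(0.0, 1.0) for _ in range(1000)]

def W_partial(t, m):
    # Partial sum S_m(t) = sum_{k < m} xi_k * psi_k(t), a sampled path value.
    return sum(xi[k] * psi(k, t) for k in range(m))

# Sanity check on the variance at t = 1:
#   Var W(1) = sum_k 2 / ((k + 1/2) pi)^2 = 1.
var_1000 = sum(2.0 / ((k + 0.5) * math.pi) ** 2 for k in range(1000))
```

The deterministic sum var_1000 is within about 2·10⁻⁴ of 1, consistent with W(1) being standard normal in this expansion.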
Why is C([0, T]; E) separable? You can assume without loss of generality that the interval is [0, 1] and consider the Bernstein polynomials
\[
p_n(t) \equiv \sum_{k=0}^{n} \binom{n}{k} f\Big(\frac{k}{n}\Big)\, t^k (1 - t)^{n - k}.
\]
These converge uniformly to f. Now look at all polynomials of the form
\[
\sum_{k=0}^{n} a_k\, t^k (1 - t)^{n - k}
\]
where each a_k is in a countable dense set and n ∈ ℕ. Each Bernstein polynomial is uniformly close to one of these and also uniformly close to f. Hence polynomials of this sort are countable and dense in C([0, 1]; E).
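The uniform convergence of the Bernstein polynomials is easy to observe numerically (a sketch for scalar f, i.e. E = ℝ assumed; the function and grid are illustrative):

```python
import math

def bernstein(f, n, t):
    # p_n(t) = sum_{k=0}^{n} C(n, k) f(k/n) t^k (1 - t)^(n - k)
    return sum(math.comb(n, k) * f(k / n) * (t ** k) * ((1.0 - t) ** (n - k))
               for k in range(n + 1))

f = lambda t: abs(t - 0.5)   # continuous but not differentiable at 1/2
grid = [i / 100 for i in range(101)]

def sup_error(n):
    # Sampled sup-norm distance between p_n and f on [0, 1].
    return max(abs(bernstein(f, n, t) - f(t)) for t in grid)
```

For this f the sampled sup error drops from about 0.19 at n = 4 to about 0.05 at n = 64, and p_n interpolates f exactly at the endpoints.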