3 Itô Calculus Flashcards
Itô calculus
the main tool for studying stochastic differential equations
e.g. consider the differential equation
x′(t) = 0.5 x(t)
x(0) = 1
solution is… (graph in next example)
x(t) = C exp(0.5t); the initial condition forces C = 1, so x(t) = exp(0.5t)
In general for the diff. eq.
X′(t) = f(X(t)), the solution
can be approximated by defining
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ) f(X(tₙ)),
since X′(t) is defined by a limit,
X′(t) ≈ (X(t+h) − X(t))/h,
so the solution is the limit of this recursion
Consider a graph (time t against X(t)): it looks like the right-hand branch of x², with X(1) ≈ 35. This is
the graph of a trajectory of the dynamical system
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),
X(0) = x
with f(x) = 6x, starting from x = 0.1, and time-step given by tₙ₊₁ − tₙ = 1/100 (interpolated
between the discrete time points).
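The recursion above can be run directly; a minimal Python sketch (NumPy assumed) with the stated choices f(x) = 6x, X(0) = 0.1 and step 1/100:

```python
import numpy as np

# Explicit Euler scheme X(t_{n+1}) = X(t_n) + (t_{n+1} - t_n) f(X(t_n))
# with f(x) = 6x, X(0) = 0.1 and time-step 1/100, as in the notes.
f = lambda x: 6 * x
dt = 1 / 100
x = 0.1
traj = [x]
for _ in range(100):              # integrate up to t = 1
    x = x + dt * f(x)
    traj.append(x)

print(traj[-1])                   # Euler approximation of X(1)
print(0.1 * np.exp(6 * 1.0))      # exact solution 0.1 e^{6t} at t = 1
```

The Euler value sits below the exact one here (since 1 + 6h ≤ e^{6h}); shrinking the step closes the gap.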
This type of dynamics is a good approximation of the evolution of many quantities found in nature. However, in practice, the majority of systems in physics, finance,
and biology, among others, are noisy
noisy?
Look, for example, at a stock price; more specifically,
say the stock price of Apple over the last 10 years. While there is obviously an exponential trend,
the price does not follow a “smooth” trajectory as in the previous example, but rather fluctuates around it
the differential equation is a limit!
in practice we observe what happens at discrete times:
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),
Adding this noisiness:
what we do is we take the original trajectory:
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),
and we add in a random variable ξₙ :
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ))+σ(X(tₙ))ξₙ,
The RVs ξₙ are Gaussian:
ξₙ ~ N(0, tₙ₊₁ − tₙ)
X(0) = x,
independent
(motivation)
Example in notes:
ξₙ~N(0,1/n)
and
σ(x)=x
where ξₙ can be modelled by Brownian increments W(tₙ₊₁) − W(tₙ) (these have exactly the distribution and independence we need)
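The noisy recursion can be simulated the same way; a sketch with the notes' σ(x) = x, reusing the earlier drift f(x) = 6x purely for illustration:

```python
import numpy as np

# Euler-Maruyama recursion from the notes:
#   X(t_{n+1}) = X(t_n) + dt*f(X(t_n)) + sigma(X(t_n))*xi_n,
# with xi_n = W(t_{n+1}) - W(t_n) ~ N(0, dt), independent.
# sigma(x) = x is the notes' example; f(x) = 6x is an illustrative choice.
rng = np.random.default_rng(0)
f = lambda x: 6 * x
sigma = lambda x: x
dt, n_steps = 1 / 100, 100
x = 0.1
for _ in range(n_steps):
    xi = rng.normal(0.0, np.sqrt(dt))   # Brownian increment over [t_n, t_{n+1}]
    x = x + dt * f(x) + sigma(x) * xi

print(x)   # endpoint of one noisy trajectory; another seed gives another path
```

Rerunning with different seeds shows the fluctuation around the exponential trend.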
replacing, and dividing by tₙ₊₁ − tₙ, gives:
[X(tₙ₊₁) − X(tₙ)]/(tₙ₊₁ − tₙ) = f(X(tₙ)) + σ(X(tₙ)) ξₙ/(tₙ₊₁ − tₙ)
[X(tₙ₊₁) − X(tₙ)]/(tₙ₊₁ − tₙ) = f(X(tₙ)) + σ(X(tₙ)) [W(tₙ₊₁) − W(tₙ)]/(tₙ₊₁ − tₙ)
what happens as
tₙ₊₁ − tₙ → 0?
(motivation)
ie when σ ≡ 0 the above system leads to
dX(t)/dt = f(X(t)) X(0)=x
σ not equal to 0:
dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt ,
X(0) = x.
We have shown that W is nowhere differentiable (with probability one), so the limit dW(t)/dt does not exist.
we need more tools to deal with stochastic diff equations!
Simple case:
σ(x) ≡ 1
dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt ,
X(0) = x.
dX(t)/dt =
f(X(t)) + dW(t)/dt ,
X(0) = x.
integrating
X(t)= x+ ∫₀ᵗ f(X(s)) ds + W(t).
(eq makes sense)
e.g. the trivial case is BM:
if f ≡ 0, then X(t) = x + W(t), i.e. Brownian motion started at x
σ(x) not a constant:
dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt ,
X(0) = x.
integrating
X(t)=
x+ ∫₀ᵗ f(X(s)) ds
+∫₀ᵗ σ(X(s)) dW(s).
BUT
∫₀ᵗ σ(X(s)) dW(s)?
well W is not in C^1 with probability one
we will consider convergence in different senses; our aim is to make sense of these integrals
∫₀ᵗ σ(X(s)) dW(s)?
RECALL
∫₀¹ g(s) dv(s).
for continuous funct g
if v ∈ C^1
, then the above integral is defined and is given by
∫₀¹ g(s) dv(s)
=
∫₀¹ g(s) (dv(s)/ds) ds
(you’d take a partition and look at the partial sums corresponding to it, see notes for further details;
diagram: Riemann integral. The limit doesn’t always exist; it depends on how regular g and v are,
consider e.g. α-Hölder continuous functions)
we will construct the integral
∫₀ᵗ Y(s) dW(s)
as
integral of Y against BM:
With
Y(t) as a stochastic process
We fix a probability space (Ω, F, P) with a **complete and right-continuous filtration**
F := (Fₜ)_{t∈[0,T]}.
We assume that on Ω we are given an F-Wiener process W.
NOTE:
usually W is a Wiener process and the filtration is the one generated by it:
Fₜ = σ(W(s), s ≤ t)
= σ(W(s)⁻¹(A) : A ∈ B(R), s ≤ t),
i.e. generated by inverse images of Borel sets
Remember: if we say a RV is measurable w.r.t. Fₜ, it means that it is a function of the path of your Wiener process, but only for times before t
Firstly consider Y…
Y(t)= 1_{[0,1]} (t)
Considering
∫₀ᵗ Y(s) dW(s)
Consider Y s.t its a function of time
Y(t)= 1_{[0,1]} (t)
graph
for t>0
1 for t in [0,1]
0 after
this is a deterministic function.
∫₀ᵗ Y(s) dW(s) =
∫₀ᵗ 1_{[0,1]}(s) dW(s)
= W(min(t, 1)), which is W(t) for t ≤ 1
(using increments and a C¹ function g:
∫₀¹ 1 dg(s) = ∫₀¹ g′(s) ds = g(1) − g(0))
Now consider
∫₀ᵀ 1_{[a,b]}(t) dW(t)
we expect this to be
∫ₐᵇ (dW(t)/dt) dt
= W(b) − W(a),
but can we show this?
(“=” in quotes, not =, since dW/dt does not exist)
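On a discretized Brownian path the sum Σᵢ 1_[a,b)(tᵢ)(W(tᵢ₊₁) − W(tᵢ)) telescopes to W(b) − W(a) exactly; a quick sketch (with a, b assumed to lie on the grid):

```python
import numpy as np

# Check numerically that sum_i 1_[a,b)(t_i) (W(t_{i+1}) - W(t_i))
# telescopes to W(b) - W(a) when a, b are grid points.
rng = np.random.default_rng(1)
n, T = 1000, 1.0
t = np.linspace(0.0, T, n + 1)
dW = rng.normal(0.0, np.sqrt(T / n), size=n)   # increments W(t_{i+1}) - W(t_i)
W = np.concatenate([[0.0], np.cumsum(dW)])     # W on the grid, W(0) = 0

a, b = t[200], t[700]
indicator = (t[:-1] >= a) & (t[:-1] < b)
integral = np.sum(dW[indicator])

print(integral, W[700] - W[200])   # the two numbers coincide
```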
Definition 3.2.1.
the class H_Tˢᵗᵉᵖ
We say that a stochastic process Y : Ω × [0, T] → R is in the class
H_Tˢᵗᵉᵖ
if there
exist n ∈ N, 0 ≤ t₁ < … < tₙ < tₙ₊₁ ≤ T
and RVs Yₜ₁, …, Yₜₙ
such that
1) Y(t) = Σᵢ₌₁ⁿ Yₜᵢ 1_{[tᵢ, tᵢ₊₁)}(t),
2) Yₜᵢ is Fₜᵢ-measurable for all i = 1, …, n,
3) ‖Yₜᵢ‖_{L∞} < ∞
for all i = 1, …, n.
first two are most important
elements of class Hₜˢᵗᵉᵖ are called
simple processes
we will construct stochastic processes from these as we know how to integrate the indicator functs/ characteristic funct
the representation is not unique; a simple process can be written in multiple ways
we will consider simple processes whose “step length” tends to 0 to approximate a general stochastic process, like partial Riemann sums, and check whether the integrals converge
this is one realisation for a fixed ω; changing ω changes the realisation
simple processes
look like
step functions:
0 outside the intervals; on each interval [tᵢ, tᵢ₊₁) the indicator times Yₜᵢ gives the value Yₜᵢ; these step functions are right continuous
the heights are random, remember
but each height Yₜᵢ is Fₜᵢ-measurable: we can't take information from the future, only the past
if we integrate against W, we would expect the result to be the sum Σᵢ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ))
- ||Yₜ_ᵢ||_L∞ < ∞
for all i = 1, …, n. meaning
X is in L∞(Ω)
if there exists M ∈ R
s.t.
P(|X| ≤ M) = 1
If X is Bernoulli: it can only take values 0 or 1, so yes, any M ≥ 1 works
If X is Gaussian:
it isn't in L∞(Ω), because the density is positive everywhere, so no such M exists
For Y ∈ Hₜˢᵗᵉᵖ
integrals wrt W
DEFN
stochastic integral I(Y )
For Y ∈ H_T ˢᵗᵉᵖ
we set
I(Y) =
∫₀ᵀ Y(s) dW(s)
:= Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ))
(This is a sum of RVs, thus a RV itself; we can discuss its expectation and variance)
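The definition can be evaluated on a simulated path; the heights below are a hypothetical choice, clipped so that they are bounded and built only from the past, as the definition demands:

```python
import numpy as np

# Stochastic integral of a simple process:
#   I(Y) = sum_{i=1}^n Y_{t_i} (W(t_{i+1}) - W(t_i)).
# Illustrative (assumed) heights Y_{t_i} = clip(W(t_i), -1, 1):
# F_{t_i}-measurable (no future information) and bounded, as required.
rng = np.random.default_rng(2)
n, T = 4, 1.0
dW = rng.normal(0.0, np.sqrt(T / n), size=n)   # increments W(t_{i+1}) - W(t_i)
W = np.concatenate([[0.0], np.cumsum(dW)])     # W at the partition points

Y = np.clip(W[:-1], -1.0, 1.0)                 # bounded, adapted heights
I_Y = np.sum(Y * dW)                           # a single random variable
print(I_Y)
```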
stochastic integral I(Y )
REMARK
Notice that the stochastic integral I(Y ) is well defined (does not depend on the
partition t_1, …, t_n) and is a linear operator on Hₜˢᵗᵉᵖ. Also notice that for Y ∈ Hₜˢᵗᵉᵖ we have that I(Y ) ∈ L_2(Ω).
Indeed, it suffices to check that this is the case for Y(t) = Yₜ₁ 1_{[t₁,t₂)}(t).
From Proposition 1.5.7 we have
E[|Yₜ₁|² (W(t₂) − W(t₁))²]
= E[E[|Yₜ₁|² (W(t₂) − W(t₁))² | Fₜ₁]]
= E[|Yₜ₁|² E[(W(t₂) − W(t₁))² | Fₜ₁]]
= (t₂ − t₁) E[|Yₜ₁|²]
≤ (t₂ − t₁) ‖Yₜ₁‖²_{L∞} < ∞,
which shows that I(Y ) ∈ L₂(Ω)
Exercise 3.2.3. Show that the stochastic integral I : H_Tˢᵗᵉᵖ → L₂(Ω) is a **linear operator**,
that is, if Y₁, Y₂ ∈ H_Tˢᵗᵉᵖ and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂).
OPERATOR IS LINEAR
Suppose Y₁, Y₂ ∈ H_Tˢᵗᵉᵖ and a₁, a₂ ∈ R (writing both on a common partition). Then
I(a₁Y₁ + a₂Y₂) = ∫₀ᵀ (a₁Y₁(s) + a₂Y₂(s)) dW(s)
:= Σᵢ₌₁ⁿ (a₁Y₁,ₜᵢ + a₂Y₂,ₜᵢ)(W(tᵢ₊₁) − W(tᵢ))
= a₁ Σᵢ₌₁ⁿ Y₁,ₜᵢ (W(tᵢ₊₁) − W(tᵢ)) + a₂ Σᵢ₌₁ⁿ Y₂,ₜᵢ (W(tᵢ₊₁) − W(tᵢ))
=: a₁I(Y₁) + a₂I(Y₂).
(in the lecture, T = 1 was used)
Suppose we have function f: [0,1] → R
Define
∫₀¹ f(s) ds.
The Riemann integral is the area under the curve: partition [0,1], build rectangles, and sum their areas. As the width of the partition intervals tends to 0 we approach a limit, if it exists.
We could also define a function f_n defined as
fₙ(s) =
Σᵢ₌₁ⁿ f(tᵢⁿ) 1_{[tᵢⁿ, tᵢ₊₁ⁿ)}(s)
These are functions constant on each of the n intervals
we know how to find the integral of such functions:
∫₀¹ fₙ(s) ds
= Σᵢ₌₁ⁿ f(tᵢⁿ)(tᵢ₊₁ⁿ − tᵢⁿ)
If this converges to some limit, this could be the integral if it is the limit for all partitions
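A concrete sketch of the step-function scheme, with the illustrative choice f(s) = s² (exact integral 1/3, not an example from the notes):

```python
import numpy as np

# Step-function approximation of the Riemann integral of f on [0,1]:
# f_n(s) = sum_i f(t_i^n) 1_[t_i^n, t_{i+1}^n)(s),
# integral = sum_i f(t_i^n)(t_{i+1}^n - t_i^n).
f = lambda s: s ** 2                 # illustrative choice; exact integral is 1/3

for n in (10, 100, 1000):
    t = np.linspace(0.0, 1.0, n + 1)
    approx = np.sum(f(t[:-1]) * np.diff(t))   # left-endpoint Riemann sum
    print(n, approx)                          # approaches 1/3 as n grows
```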
STEP 1
find the sequence f_n that converges to f
STEP 2
f_n to be simple
STEP 3
ensure limit doesn’t depend on what partition you chose
Applying the prev steps to consider the integral of a stochastic process
Given a stochastic process (Y(t))_{t∈[0,1]}:
STEP 1
find a sequence (Yₙ(t))_{t∈[0,1]} ∈ H₁ˢᵗᵉᵖ,
simple, so that we can form the integrals
∫₀¹ Yₙ(s) dW(s),
and such that
STEP 2
(Yₙ(t))_{t∈[0,1]} converges to the stochastic process (Y(t))_{t∈[0,1]}:
(Yₙ(t))_{t∈[0,1]} → (Y(t))_{t∈[0,1]}
STEP 3
show there exists a RV Z s.t.
∫₀¹ Yₙ(s) dW(s) → Z;
Z will be the stochastic integral of Y
Lemma 3.2.4.
For Y ∈ Hₜˢᵗᵉᵖ
we have
E[|I(Y )|²] (second moment)
E[I(Y )]
For Y ∈ H_Tˢᵗᵉᵖ we have
E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]
E[I(Y)] = 0.
(the RV ∫₀ᵀ Y(s) dW(s) has finite second moment)
(MEAN is 0: the RV is a sum of RVs times increments of Brownian motion, which are centred)
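Both identities can be checked by Monte Carlo for a concrete deterministic simple process (an illustrative choice, not from the notes): Y = 1 on [0, 1/2) and Y = 2 on [1/2, 1), so E[∫₀¹ |Y(s)|² ds] = 1·(1/2) + 4·(1/2) = 5/2:

```python
import numpy as np

# Monte Carlo check of Lemma 3.2.4 for Y = 1 on [0,1/2), Y = 2 on [1/2,1):
#   I(Y) = 1*(W(1/2) - W(0)) + 2*(W(1) - W(1/2)),
# where the two increments are independent N(0, 1/2) variables.
rng = np.random.default_rng(3)
n_samples = 200_000
inc1 = rng.normal(0.0, np.sqrt(0.5), n_samples)
inc2 = rng.normal(0.0, np.sqrt(0.5), n_samples)
I = 1.0 * inc1 + 2.0 * inc2

print(I.mean())        # close to 0
print((I ** 2).mean()) # close to 5/2
```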
PROOF:
(RV has finite second moment: expectation of square is finite)
considering E[|∫₀ᵀ Y(s) dW(s)|²]:
find the norm in L₂
‖∫₀ᵀ Y(s) dW(s)‖²_{L₂}
=
E[|Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ))|²]
≤ n Σᵢ₌₁ⁿ E[|Yₜᵢ|² (W(tᵢ₊₁) − W(tᵢ))²]
(by Cauchy–Schwarz; the lecture's cruder constant 2ⁿ also gives an upper bound. By definition the heights are bounded, so we can take them out via the L∞ norm)
≤ n maxᵢ ‖Yₜᵢ‖²_{L∞} Σᵢ₌₁ⁿ E[(W(tᵢ₊₁) − W(tᵢ))²]
(we know this expectation is tᵢ₊₁ − tᵢ)
< ∞
By definition, I is the stochastic integral operator
I : H_Tˢᵗᵉᵖ → L₂(Ω),
(Y(t))_{t∈[0,T]}
↦
I(Y) =
∫₀ᵀ Y(s) dW(s)
its a function that takes a simple process and gives a random variable
stochastic process
integrated wrt BM
operator with properties
If I take two simple processes, why is the sum a simple process?
convince yourself:
if we multiply a step function by a real we again get a step function,
and on a common refinement of the two partitions (pairwise disjoint intervals) the sum is again of the required form
Considering:
Y(t)² =
Σᵢ₌₁ⁿ Yₜᵢ² 1_{[tᵢ, tᵢ₊₁)}(t)
this is because the intervals are pairwise disjoint, so only the i = j terms survive
if I fix ω this is just a step function
so for each ω I can find the Riemann integral
∫₀¹ Y(s)² ds
= Σᵢ₌₁ⁿ Yₜᵢ² (tᵢ₊₁ − tᵢ),
which is again a RV;
then I take the expectation.
Note that:
E[(∫₀¹ Y(s) dW(s))²]
=
‖∫₀¹ Y(s) dW(s)‖²_{L₂}
=
‖(Y(t))_{t∈[0,1]}‖²_{L₂(Ω×[0,1])}
RHS:
= E[∫₀¹ |Y(s)|² ds] =
∫_Ω ∫₀¹ (Y(s))² ds dP
The integral operator takes a stochastic process to a RV, and the norm of the object equals the norm of its image (Itô's isometry).
For Y ∈ H_T ˢᵗᵉᵖ we have
(1)
E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]
(2)
E[I(Y )] = 0.
LEMMA PROOF
(1)
PROOF:
1)
Y ∈ H_T ˢᵗᵉᵖ
so
Y(t) =
Σᵢ₌₁ⁿ Yₜᵢ 1_{[tᵢ, tᵢ₊₁)}(t)
where Yₜᵢ is Fₜᵢ-measurable and bounded
Consider E[(∫₀¹ Y(s) dW(s))²] =
E[(Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)))²]
=
E[Σᵢ,ⱼ₌₁ⁿ
Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ (W(tⱼ₊₁) − W(tⱼ))]
= (diagonal elements taken out)
E[Σᵢ₌₁ⁿ
Yₜᵢ² (W(tᵢ₊₁) − W(tᵢ))²]
+
E[Σᵢ<ⱼ 2 Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ (W(tⱼ₊₁) − W(tⱼ))]
For i ≠ j:
WLOG i < j
(the partition points increase, t₁ ≤ t₂ ≤ …, and consecutive intervals meet only at the endpoint tᵢ₊₁)
Consider
E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ (W(tⱼ₊₁) − W(tⱼ))]
now
W(tᵢ₊₁) − W(tᵢ) is F_{tᵢ₊₁}-measurable,
and so is Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)), as Yₜᵢ is Fₜᵢ-measurable;
since tᵢ₊₁ ≤ tⱼ, both are also F_{tⱼ}-measurable (the filtration grows with time),
and Yₜⱼ is F_{tⱼ}-measurable, while W(tⱼ₊₁) − W(tⱼ) is independent of F_{tⱼ}
so we use conditional expectation (tower property):
E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ (W(tⱼ₊₁) − W(tⱼ))] =
E[E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ (W(tⱼ₊₁) − W(tⱼ)) | F_{tⱼ}]]
But Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ is F_{tⱼ}-measurable, so
= E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) Yₜⱼ E[W(tⱼ₊₁) − W(tⱼ) | F_{tⱼ}]]
W is a BM w.r.t. the filtration F,
so its increments are independent of the past: the conditional expectation equals the plain expectation, which is 0, so every cross term vanishes
now considering
E[Σᵢ₌₁ⁿ
Yₜᵢ² (W(tᵢ₊₁) − W(tᵢ))²]:
Yₜᵢ is Fₜᵢ-measurable and the increment (W(tᵢ₊₁) − W(tᵢ))² is independent of Fₜᵢ,
so (by the tower property)
=
E[Σᵢ₌₁ⁿ E[
Yₜᵢ² (W(tᵢ₊₁) − W(tᵢ))² | Fₜᵢ]]
(by linearity of expectation we can exchange sum and expectation)
now Yₜᵢ² is Fₜᵢ-measurable, so I can take it outside the conditional expectation:
= E[Σᵢ₌₁ⁿ Yₜᵢ² E[
(W(tᵢ₊₁) − W(tᵢ))² | Fₜᵢ]]
I have an increment of BM, which is independent of Fₜᵢ, so the conditional expectation is the usual expectation, tᵢ₊₁ − tᵢ:
= E[Σᵢ₌₁ⁿ Yₜᵢ² (tᵢ₊₁ − tᵢ)]
(important to follow this proof and how we use the properties, hint hint)
which is the same as the Riemann integral:
= E[∫₀¹ Y²(s) ds]
we have shown (1)
Filtration later times?
In the context of a filtration F on a probability space, if something is
F_tᵢ -measurable, then it is also F_tⱼ -measurable for
tᵢ≤tⱼ
The intuition behind this is that
F_tᵢ represents all the information available up to time tᵢ , and if something is measurable with respect to this information, then it’s also measurable with respect to all the information available at a later time tⱼ , where
tⱼ≥tᵢ
In other words, as time progresses, more information becomes available, so anything measurable at
t_ᵢ will also be measurable at any later time tⱼ.
DEF 3.2.5
I(Y, t)
For Y ∈ H_T ˢᵗᵉᵖ
let us define for t ∈ [0, T]
I(Y, t) =
∫₀ᵗ Y (s) dW(s) := I(Y 1_[0,t))
Lemma 3.2.6.
The process (I(Y, t))_t∈[0,T]
is a …
The process (I(Y, t))t∈[0,T]
(integral[0,t] Y(s) dW(s))
is a continuous martingale with respect to F.
proof: see notes
(recall: F is the filtration and (W(t))_{t∈[0,T]} an F-Wiener process)
Lemma 3.2.7 for Y ∈ H_T ˢᵗᵉᵖ
E[] ≤ 4E[]
missed out, seen similar in a proof?
For Y ∈ H_Tˢᵗᵉᵖ we have
E[sup_{t∈[0,T]} |I(Y,t)|²]
≤ 4E[∫₀ᵀ |Y(s)|² ds]
proof:
This is a direct consequence of Doob's inequality. Indeed, since the stochastic integral is a continuous martingale, we have by Theorem 2.1.12 and Lemma 3.2.4
E[sup_{t∈[0,T]} |I(Y,t)|²]
≤ 4E[|I(Y,T)|²]
= 4E[∫₀ᵀ |Y(s)|² ds]
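A Monte Carlo illustration for the simplest case Y ≡ 1, where I(Y, t) = W(t) and the bound reads E[sup_{t≤T} W(t)²] ≤ 4T:

```python
import numpy as np

# Monte Carlo illustration of Lemma 3.2.7 (Doob's L^2 inequality) for Y = 1:
# I(Y,t) = W(t), so  E[sup_{t<=T} |W(t)|^2] <= 4 E[int_0^T 1 ds] = 4T.
rng = np.random.default_rng(4)
n_paths, n_steps, T = 20_000, 200, 1.0
dW = rng.normal(0.0, np.sqrt(T / n_steps), size=(n_paths, n_steps))
W = np.cumsum(dW, axis=1)           # many discretized Brownian paths
sup_sq = np.max(W ** 2, axis=1)     # sup over the grid of |W(t)|^2, per path

print(sup_sq.mean())                # empirically below the bound 4*T = 4
```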
remark: Itô's isometry
say we take a linear map
T(x),
T : R → R;
if it is linear it has the form T(x) = ax.
It is an isometry if |T(x)| = |x| for all x, i.e. it preserves the norm;
isometry:
this norm preservation is crucial for our stochastic integral definition
For Y ∈ H_T ˢᵗᵉᵖ we have
(1)
E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]
(2)
E[I(Y )] = 0.
LEMMA PROOF
(2)
E[∫₀¹ Y(s) dW(s)]
= E[Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ))]
=
Σᵢ₌₁ⁿ E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ))]
=
Σᵢ₌₁ⁿ E[E[Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) | Fₜᵢ]]
=
Σᵢ₌₁ⁿ E[Yₜᵢ E[W(tᵢ₊₁) − W(tᵢ) | Fₜᵢ]]
=
Σᵢ₌₁ⁿ E[Yₜᵢ E[W(tᵢ₊₁) − W(tᵢ)]]
= 0, as E[W(tᵢ₊₁) − W(tᵢ)] = 0
IN SUMMARY: ito integral
If I take a simple stochastic process
I can define the stochastic integral
which is a random var
with
mean 0
finite second moment
which is given by the integral of the square of the process
If Y ∈ Hˢᵗᵉᵖ:
then the integral
∫₀¹ Y(s) dW(s) ∈ L₂(Ω)
norm
‖∫₀¹ Y(s) dW(s)‖²_{L₂}
=
E[(∫₀¹Y (s) dW(s) )² ]
=
E[∫₀¹Y(s)² ds ]
and
E[∫₀¹Y (s) dW(s) ]=0
Define
Hˢᵗᵉᵖ⊆ H
GIVEN WITHOUT THE PROOF
H = {(Y(t))_{t∈[0,1]} : Y is adapted and E[∫₀¹ Y(s)² ds] < ∞}
(1)
take all stochastic processes Y that are adapted to the filtration,
i.e. for all t ∈ [0,1], Y(t) is Fₜ-measurable,
i.e. the RV Y(t) only depends on the values of Brownian motion up to time t,
i.e. on the sigma-algebra generated by BM,
σ(W(s), s ≤ t)
(2)
and satisfy that the expectation of the integrated square is finite.
We can think of the inner integral as a Riemann integral (formally it is a Lebesgue integral, but don't worry). These processes are not simple anymore:
an element of H doesn't have to look like a step function
Definition 3.2.8.
We say that a stochastic process Y : Ω × [0, T] → R is in the class H_T if it is
adapted to the filtration F and it satisfies
E[∫₀ᵀ |Y(s)|² ds]
< ∞.
suppose we are allowed to observe the path of BM up to time t, i.e. we know
σ(W(s), s ≤ t).
Then the value of Y(t) is determined: given the whole path up to time t, it is no longer a RANDOM var!
Conditionally on Fₜ, Y(t) behaves like a constant, not a RV. This is what adaptedness is.
So if we only consider those that are adapted we have this
Itô’s Isometry
E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]
because it shows that if we see H_Tˢᵗᵉᵖ as a
subspace of L_2(Ω × R+), then the stochastic integral
I : H_Tˢᵗᵉᵖ → L_2(Ω) is an isometry.
Proposition (stated properly as Lemma 3.2.9 below).
Let Hˢᵗᵉᵖ ⊆ H.
There are sequences of simple processes approximating Y in expectation:
let (Y(t))_{t∈[0,1]} ∈ H.
Then there exists a sequence (Yₙ(t))_{t∈[0,1]} ∈ Hˢᵗᵉᵖ, for n ∈ N,
s.t.
lim_{n→∞}
E[∫₀¹ (Yₙ(s) − Y(s))² ds] = 0
Let Y ∈ H.
(That means Y is a
stochastic process which is adapted, i.e.
doesn't depend on values of BM from the future, only the past,
and it has a finite expectation of the integrated square.)
Lemma 3.2.9.
H_Tˢᵗᵉᵖ is a subset of H_T.
If Y ∈ H_T,
then there exist
Yₙ ∈ H_Tˢᵗᵉᵖ, for n ∈ N, such that
lim_{n→∞}
E[∫₀ᵀ |Y(s) − Yₙ(s)|² ds]
= 0
Proposition 3.2.10.
Let Y ∈ H_T and let (Yₙ)ₙ₌₁^∞ ⊂ H_Tˢᵗᵉᵖ be the sequence from Lemma 3.2.9.
Then there exists a random variable Z ∈ L₂(Ω) such that lim_{n→∞} ‖I(Yₙ) − Z‖_{L₂} = 0.
Our original aim was to
define the integral of a function by
considering a sequence of functions converging to it,
so that their integrals also converge to the integral of the limit;
we don't rigorously prove the lemma.
(Think of the family of step functions we would consider for a particular ω.)
Suppose we have a sequence of reals (aₙ).
It is Cauchy if
|aₙ − aₘ| → 0 as n, m → ∞,
i.e. if for all ε > 0 there exists N s.t. for all n, m ≥ N, |aₙ − aₘ| < ε:
the terms get closer together.
Every Cauchy sequence converges in a complete metric space (e.g. in R),
thus there always exists a limit.
This also holds in the L₂ space (it is complete).
If you take a sequence of RVs (Xₙ) which is Cauchy
in the L₂ space, it means
lim_{n,m→∞}
‖Xₙ − Xₘ‖_{L₂}
=
lim_{n,m→∞}
(E[(Xₙ − Xₘ)²])^{1/2}
= 0.
Thus
lim_{n,m→∞}
E[(Xₙ − Xₘ)²] = 0,
and thus there exists a RV Z ∈ L₂(Ω) s.t.
lim_{n→∞} ‖Xₙ − Z‖_{L₂} = 0,
i.e.
lim_{n→∞} E[(Xₙ − Z)²] = 0.
Using this to construct a stochastic integral
Suppose we have a process in H:
Y ∈ H
Then by lemma, there exists
Yₙ ∈ H_T ˢᵗᵉᵖ (simple)
s.t
lim_{n→∞}
E[
∫₀¹|Y (s) − Yₙ(s)|²ds]
= 0
(this also implies that (Yₙ) is a Cauchy sequence in L₂(Ω×[0,1])…)
I look at
∫₀¹ Yₙ(s)dW(s)
and we show it is a cauchy sequence…
Thus there exists
Z in
L₂(Ω)
s.t
E[(∫₀¹ Yₙ(s) dW(s) − Z)²]
converges to 0 as n tends to infinity, and this limit Z will be the value we use as the stochastic integral of Y
to show the Cauchy property, consider, in the L₂ norm,
E[(∫₀¹ Yₙ(s) dW(s) − ∫₀¹ Yₘ(s) dW(s))²]
(stochastic integral is linear)
=
E[(∫₀¹ (Yₙ(s)-Yₘ(s))dW(s))²]
by Itô's isometry, the expectation of the square of the integral equals
= E[∫₀¹ (Yₙ(s) − Yₘ(s))² ds]
inserting Y and using (a + b)² ≤ 2a² + 2b² (any larger constant, e.g. the lecture's 4, also gives a valid bound):
≤
2E[∫₀¹ (Yₙ(s) − Y(s))² ds]
+
2E[∫₀¹ (Y(s) − Yₘ(s))² ds]
both tend to 0
as n,m tends to infinity
means the sequence of stochastic integrals is cauchy
in L2 which is complete thus it converges
Limit is a RV Z
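The whole construction can be previewed numerically for Y(t) = W(t) (not bounded, so strictly an illustration beyond Hˢᵗᵉᵖ): the simple-process sums over finer partitions of one path settle towards a limit, which Itô calculus identifies as (W(1)² − 1)/2:

```python
import numpy as np

# Approximating sums for Y(t) = W(t): the simple-process integrals
#   sum_i W(t_i)(W(t_{i+1}) - W(t_i))
# on finer partitions of one fixed path approach (W(1)^2 - 1)/2,
# the Ito integral int_0^1 W dW.
rng = np.random.default_rng(5)
n_fine = 2 ** 12
dW = rng.normal(0.0, np.sqrt(1.0 / n_fine), size=n_fine)
W = np.concatenate([[0.0], np.cumsum(dW)])     # one fine Brownian path on [0,1]
limit = (W[-1] ** 2 - 1.0) / 2.0

for k in (4, 6, 8, 12):                        # coarser sums on the same path
    step = n_fine // 2 ** k
    Wc = W[::step]                             # path sampled at 2^k + 1 points
    I_n = np.sum(Wc[:-1] * np.diff(Wc))
    print(2 ** k, I_n, abs(I_n - limit))       # error shrinks as the mesh refines
```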
SUMMARY LECTURE
(Ω,F,P)
filtration
F = {Fₜ}_ ₜ in [0,1]
BM
(W(t))_t in [0,1],
F-wiener process
our goal was to construct
stochastic integral
∫₀ᵀ Y(s) dW(s)
we started looking at simple processes
STEP 1
(Y(t))_{t∈ [0,1]} ∈ H_T ˢᵗᵉᵖ
Y(t) =Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁] (t)
THUS
∫₀ᵀ Y(s) dW(s) :=
Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) ∈ L₂(Ω)
which is a RV in L_2
which we saw had some properties
STEP 2
This was for a simple process, then we extended this to approximate
Then we considered the class H_T: Y ∈ H_T if it is
adapted to the filtration F and it satisfies
E[∫₀ᵀ |Y(s)|² ds]
< ∞.
If we take a Y in H_T then there exists a sequence of simple processes
Yₙ ∈ H_T ˢᵗᵉᵖ (simple)
s.t
lim_{n→∞}
E[∫₀ᵀ |Y(s) − Yₙ(s)|² ds]
= 0
we used this to construct the stochastic integral for Y
By seeing the integrals of sequence form a cauchy sequence (RVs) converging in L_2
Exercise 3.2.11.
Show that Z does not depend on the approximating sequence (Yₙ)ₙ₌₁^∞. That is,
show that if (Ỹₙ)ₙ₌₁^∞ ⊂ H_Tˢᵗᵉᵖ
is another sequence satisfying (3.9), then
lim_{n→∞} ‖I(Ỹₙ) − Z‖_{L₂} = 0.
SUMMARY
∫₀ᵀ Y(s) dW(s) :=
Σᵢ₌₁ⁿ Yₜᵢ (W(tᵢ₊₁) − W(tᵢ)) ∈ L₂(Ω)
which is a RV in L_2
which we saw had some properties:
∫₀T Y(s) dW(s)
PROPERTIES FOR Y ∈ H_Tˢᵗᵉᵖ
linear operator:
if Y₁, Y₂ ∈ H_Tˢᵗᵉᵖ and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂)
expectation = 0:
E[∫₀ᵀ Y(s) dW(s)] = 0
variance:
E[(∫₀ᵀ Y(s) dW(s))²]
= E[∫₀ᵀ Y(s)² ds]
we will see
(∫₀ᵗ Y(s) dW(s))_(t in [0,T])
is a martingale
Take a stochastic process
Y ∈ H_T; then the same PROPERTIES hold:
linear operator:
if Y₁, Y₂ ∈ H_T and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂)
expectation = 0:
E[∫₀ᵀ Y(s) dW(s)] = 0
variance:
E[(∫₀ᵀ Y(s) dW(s))²]
= E[∫₀ᵀ Y(s)² ds]
The map I takes a stochastic process in H_T to a random variable:
I : H_T → L₂(Ω),
(Y(t))_{t∈[0,T]}
↦
I(Y) =
∫₀ᵀ Y(s) dW(s)
(stochastic process)
→
(random variable)
I want to consider now (for fixed t)
∫₀ᵗ Y(s) dW(s):
for each t I have a RV,
so I consider the family of RVs
(∫₀ᵗ Y(s) dW(s))_{t∈[0,T]},
which is a stochastic process
TRUE OR FALSE?
Take Y ∈ H_T; then
1_{[0,t]} Y ∈ H_T
is a STOCHASTIC PROCESS,
a new stochastic process.
TRUE: if I take a stochastic process
(Y(r))_{r∈[0,T]} ∈ H_T,
then
I have the stochastic process
(1_{[0,t]}(r) Y(r))_{r∈[0,T]} ∈ H_T
Note:
This new stochastic process is equal to the stochastic process Y when r is in [0, t];
if r > t then it is 0:
=
{ Y(r) if r ≤ t
{ 0 if r > t
So the trajectories in this case look like:
a simple process (a right-continuous step function)
becomes
the same, but only for times ≤ t (and 0 afterwards)
- because it is in H_T we can integrate it, and the value is
∫₀ᵗ Y(s) dW(s) =
∫₀ᵀ 1_{[0,t]}(s) Y(s) dW(s)
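On a discrete grid the two integrals are literally the same finite sum; a quick check (with hypothetical bounded, adapted heights):

```python
import numpy as np

# The truncated process 1_[0,t) Y integrated over [0,T] equals the
# integral of Y over [0,t]: on the grid the two sums agree term by term.
rng = np.random.default_rng(6)
n, T = 1000, 1.0
grid = np.linspace(0.0, T, n + 1)
dW = rng.normal(0.0, np.sqrt(T / n), size=n)
W = np.concatenate([[0.0], np.cumsum(dW)])

Y = np.clip(W[:-1], -1.0, 1.0)        # an adapted, bounded choice of heights
t_cut = grid[600]                      # some t in (0, T)

left = np.sum(Y[:600] * dW[:600])      # int_0^t Y dW
mask = grid[:-1] < t_cut
right = np.sum((mask * Y) * dW)        # int_0^T 1_[0,t) Y dW
print(left, right)                     # identical sums
```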