3 Itô Calculus Flashcards

1
Q

Itô calculus

A

the main tool for studying stochastic differential equations

e.g. consider the differential equation
x’(t) = 0.5x(t)
x(0) = 1

the general solution is x(t) = C exp(0.5t); the initial condition gives C = 1, so x(t) = exp(0.5t) (graph in a later example)

2
Q

In general, for the diff eq
X’(t) = f(X(t)), the sol

A

can be approximated by defining
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ) f(X(tₙ)),

since X’(t) is the limit of difference quotients,
X’(t) ≈ (X(t+h) − X(t))/h for small h,

so the solution arises as the limit of this recursion as the step size tₙ₊₁ − tₙ → 0 (see the sketch below)
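A minimal Python sketch of this Euler recursion (my own illustration, not from the notes; the choice f(x) = 0.5x, step h = 0.01 and horizon t = 1 are arbitrary assumptions):

```python
import math

def euler(f, x0, h, n):
    """Iterate X(t_{k+1}) = X(t_k) + h * f(X(t_k)) for n steps."""
    xs = [x0]
    for _ in range(n):
        xs.append(xs[-1] + h * f(xs[-1]))
    return xs

# f(x) = 0.5x with X(0) = 1: the exact solution is exp(0.5 t)
h, n = 0.01, 100
xs = euler(lambda x: 0.5 * x, 1.0, h, n)
print(xs[-1], math.exp(0.5 * h * n))  # both close to exp(0.5) ≈ 1.6487
```

Shrinking h brings the recursion closer to the exact solution, which is the limiting picture the card describes.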

3
Q

[Figure: time t against X(t); an exponential-looking curve resembling the right-hand branch of x², reaching X(1) ≈ 35]

the graph of a trajectory of the dynamical system

X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),

X(0) = x

with f(x) = 6x, starting from x = 0.1, and time-step given by tₙ₊₁ − tₙ = 1/100 (interpolated
between the discrete time points).

A

This type of dynamics is a good approximation of the evolution of many quantities found in nature. However, in practice, the majority of systems in physics, finance,
and biology, among others, are noisy.

4
Q

noisy?

A

Consider, for example, a stock price: say the stock price of Apple over the last 10 years. While there is clearly an exponential trend,
the price does not follow a “smooth” trajectory as in the previous example, but rather fluctuates around it

5
Q

the differential equation is a limiting description!

A

in practice we observe what happens only at discrete times, via the recursion

X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),

6
Q

Adding this noisiness

A

what we do is we take the original trajectory:
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),
and we add in a random variable ξₙ :

X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ))+σ(X(tₙ))ξₙ,

the RVs ξₙ are independent Gaussians:
ξₙ ~ N(0, tₙ₊₁ − tₙ),
X(0) = x
(a simulation sketch follows below)
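A hedged Python sketch of this noisy recursion (my own illustration: σ(x) = x matches the notes' later example, while the drift f(x) = 0.5x and step size are arbitrary choices):

```python
import random, math

def euler_maruyama(f, sigma, x0, h, n, seed=0):
    """X(t_{k+1}) = X(t_k) + h*f(X(t_k)) + sigma(X(t_k))*xi_k, xi_k ~ N(0, h) independent."""
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n):
        xi = rng.gauss(0.0, math.sqrt(h))  # Gaussian with variance h = t_{k+1} - t_k
        xs.append(xs[-1] + h * f(xs[-1]) + sigma(xs[-1]) * xi)
    return xs

# sigma(x) = x as in the notes' example; drift f(x) = 0.5x chosen for illustration
path = euler_maruyama(lambda x: 0.5 * x, lambda x: x, 1.0, 0.01, 100)
print(path[-1])   # fluctuates around the deterministic value exp(0.5) ≈ 1.65
```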

7
Q

original trajectory:
X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ)),
and we add in a random variable ξₙ :

X(tₙ₊₁) = X(tₙ) + (tₙ₊₁ − tₙ)f(X(tₙ))+σ(X(tₙ))ξₙ,

the RVs ξₙ are independent Gaussians:
ξₙ ~ N(0, tₙ₊₁ − tₙ),
X(0) = x
(motivation)

A

Example in notes:
ξₙ~N(0,1/n)
and
σ(x)=x

where ξₙ can be modelled by Brownian increments W(tₙ₊₁) − W(tₙ) (these have exactly the distribution we need);
replacing, and dividing through by the step size, gives:

[X(tₙ₊₁) − X(tₙ)]/[tₙ₊₁ − tₙ] = f(X(tₙ)) + σ(X(tₙ))ξₙ/[tₙ₊₁ − tₙ]

[X(tₙ₊₁) − X(tₙ)]/[tₙ₊₁ − tₙ] = f(X(tₙ)) + σ(X(tₙ))[W(tₙ₊₁) − W(tₙ)]/[tₙ₊₁ − tₙ]

8
Q

[X(tₙ₊₁) − X(tₙ)]/[tₙ₊₁ − tₙ] = f(X(tₙ)) + σ(X(tₙ))[W(tₙ₊₁) − W(tₙ)]/[tₙ₊₁ − tₙ]
as
[tₙ₊₁ − tₙ]→ 0?
(motivation)

A

ie when σ ≡ 0 the above system leads to

dX(t)/dt = f(X(t)), X(0) = x

σ not equal to 0 formally gives:
dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt,

X(0) = x.

But we have shown W is not differentiable: the limit defining the derivative dW(t)/dt doesn’t exist.

We need more tools to deal with stochastic diff equations!

9
Q

Simple case:
σ(x) ≡ 1
dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt ,

X(0) = x.

A

dX(t)/dt =
f(X(t)) + dW(t)/dt ,

X(0) = x.

integrating

X(t) = x + ∫₀ᵗ f(X(s)) ds + W(t).
(this equation makes sense: no derivative of W appears)

e.g. the trivial case f ≡ 0 gives X(t) = x + W(t), i.e. Brownian motion started at x

10
Q

σ(x) not a constant

dX(t)/dt =
f(X(t)) + σ(X(t)) dW(t)/dt ,

X(0) = x.

A

integrating

X(t) =
x + ∫₀ᵗ f(X(s)) ds
+ ∫₀ᵗ σ(X(s)) dW(s).

BUT what is
∫₀ᵗ σ(X(s)) dW(s)?

Well, W is not in C¹ with probability one, so this is not a classical integral;
we will consider convergence in different ways, and our aim is to make sense of these integrals

11
Q

∫₀ᵗ σ(X(s)) dW(s)?

RECALL
∫₀¹ g(s) dv(s).
for continuous funct g

A

if v ∈ C¹, then the above integral is defined and is given by
∫₀¹ g(s) dv(s)
=
∫₀¹ g(s) v’(s) ds

(you’d take a partition and look at the partial sums corresponding to it, see notes for further details;

diagram: Riemann–Stieltjes integral. The limit doesn’t always exist: it depends on how regular g and v are,

e.g. consider the case where they are only alpha-Hölder continuous)

12
Q

we will construct the integral
∫₀ᵗ Y(s) dW(s)
as

A

integral of Y against BM:
with
(Y(t)) a stochastic process.

We fix a probability space
(Ω, F, P) with a
**complete and right-continuous filtration**
F := (F_t)_{t∈[0,T]}.

We assume that on Ω we are given an F-Wiener process W.

13
Q

NOTE:
right-continuous filtration
F := (F_t)_{t∈[0,T]}.

A

usually
F is the filtration generated by the Wiener process:

Fₜ = σ(W(s), s ≤ t)
= σ(W(s)⁻¹(A), A ∈ B(R), s ≤ t),

i.e. generated by inverse images of Borel sets.

Remember: if we say a RV is measurable w.r.t. F_t, it means that it is a function of the path of the Wiener process, but only up to time t

14
Q

Firstly consider Y…

Y(t)= 1_{[0,1]} (t)

Considering
∫₀ᵗ Y(s) dW(s)

A

Consider Y s.t. it is a deterministic function of time:
Y(t) = 1_{[0,1]}(t)

graph:
for t ≥ 0,
1 for t in [0,1],
0 after

this is a deterministic function.

∫₀ᵗ Y(s) dW(s)
= ∫₀ᵗ 1 dW(s) = W(t) for t ≤ 1,
and = ∫₀¹ 1 dW(s) + 0 = W(1) for t > 1

(by analogy with a smooth function g:
∫₀¹ 1 dg(s) = ∫₀¹ g’(s) ds = g(1) − g(0))

15
Q

Now consider
∫₀ᵀ 1_[a,b](t) dW(t)

A

we expect this to be

∫ₐᵇ (dW(t)/dt) dt
“=” W(b) − W(a)

but dW/dt does not exist, so this is only heuristic (“=” rather than =); we will justify it once the integral is defined

16
Q

Definition 3.2.1.

in the class Hₜˢᵗᵉᵖ

A

We say that a stochastic process Y : Ω × [0, T] → R is in the class
Hₜˢᵗᵉᵖ
if there
exist n ∈ N and 0 ≤ t₁ < … < tₙ < tₙ₊₁ ≤ T

(and RVs Yₜ_₁,…,Yₜ_ₙ)
such that

1) Y(t) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁)(t)

2) Yₜ_ᵢ is Fₜ_ᵢ-measurable for all i = 1, …, n,

3) ||Yₜ_ᵢ||_L∞ < ∞
for all i = 1, …, n.

the first two are the most important

17
Q

elements of class Hₜˢᵗᵉᵖ are called

A

simple processes

we will construct stochastic integrals from these, as we know how to integrate indicator/characteristic functions

the representation is not unique: a simple process can be written in multiple ways

we will consider simple processes whose “step length” tends to 0 to approximate a general stochastic process, like partial Riemann sums, and see whether the integrals converge

a trajectory is one realisation for a fixed omega; if we change omega the realisation changes

18
Q

simple processes look like

A

look like
step functions:
each indicator is 0 everywhere except on the interval [tᵢ, tᵢ₊₁), so multiplying by the heights gives the values Yₜ_ᵢ; step functions that are right continuous.
The heights are random, remember.

But each height Yₜ_ᵢ is Fₜ_ᵢ-measurable: we can’t take information from the future, only the past.

If we integrate against W we would expect the result to be the sum of Yₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ))

19
Q
3) ||Yₜ_ᵢ||_L∞ < ∞
for all i = 1, …, n: meaning?
A

X is in L∞(Ω)
if there exists M in the reals
s.t.
P(|X| ≤ M) = 1

If X were Bernoulli: it can only take the values 0 or 1, so yes, we can take M = 1

If X were Gaussian:
it isn’t in L∞, because the density is positive on all of R, so no such M exists

20
Q

For Y ∈ Hₜˢᵗᵉᵖ

integrals wrt W

DEFN
stochastic integral I(Y )

A

For Y ∈ H_Tˢᵗᵉᵖ
we set

I(Y) =
∫₀ᵀ Y(s) dW(s)
:= Σᵢ₌₁ⁿ Yₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ))

(This is a sum of RVs, thus a RV itself; we can discuss its expectation and variance: see the simulation sketch below)
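A Monte Carlo sketch of this definition (my own illustration, not from the notes). The heights Yₜ_ᵢ = W(tᵢ) are adapted but not bounded, so strictly this Y lies in H_T rather than Hₜˢᵗᵉᵖ; the same formulas apply. The sample mean should be ≈ 0 and the second moment ≈ E[∫₀¹ Y(s)² ds] = Σᵢ tᵢ(tᵢ₊₁ − tᵢ) = 0.45:

```python
import random, math

rng = random.Random(1)
ts = [k / 10 for k in range(11)]   # partition 0 = t1 < ... < t11 = 1 of [0, 1]

def sample_I():
    """One sample of I(Y) = sum_i Y_{t_i} (W(t_{i+1}) - W(t_i)) with Y_{t_i} = W(t_i)."""
    w, total = 0.0, 0.0
    for a, b in zip(ts, ts[1:]):
        y = w                                   # height Y_{t_i} = W(t_i): F_{t_i}-measurable
        dw = rng.gauss(0.0, math.sqrt(b - a))   # increment W(t_{i+1}) - W(t_i)
        total += y * dw
        w += dw
    return total

samples = [sample_I() for _ in range(100_000)]
mean = sum(samples) / len(samples)
second_moment = sum(x * x for x in samples) / len(samples)
exact = sum(a * (b - a) for a, b in zip(ts, ts[1:]))   # E ∫ Y² ds = Σ t_i (t_{i+1} - t_i)
print(mean, second_moment, exact)   # mean ≈ 0, second moment ≈ 0.45
```

This previews Lemma 3.2.4 below: zero mean and the Itô isometry for the second moment.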

21
Q

stochastic integral I(Y )
REMARK

A

Notice that the stochastic integral I(Y) is well defined (does not depend on the
partition t₁, …, tₙ) and is a linear operator on Hₜˢᵗᵉᵖ. Also notice that for Y ∈ Hₜˢᵗᵉᵖ we have that I(Y) ∈ L₂(Ω).

Indeed, it suffices to check this for Y(t) = Yₜ_₁ 1_[t₁,t₂)(t).

From Proposition 1.5.7 we have
E[|Yₜ_₁|²(W(t₂) − W(t₁))²]
= E[E[|Yₜ_₁|² (W(t₂) − W(t₁))²|Fₜ_₁]]
= E[|Yₜ_₁|² E[(W(t₂) − W(t₁))²|Fₜ_₁]]
= (t₂ − t₁)E[|Yₜ_₁|²]

≤ (t₂ − t₁)||Yₜ_₁||²_L∞ < ∞,
which shows that I(Y) ∈ L₂(Ω)

22
Q

Exercise 3.2.3. Show that the stochastic integral I : Hₜˢᵗᵉᵖ → L₂(Ω) is a **linear operator**,

that is, if Y₁, Y₂ ∈ Hₜˢᵗᵉᵖ and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂).

A

∫₀ᵀ (a₁Y₁(s) + a₂Y₂(s)) dW(s)

OPERATOR IS LINEAR

Suppose Y₁, Y₂ ∈ Hₜˢᵗᵉᵖ and a₁, a₂ ∈ R. Writing both processes over a common refinement of their partitions, we have that
I(a₁Y₁ + a₂Y₂) =

a₁∫₀ᵀ Y₁(s) dW(s)
+ a₂∫₀ᵀ Y₂(s) dW(s)

:= a₁Σᵢ₌₁ⁿ Y₁,ₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ)) + a₂Σᵢ₌₁ⁿ Y₂,ₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ))
=: a₁I(Y₁) + a₂I(Y₂).

he used T=1

23
Q

Suppose we have a function f: [0,1] → R.
Define
∫₀¹ f(s) ds

A

The Riemann integral is the area under the curve, obtained by partitioning into rectangles and summing their areas. As the width of the partition intervals tends to 0, we approach a limit if it exists.

We could also define functions f_n by
f_n(s) =
Σᵢ₌₁ⁿ f(tᵢⁿ) 1_[tᵢⁿ, tᵢ₊₁ⁿ)(s)

These are constant on each of the n intervals,
and we know how to find the integral of such functions:

∫₀¹ f_n(s) ds
= Σᵢ₌₁ⁿ f(tᵢⁿ)(tᵢ₊₁ⁿ − tᵢⁿ)

If this converges to some limit, and the limit is the same for all partitions, it is the integral.

STEP 1
find a sequence f_n that converges to f
STEP 2
f_n should be simple
STEP 3
ensure the limit doesn’t depend on which partition you chose

24
Q

Applying the prev steps to consider the integral of a stochastic process

A

Given a stochastic process (Y(t))_{t in [0,1]}:

STEP 1
find a sequence (Yₙ(t))_{t in [0,1]} in Hₜˢᵗᵉᵖ,

i.e. simple processes, for which we can form the integrals
∫₀¹ Yₙ(s) dW(s),

and such that

STEP 2
(Yₙ(t))_{t in [0,1]} converges to the stochastic process (Y(t))_{t in [0,1]}:

(Yₙ(t))_{t in [0,1]} → (Y(t))_{t in [0,1]}
STEP 3
show there exists a RV Z s.t.
∫₀¹ Yₙ(s) dW(s) → Z

Z will be the stochastic integral of Y

25
Q

Lemma 3.2.4.
For Y ∈ Hₜˢᵗᵉᵖ
we have
E[|I(Y )|²] (second moment)
E[I(Y )]

A

For Y ∈ H_Tˢᵗᵉᵖ we have

E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]

E[I(Y)] = 0.

(the RV ∫₀ᵀ Y(s) dW(s) has finite second moment)

(MEAN is 0: the RV is a sum of RVs times increments of Brownian motion, which are centred)

26
Q

PROOF:
(RV has finite second moment: expectation of square is finite)

A

considering
E[|∫₀¹ Y(s) dW(s)|²]:

find the norm in L₂:
‖∫₀¹ Y(s) dW(s)‖²_L₂
=:
E[|∫₀¹ Y(s) dW(s)|²]
≤ 2ⁿ Σᵢ₌₁ⁿ E[|Yₜ_ᵢ|²(W(tᵢ₊₁) − W(tᵢ))²]
(the crude constant 2ⁿ works, so might others; by defn the heights are bounded, so we can take them out via the L∞ norm)
≤ 2ⁿ maxᵢ ‖Yₜ_ᵢ‖²_L∞ Σᵢ₌₁ⁿ E[(W(tᵢ₊₁) − W(tᵢ))²]
(we know each such expectation equals tᵢ₊₁ − tᵢ, CAREFUL)
< ∞

27
Q

By definition: I as a stochastic integral

A

For (Y(t))_{t in [0,1]} in Hₜˢᵗᵉᵖ,

I: Hₜˢᵗᵉᵖ → L₂(Ω),

I(Y) =
∫₀¹ Y(s) dW(s):

it’s a map that takes a simple process and gives a random variable,

a stochastic process
integrated w.r.t. BM,

an operator with properties

28
Q

If I take two simple stochastic processes, why is their sum a simple process?
convince yourself

A

if we multiply a step function by a real number we again get a step function

for the sum, pass to the common refinement of the two partitions: on each refined (disjoint) interval both processes are constant, so the sum is again a step function

29
Q

Considering:

Y(t)² =
Σᵢ₌₁ⁿ Yₜ_ᵢ² 1_[tᵢ,tᵢ₊₁)(t)

A

Considering:

Y(t)² =
Σᵢ₌₁ⁿ Yₜ_ᵢ² 1_[tᵢ,tᵢ₊₁)(t)

this is because the intervals are pairwise disjoint, so only the i = j products survive

if I fix omega this is just a step function,

so for each omega I can find the Riemann integral

∫₀¹ Y(s)² ds
= Σᵢ₌₁ⁿ Yₜ_ᵢ² (tᵢ₊₁ − tᵢ),

which is again a RV;

then I take the expectation.
Note that:
E[(∫₀¹ Y(s) dW(s))²]
=
‖∫₀¹ Y(s) dW(s)‖²_L₂
=
‖(Y(t))_{t in [0,1]}‖²_L₂(Ω×[0,1])

RHS:
= E[∫₀¹ |Y(s)|² ds] =
∫_Ω ∫₀¹ (Y(s))² ds dP

As the integral operator takes a stochastic process and gives a RV, the norm of the input object equals the norm of the image (Itô’s isometry)

30
Q

For Y ∈ H_T ˢᵗᵉᵖ we have
(1)
E[|I(Y )|²]
= E[ ∫₀T|Y (s)|² ds]
(2)
E[I(Y )] = 0.

LEMMA PROOF
(1)

A

PROOF:
1)

Y ∈ H_Tˢᵗᵉᵖ,
so
Y(t) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁)(t)

where Yₜ_ᵢ is Fₜ_ᵢ-measurable and bounded.

Consider E[(∫₀¹ Y(s) dW(s))²] =
E[(Σᵢ₌₁ⁿ Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)))²]
=
E[Σᵢ,ⱼ₌₁ⁿ
Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ(W(tⱼ₊₁) − W(tⱼ))]
= (splitting off the diagonal terms)
E[Σᵢ₌₁ⁿ
Yₜ_ᵢ²(W(tᵢ₊₁) − W(tᵢ))²]
+
E[Σᵢ≠ⱼ Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ(W(tⱼ₊₁) − W(tⱼ))]

For i ≠ j:
WLOG i < j

(the intervals [tᵢ, tᵢ₊₁] and [tⱼ, tⱼ₊₁] touch at most at an endpoint, since the partition is increasing:
t₁ ≤ t₂ ≤ …)

Consider
E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ(W(tⱼ₊₁) − W(tⱼ))]

now
W(tᵢ₊₁) − W(tᵢ) is Fₜᵢ₊₁-measurable,
and Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) is too, since Yₜ_ᵢ is Fₜᵢ-measurable
(and since tᵢ₊₁ ≤ tⱼ, they are also Fₜⱼ-measurable)

Yₜ_ⱼ is Fₜⱼ-measurable,
so we use conditional expectation (tower property):

E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ(W(tⱼ₊₁) − W(tⱼ))] =
E[E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ(W(tⱼ₊₁) − W(tⱼ))|Fₜⱼ]]

But Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ is Fₜⱼ-measurable,

= E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) Yₜ_ⱼ E[W(tⱼ₊₁) − W(tⱼ)|Fₜⱼ]]
W is a BM w.r.t. the filtration F,
so increments are independent of the past: the conditional expectation equals the expectation, which is 0, so every cross-term is 0.

Now consider the diagonal terms
E[Σᵢ₌₁ⁿ
Yₜ_ᵢ²(W(tᵢ₊₁) − W(tᵢ))²].
By linearity of expectation we can take the sum out:

= Σᵢ₌₁ⁿ E[Yₜ_ᵢ²(W(tᵢ₊₁) − W(tᵢ))²]

By the tower property,

= Σᵢ₌₁ⁿ E[E[
Yₜ_ᵢ²(W(tᵢ₊₁) − W(tᵢ))²|Fₜᵢ]]

now Yₜ_ᵢ² is Fₜᵢ-measurable, so I can take it outside the conditional expectation:

= Σᵢ₌₁ⁿ E[Yₜ_ᵢ² E[
(W(tᵢ₊₁) − W(tᵢ))²|Fₜᵢ]]

I have an increment of BM which is independent of Fₜᵢ, so the conditional expectation is the usual expectation, i.e. the variance:

= Σᵢ₌₁ⁿ E[Yₜ_ᵢ² (tᵢ₊₁ − tᵢ)] = E[Σᵢ₌₁ⁿ Yₜ_ᵢ² (tᵢ₊₁ − tᵢ)]

(important to follow this proof and how we use the properties, hint hint)
which is exactly the Riemann integral of the step function Y², so
= E[∫₀¹ Y²(s) ds]

we have shown (1)

31
Q

Filtration later times?

A

In the context of a filtration F on a probability space, if something is

F_tᵢ -measurable, then it is also F_tⱼ -measurable for
tᵢ≤tⱼ

The intuition behind this is that

F_tᵢ represents all the information available up to time tᵢ , and if something is measurable with respect to this information, then it’s also measurable with respect to all the information available at a later time tⱼ , where
tⱼ≥tᵢ

In other words, as time progresses, more information becomes available, so anything measurable at
t_ᵢ will also be measurable at any later time tⱼ.

32
Q

DEF 3.2.5
I(Y, t)

A

For Y ∈ H_T ˢᵗᵉᵖ
let us define for t ∈ [0, T]
I(Y, t) =
∫₀ᵗ Y (s) dW(s) := I(Y 1_[0,t))

33
Q

Lemma 3.2.6.
The process (I(Y, t))_t∈[0,T]
is a …

A

The process (I(Y, t))_{t∈[0,T]}
(the integral
∫₀ᵗ Y(s) dW(s))

is a continuous martingale with respect to F.

proof: see notes

(W = (W(t))_{t in [0,T]} is an F-Wiener process, F the filtration)

34
Q

Lemma 3.2.7 for Y ∈ H_T ˢᵗᵉᵖ

E[] ≤ 4E[]

A

missed out, seen similar in a proof?
For Y ∈ H_Tˢᵗᵉᵖ we have
E[sup_{t∈[0,T]} |I(Y,t)|²]
≤ 4E[∫₀ᵀ |Y(s)|² ds]

proof:

This is a direct consequence of Doob’s inequality. Indeed, since the stochastic integral is a continuous martingale, we have by Theorem 2.1.12 and Lemma 3.2.4

E[sup_{t∈[0,T]} |I(Y,t)|²]
≤ 4E[|I(Y,T)|²]
= 4E[∫₀ᵀ |Y(s)|² ds]
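A quick numerical illustration of the bound (my own sketch, not from the notes): with Y ≡ 1 we have I(Y,t) = W(t), so Doob gives E[sup_{t≤T} |W(t)|²] ≤ 4T. The Monte Carlo estimate lands well below the bound:

```python
import random, math

rng = random.Random(2)
T, n, trials = 1.0, 200, 10_000
dt = T / n

sup_sq = 0.0
for _ in range(trials):
    w, running_max = 0.0, 0.0
    for _ in range(n):
        w += rng.gauss(0.0, math.sqrt(dt))   # Brownian increment over one step
        running_max = max(running_max, abs(w))
    sup_sq += running_max ** 2               # accumulate sup_{t<=T} |W(t)|^2
sup_sq /= trials

print(sup_sq, 4 * T)   # the estimate sits comfortably below the Doob bound 4T
```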

35
Q

remark ito’s isometry

A

say we take a linear map T: R → R;

if it is linear, it has the form T(x) = ax.

If |a| = 1 then |T(x)| = |x| for all x:
T is an isometry (it preserves the norm).
Itô’s isometry says the stochastic integral preserves the L₂ norm in the same way,
which is crucial for our definition of the stochastic integral

36
Q

For Y ∈ H_T ˢᵗᵉᵖ we have
(1)
E[|I(Y )|²]
= E[ ∫₀T|Y (s)|² ds]
(2)
E[I(Y )] = 0.

LEMMA PROOF
(2)

A

E[∫₀¹ Y(s) dW(s)]
=
E[Σᵢ₌₁ⁿ Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ))]
=
Σᵢ₌₁ⁿ E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ))]
=
Σᵢ₌₁ⁿ E[E[Yₜ_ᵢ(W(tᵢ₊₁) − W(tᵢ)) | Fₜ_ᵢ]]
=
Σᵢ₌₁ⁿ E[Yₜ_ᵢ E[W(tᵢ₊₁) − W(tᵢ) | Fₜ_ᵢ]]
=
Σᵢ₌₁ⁿ E[Yₜ_ᵢ E[W(tᵢ₊₁) − W(tᵢ)]]
= 0, as E[W(tᵢ₊₁) − W(tᵢ)] = 0

37
Q

IN SUMMARY: ito integral

A

If I take a simple stochastic process,
I can define its stochastic integral,
which is a random variable
with
mean 0 and
finite second moment,
given by the expectation of the time integral of the square of the process

38
Q

if Y ∈ Hˢᵗᵉᵖ

A

If Y ∈ Hˢᵗᵉᵖ :
then Integral
∫₀¹Y (s) dW(s) ∈ L₂(Ω)

norm
‖∫₀¹ Y(s) dW(s) ‖² L₂
=
E[(∫₀¹Y (s) dW(s) )² ]
=
E[∫₀¹Y(s)² ds ]
and

E[∫₀¹Y (s) dW(s) ]=0

39
Q

Define
Hˢᵗᵉᵖ⊆ H

A

GIVEN WITHOUT THE PROOF

H = {(Y(t))_{t∈[0,1]} : Y is adapted and E[∫₀¹ Y(s)² ds] < ∞}

(1)
take all stochastic processes Y that are adapted to the filtration,

i.e. for all t ∈ [0,1], Y(t) is F_t-measurable,

i.e. the RV Y(t) only depends on the values of Brownian motion up to time t,

where the filtration is the sigma-algebra generated by BM:
σ(W(s), s ≤ t)

(2)
they satisfy that the expectation of the integrated square is finite.
We can think of ∫₀¹ Y(s)² ds as a Riemann integral (strictly it is a Lebesgue integral, but don’t worry). These processes are not simple anymore:

an element of H doesn’t have to look like a step function

40
Q

Definition 3.2.8.

We say that a stochastic process Y : Ω × [0, T] → R is in the class H_T , if

A

We say that a stochastic process Y : Ω × [0, T] → R is in the class H_T , if it is
adapted to the filtration F and it satisfies

E[∫₀T |Y (s)|²ds]
< ∞.

41
Q

suppose we are allowed to observe the path of BM up to time t. Then you could determine the value of Y_t. So if we had access to all values of the path up to time t, Y_t wouldn’t be a RANDOM var anymore!

σ(W(s), s ≤ t)

A

Y_t conditionally on F_t is no longer random: it behaves like a constant. This is what adaptedness means.

So we restrict attention to the processes that are adapted

42
Q

Itô’s Isometry

A

E[|I(Y)|²]
= E[∫₀ᵀ |Y(s)|² ds]

It is called an isometry because it shows that if we see H_Tˢᵗᵉᵖ as a
subspace of L₂(Ω × [0,T]), then the stochastic integral

I : H_Tˢᵗᵉᵖ → L₂(Ω) is an isometry.

43
Q

Proposition (cf. Lemma 3.2.9)
Let Hˢᵗᵉᵖ ⊆ H.

There exists a sequence of simple processes approximating Y in expectation:

A

Let (Y(t))_{t∈[0,1]} ∈ H.

Then there exists a sequence (Yₙ(t))_{t∈[0,1]} ∈ Hˢᵗᵉᵖ, n ∈ N,

s.t.
lim_{n→∞}
E[∫₀¹ (Yₙ(s) − Y(s))² ds] = 0

(a numerical illustration follows below)
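A numerical sketch of this approximation (my own illustration): take Y(t) = W(t) on [0,1] and the piecewise-constant process Yₙ(t) = W(⌊nt⌋/n), frozen at the last partition point (its heights are unbounded, so strictly it lies in H rather than Hˢᵗᵉᵖ, but it illustrates the idea). The exact error is E[∫₀¹ (Yₙ − Y)² ds] = 1/(2n) → 0:

```python
import random, math

rng = random.Random(3)

def l2_error(n, steps=1000, trials=1000):
    """Monte Carlo estimate of E int_0^1 (Y_n(s) - W(s))^2 ds, Y_n(s) = W(floor(ns)/n)."""
    dt = 1.0 / steps
    stride = steps // n              # grid points i/n sit every `stride` steps
    acc = 0.0
    for _ in range(trials):
        w, frozen, err = 0.0, 0.0, 0.0
        for k in range(steps):
            if k % stride == 0:
                frozen = w           # Y_n freezes W at the last partition point
            err += (w - frozen) ** 2 * dt
            w += rng.gauss(0.0, math.sqrt(dt))
        acc += err
    return acc / trials

for n in (2, 4, 8):
    print(n, l2_error(n), 1 / (2 * n))   # estimate vs exact value 1/(2n)
```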

44
Q

Let Y∈H

A

Let Y ∈ H
(That means Y is a
stochastic process which is adapted:
it doesn’t depend on values of BM from the future, only the past,
and it has a finite expectation of the integral of its square)

45
Q

Lemma 3.2.9.

H_T ˢᵗᵉᵖ subset of H

A

If Y ∈H_T
then there exists
Yₙ ∈ H_T ˢᵗᵉᵖ, for n ∈ N such that

lim_{n→∞}
E[
∫₀T |Y (s) − Yₙ(s)|²ds]
= 0

46
Q

Proposition 3.2.10.

A

Let Y ∈ H and let (Yₙ)ₙ₌₁^∞ ⊂ H_Tˢᵗᵉᵖ be the sequence from Lemma 3.2.9.

Then there exists a random variable Z ∈ L₂(Ω) such that lim_{n→∞} ‖I(Yₙ) − Z‖_L₂ = 0.

47
Q

Original aim was to…

A

define the integral of a function by
considering a sequence of (simple) functions converging to it,
so that their integrals converge to the integral of the function.

we don’t rigorously prove the lemma

(for a particular omega, we apply this to one realisation, i.e. to one function of time from the family we consider)

48
Q

Suppose we have a sequence of reals aₙ;
it is Cauchy if

A

lim_{n,m→∞} |a_n − a_m| = 0,

i.e. for all epsilon > 0 there exists N s.t. for all n,m ≥ N, |a_n − a_m| < epsilon:

the terms get closer together.

Every Cauchy sequence converges in a complete metric space, in particular in R;
thus there always exists a limit.

This also works in the L₂ space, which is complete.

49
Q

If you take a sequence of RVs (X_n) which is cauchy

in L_2 space it means

A

lim_{n,m→∞}
‖X_n − X_m‖_L₂
=
lim_{n,m→∞}
(E[(X_n − X_m)²])^{1/2}
= 0.

Thus
lim_{n,m→∞}
E[(X_n − X_m)²] = 0.

Since L₂(Ω) is complete, there then exists a RV Z in L₂(Ω) s.t.

‖Z − X_n‖_L₂ → 0,

i.e.
lim_{n→∞}
E[(X_n − Z)²] = 0

50
Q

Using this to construct a stochastic integral

A

Suppose we have a process in H:
Y ∈ H.
Then by the lemma, there exist
Yₙ ∈ H_Tˢᵗᵉᵖ (simple)
s.t.
lim_{n→∞}
E[∫₀¹ |Y(s) − Yₙ(s)|² ds]
= 0
(this also implies the sequence is Cauchy…)

I look at
∫₀¹ Yₙ(s) dW(s)
and we show it is a Cauchy sequence…

Thus there exists
Z in
L₂(Ω)
s.t.
E[(∫₀¹ Yₙ(s) dW(s) − Z)²]

51
Q

∫₀¹ Yₙ(s)dW(s)
and we show it is a cauchy sequence

A

∫₀¹ Yₙ(s) dW(s),

and we show it is a Cauchy sequence:

consider ∫₀¹ Yₙ(s) dW(s) − ∫₀¹ Yₘ(s) dW(s)
(in the L₂ norm)

E[(∫₀¹ Yₙ(s) dW(s) − ∫₀¹ Yₘ(s) dW(s))²]
(the stochastic integral is linear)
=
E[(∫₀¹ (Yₙ(s) − Yₘ(s)) dW(s))²]
by the isometry, this is the expectation of the integral of the square:
= E[∫₀¹ (Yₙ(s) − Yₘ(s))² ds]
inserting Y and using the elementary inequality (a+b)² ≤ 2a² + 2b² (triangle inequality),
≤
2E[∫₀¹ (Yₙ(s) − Y(s))² ds]
+
2E[∫₀¹ (Y(s) − Yₘ(s))² ds],
and both terms tend to 0
as n, m tend to infinity.

This means the sequence of stochastic integrals is Cauchy
in L₂, which is complete; thus it converges.

The limit is a RV Z

52
Q

SUMMARY LECTURE

A

(Ω,F,P)
filtration
F = {Fₜ}_ ₜ in [0,1]
BM
(W(t))_t in [0,1],
F-wiener process

our goal was to construct
stochastic integral
∫₀T Y(s) dW(s)
we started looking at simple processes
STEP 1

(Y(t))_{t∈ [0,1]} ∈ H_T ˢᵗᵉᵖ

Y(t) =Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁] (t)
THUS

∫₀ᵀ Y(s) dW(s) :=
Σᵢ₌₁ⁿ Yₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ)) ∈ L₂(Ω),
which is a RV in L₂,
which we saw had some properties

STEP 2
This was for a simple process, then we extended this to approximate
Consider class H_T , if it is
adapted to the filtration F and it satisfies

E[∫₀T |Y (s)|²ds]
< ∞.

If we take a Y in H_T then there exists a sequence of simple processes
Yₙ ∈ H_T ˢᵗᵉᵖ (simple)
s.t
lim_{n→∞}
E[∫₀ᵀ |Y(s) − Yₙ(s)|² ds]
= 0

we used this to construct the stochastic integral for Y
By seeing the integrals of sequence form a cauchy sequence (RVs) converging in L_2

53
Q

Exercise 3.2.11.

Show that Z does not depend on the approximating sequence (Yₙ)ₙ₌₁∞. That is,
show that if (Y˜ₙ)ₙ₌₁^∞ ⊂ H_Tˢᵗᵉᵖ
is another sequence satisfying (3.9), then

lim_{n→∞} ‖I(Y˜ₙ) − Z‖_L₂ = 0

A

(sketch: by linearity and Itô’s isometry, ‖I(Y˜ₙ) − I(Yₙ)‖²_L₂ = E[∫₀ᵀ (Y˜ₙ(s) − Yₙ(s))² ds] → 0 by the triangle inequality, and ‖I(Yₙ) − Z‖_L₂ → 0)

54
Q

SUMMARY

∫₀ᵀ Y(s) dW(s) :=
Σᵢ₌₁ⁿ Yₜ_ᵢ (W(tᵢ₊₁) − W(tᵢ)) ∈ L₂(Ω)
which is a RV in L_2

which we saw had some properties:

∫₀T Y(s) dW(s)

PROPERTIES FOR Y∈ H_T ˢᵗᵉᵖ

A

linear operator:
if Y₁, Y₂ ∈ H_Tˢᵗᵉᵖ and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂)

Exp = 0:
E[∫₀ᵀ Y(s) dW(s)] = 0

var =
E[(∫₀ᵀ Y(s) dW(s))²]
= E[∫₀ᵀ Y(s)² ds]

we will see
(∫₀ᵗ Y(s) dW(s))_{t in [0,T]}
is a martingale

55
Q

Take a stochastic process
Y∈ H_T then PROPERTIES

A

linear operator:
if Y₁, Y₂ ∈ H_T and a₁, a₂ ∈ R, then we have that
I(a₁Y₁ + a₂Y₂) = a₁I(Y₁) + a₂I(Y₂)

Exp = 0:
E[∫₀ᵀ Y(s) dW(s)] = 0

var =
E[(∫₀ᵀ Y(s) dW(s))²]
= E[∫₀ᵀ Y(s)² ds]

56
Q

If we take
(Y(t))_{t in [0,1]} in Hₜˢᵗᵉᵖ,

I: Hₜˢᵗᵉᵖ → L₂(Ω),

I(Y) =
∫₀¹ Y(s) dW(s)…

A

I(Y) =
∫₀ᵀ Y(s) dW(s)

(input: a stochastic process)

(output: a random variable)

I want to consider now (for little t)

∫₀ᵗ Y(s) dW(s):
for each t I have a RV,
so I consider a family of RVs

57
Q

Consider
∫₀ᵗ Y(s) dW(s)
for each t i have a RV
so I consider

(∫₀ᵗ Y(s) dW(s))_ t in [0,T]

A

a family of RVs
which is a stochastic process

58
Q

TRUE OR FALSE?
Take Y∈ H_T then
1_[0,t] Y ∈ H_T
STOCHASTIC PROCESS

a new stochastic process

A

TRUE. If I take a stochastic process

(Y(r))_{r in [0,T]} ∈ H_T,

then
I get a new stochastic process
(1_[0,t](r) Y(r))_{r in [0,T]} ∈ H_T.

Note:
This new stochastic process equals the process Y for r in [0,t],

and is 0 if r > t:

=
{Y(r) if r ≤ t
{0 if r > t

So trajectories in this case look like:
those of a simple process (piecewise constant),

but truncated: the same up to time t, and 0 afterwards.

  • because it is ∈ H_T we can integrate it, and the value is

∫₀ᵗ Y(s) dW(s) =
∫₀ᵀ 1_[0,t](s) Y(s) dW(s)

59
Q

Consider stochastic process

(∫₀ᵗ Y(s) dW(s))_ t in [0,1]
PROPERTIES

A

(1) This stochastic process is F-adapted:

for any time t, the stochastic integral
∫₀ᵗ Y(s) dW(s)
is F_t-measurable

(2) Continuous:
P(t ↦ ∫₀ᵗ Y(s) dW(s) is continuous) = 1

So it is always a continuous function:
for each t it is a RV;
for each omega it is a function of time, and this function turns out to be continuous

60
Q

RECAP martingale

A

Consider (Ω,F,P) with
filtration
F = {Fₜ}_{t in [0,T]}.

Then a stochastic process
(M(t))_{t in [0,T]}

is a martingale w.r.t. the filtration F if

(1) M(t) is F_t-measurable for all t in [0,T] (ADAPTED)

(2) INTEGRABLE (finite first moment):
E[|M(t)|] < infinity for all t

(3) E[M(t)|F_s] = M(s) for all s ≤ t
(the conditional expectation of a future value given all the information up to the present time equals the present value; in discrete time:
E[Xₜ₊₁|X₁,X₂…Xₜ] = Xₜ)

the third property is the important one:

61
Q

Third: (3) E[M(t)|F_s] = M(s)

why this property is the important one: martingales

A

Consider conditional expectation via its geometric interpretation as a projection:

E[M(t)|F_s] is the projection of M(t) onto the space of F_s-measurable random variables.

F_s-measurable variables are random variables which at time s aren’t random anymore.

This condition tells us that if t is in the future and s is the present,

the projection of M(t) onto the present (the projection is the best possible approximation we can get), i.e. the best estimate of the future value, is the present value M(s). Taking expectations shows that E[M(t)] is constant in t.

62
Q

Third property of a martingale

A

Let Mₜ be a martingale with respect to a filtration {Fₜ}ₜ≥₀

and let S be a stopping time (bounded by t).
Conditional exp (optional sampling):
E[M(t)|F_S] = M(S)

This property states that the conditional expectation of M(t), given the information up to time S (i.e., the sigma-algebra F_S), is equal to the value of the martingale
M at time S.

It essentially means that at any stopping time S, the current value of the martingale M provides the best estimate for its future value. In other words, knowing the current value of M provides no additional information about its future evolution beyond what is already known up to time S.

63
Q

If we have a stoch process X(t) which is F-adapted (X(t) is F_t-measurable):

think of F_t-measurable RVs as those that at time t aren’t random anymore

A

E.g. if X is the price of a stock:

I claim that the price of the stock is adapted to the filtration, i.e. F_t-measurable;

this means that if t is the present, then the value is not random
(I know its current price).

Tomorrow’s price is random,
but after waiting a day it is not random anymore. This is what adapted means.

If I make a conjecture about the stock tomorrow, the best option is to go for the price of the stock today.

64
Q

If we take the expectation of both sides of the martingale property
E[M(t)|F_s] = M(s)

A

E[E[M(t)|F_s]] = E[M(s)]
implies
E[M(t)] = E[M(s)],

since the expectation of a conditional expectation is the expectation;

so E[M(t)] is constant, as I can choose s = 0

65
Q

The stochastic integral is a martingale? TRUE OR FALSE

A

TRUE

(∫₀ᵗ Y(s) dW(s))_ t in [0,T] is an F-martingale

66
Q

BROWNIAN MOTION is a martingale? TRUE OR FALSE

A

TRUE

(∫₀ᵗ Y(s) dW(s))_{t in [0,T]} is an F-martingale;

using Y ≡ 1, the stochastic integral becomes
∫₀ᵗ dW(s)
= ∫₀ᵀ 1_[0,t](s) dW(s)
= W(t) − W(0) = W(t)

67
Q

PROOF OF:
LEMMA
(∫₀ᵗ Y(s) dW(s))_ t in [0,T] is an F-martingale

A

We only consider Y in H_Tˢᵗᵉᵖ for now (it is true for H_T). We check the three properties of a martingale.

We know by defn:
Y(r) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁)(r),
where the Yₜ_ᵢ are Fₜ_ᵢ-measurable (*)

The stochastic integral is

∫₀ᵗ Y(s) dW(s)
= ∫₀ᵀ 1_[0,t](r) Y(r) dW(r).

Multiplying (*) by
1_[0,t]
gives a simple process, which we know how to integrate:

Y(r)1_[0,t](r) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ∧t, tᵢ₊₁∧t)(r)

∫₀ᵗ Y(s) dW(s) :=
∫₀ᵀ Y(r)1_[0,t](r) dW(r)
=
Σᵢ₌₁ⁿ Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t)),
which is a stochastic process depending on t.

We want to show this is a martingale; a finite sum of martingales is a martingale, so it suffices to treat one term
Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t)).
(1) Is it F_t-measurable for all t in [0,T] (ADAPTED)?
W is a BM adapted to the filtration;
Yₜ_ᵢ is Fₜ_ᵢ-measurable, and I need the whole term to be F_t-measurable.
For t ≤ tᵢ:
why would I know the value, if Yₜ_ᵢ depends on the future time tᵢ? Because then tᵢ∧t = t and tᵢ₊₁∧t = t, so the increment is W(t) − W(t)
= 0;
0 is a constant,
so the term is F_t-measurable.
For t > tᵢ:
Yₜ_ᵢ is Fₜ_ᵢ-measurable, and now Fₜ_ᵢ ⊂ Fₜ;
W(tᵢ∧t) is Fₜᵢ∧ₜ-measurable,
W(tᵢ₊₁∧t) is Fₜᵢ₊₁∧ₜ-measurable, and

Fₜᵢ∧ₜ ⊂ Fₜ,
Fₜᵢ₊₁∧ₜ ⊂ Fₜ.
So all factors are Fₜ-measurable (differences and products of Fₜ-measurable RVs are still Fₜ-measurable).

(2)
E[|Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t))|] < infinity?
Yₜ_ᵢ is uniformly bounded (finite L∞ norm by defn), so we can take it out:
≤ ‖Yₜ_ᵢ‖_L∞ E[|W(tᵢ₊₁∧t) − W(tᵢ∧t)|]
by Hölder’s inequality (exercise: ‖·‖_Lp ≤ ‖·‖_Lq for p ≤ q on a probability space)
≤ ‖Yₜ_ᵢ‖_L∞ E[|W(tᵢ₊₁∧t) − W(tᵢ∧t)|²]^{1/2}
but this second moment is the variance of the increment
= ‖Yₜ_ᵢ‖_L∞ (tᵢ₊₁∧t − tᵢ∧t)^{1/2} < ∞

(3) E[Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t)) | F_s] = Yₜ_ᵢ (W(tᵢ₊₁∧s) − W(tᵢ∧s)) for s ≤ t?
Writing t₁ = tᵢ, t₂ = tᵢ₊₁, the cases are (t always greater than s):
CASE 1: s < t < t₁ < t₂
CASE 2: s < t₁ < t < t₂
CASE 3: t₁ < s < t < t₂
(worked out in a later card)

68
Q

multiplying simple process by characteristic function

A

Y(r) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁)(r)
(*)

multiplying (*) by
1_[0,t]
gives
Y(r)1_[0,t](r) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ,tᵢ₊₁)(r) 1_[0,t](r);
as the product of two characteristic functions is the characteristic function of the
intersection (minimum of the endpoints, written ∧):
Y(r)1_[0,t](r) = Σᵢ₌₁ⁿ Yₜ_ᵢ 1_[tᵢ∧t, tᵢ₊₁∧t)(r),

which is a simple process

69
Q

measurable functions

A

sum,
difference,
composition,
multiplication: all still measurable

70
Q

Notes not exactly matching lectures, review when write up for this part

A

:)
<3

71
Q

Theorem 3.2.15
statements about
Y, Z ∈ H_T and a, b ∈ R

A

Let Y, Z ∈ H_T and a, b ∈ R.
(i) (linearity) Almost surely,
I(aY + bZ, t) = aI(Y, t) + bI(Z, t)
for all t ∈ [0, T]

(ii) For all t ∈ [0, T],
E[I(Y, t)] = 0

(iii) (Itô’s isometry) For all t ∈ [0, T],
E[|I(Y, t)|²] = E[∫₀ᵗ |Y(s)|² ds]

(iv) (I(Y, t))_{t≥0} is a martingale with respect to F.
(v) Doob’s inequality holds:
E[sup_{t∈[0,T]} |I(Y, t)|²] ≤ 4E[∫₀ᵀ |Y(s)|² ds]

72
Q

SHOW that
(3) E[Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t)) | F_s] = Yₜ_ᵢ (W(tᵢ₊₁∧s) − W(tᵢ∧s))
for s ≤ t, in the cases (writing t₁ = tᵢ, t₂ = tᵢ₊₁):
CASE 1: s < t < t₁ < t₂
CASE 2: s < t₁ < t < t₂
CASE 3: t₁ < s < t < t₂

A

Define m(t) = Yₜ_ᵢ (W(tᵢ₊₁∧t) − W(tᵢ∧t)) and compute E[m(t)|F_s], writing t₁ = tᵢ, t₂ = tᵢ₊₁.

CASE 1: s < t < t₁ < t₂
W(t₂∧s) − W(t₁∧s) = 0 and likewise for t, so
m(s) = 0, m(t) = 0:
trivially satisfied,
E[m(t)|F_s] =
E[0|F_s] = 0 = m(s)

CASE 2: s < t₁ < t < t₂
W(t₂∧s) − W(t₁∧s) = 0
W(t₂∧t) − W(t₁∧t) = W(t) − W(t₁)
m(s) = 0
m(t) = Y_{t₁}(W(t) − W(t₁))
E[m(t)|F_s] =
E[Y_{t₁}(W(t) − W(t₁))|F_s]
using the tower property: s < t₁, so F_s ⊆ F_{t₁},
= E[E[Y_{t₁}(W(t) − W(t₁))|F_{t₁}]|F_s]
as Y_{t₁} is F_{t₁}-measurable,
=
E[Y_{t₁}E[W(t) − W(t₁)|F_{t₁}]|F_s];
W(t) − W(t₁) is indep of F_{t₁}, so the cond exp is just the exp, which is 0:
= 0 = m(s)

CASE 3: t₁ < s < t < t₂
W(t₂∧t) − W(t₁∧t) = W(t) − W(t₁)
m(t) = Y_{t₁}(W(t) − W(t₁))
m(s) = Y_{t₁}(W(s) − W(t₁))

E[Y_{t₁}(W(t) − W(t₁))|F_s]:
s is larger than t₁,
Y_{t₁} is F_{t₁}-measurable,
F_{t₁} ⊆ Fₛ, so
Y_{t₁} is Fₛ-measurable,
so I can take it out of the conditional expectation
=
Y_{t₁} E[W(t) − W(t₁)|F_s]
=
Y_{t₁} (E[W(t)|Fₛ] − E[W(t₁)|Fₛ]).
Since t₁ < s, W(t₁) is Fₛ-measurable, so E[W(t₁)|Fₛ] = W(t₁);
and E[W(t)|Fₛ] = W(s)
because BM is a martingale.
Hence E[m(t)|F_s] = Y_{t₁}(W(s) − W(t₁)) = m(s).

73
Q

If we take a martingale
M(t), and
look at
M(t ∧ t’) for a
fixed positive time t’

A

it is also a martingale:
t’ is fixed, t varies.
(The same holds with t’ replaced by a stopping time τ: the stopped process is again a martingale.)

74
Q

Doob’s inequality

A

Y ∈ H_T, then
E[sup_{t∈[0,T]} |∫₀ᵗ Y(s) dW(s)|²]
≤ 4E[∫₀ᵀ Y(s)² ds]

(without the supremum, i.e. with terminal time capital T, and without the 4, we would have the isometry;

instead we take the supremum over all times and get this bound)

75
Q

stochastic integral
linear operator?
martingale?

A

True

(we proved them only in the class H_Tˢᵗᵉᵖ; they hold for H_T)

76
Q

stochastic integral:
given a process in H_T and integrating w.r.t. W, we get

A

a random variable which has 0 mean
and
variance = the expectation of the integral of the square,
E[∫₀ᵀ Y(s)² ds]

77
Q

Consider… filtration
F = {Ƒₜ}ₜ∈[₀,T]
generated by BM:
Ƒₜ = σ(W(s)⁻¹(A), A∈B(R), s ≤ t)

A

filtration
F = {Ƒₜ}ₜ∈[₀,T]
generated by BM:
Ƒₜ = σ(W(s)⁻¹(A), A∈B(R), s ≤ t),
the sigma-algebra generated by inverse images of Borel sets.

If a random variable is measurable w.r.t. F_t, it means that if we know all the values of BM W(s) for s up to time t, then in principle we know its value, so it is no longer a random var.

78
Q

If M(t) is a martingale wrt filtration F

F = {Ƒₜ}ₜ∈[₀,T]
generated by BM:
Ƒₜ = σ(W(s)⁻¹(A), A∈B(R), s ≤ t)

A

If M(t) is a continuous F-martingale s.t.
E[|M(t)|²] < ∞,

then there exists a unique process Y ∈ H_T
s.t.
M(t) =
M(0) + ∫₀ᵗ Y(s) dW(s)

79
Q

stochastic integrals and martingales

A

every stochastic integral (of an integrand in H_T) is a martingale.

Conversely, if we have a martingale which is continuous in time and square integrable, then we can find an integrand Y s.t. M(t) is M(0) plus the stochastic integral of Y w.r.t. W

80
Q

Theorem 3.2.17
Martingale representation thm

A

Let G be the augmentation of the natural filtration of W.
Let (M(t))_{t≤T} be a continuous martingale with respect to G such that M(T) ∈ L₂(Ω).

Then, there exists a unique (Y(t))_{t≤T} ∈ H_T, adapted to G,
s.t.
for each t ∈ [0, T], with probability one,

M(t) =
M₀ + ∫₀ᵗ Y(s) dW(s)

81
Q

Exercise 3.2.18. Let Z be a random variable with E[|Z|] < ∞.

Show that M(t) := E[Z|Fₜ] is a
martingale with respect to (Fₜ)ₜ≤T .

A

Take a RV Z which is integrable and any filtration (Fₜ).
(1) adapted to the filtration: M(t) is Fₜ-measurable
by defn of E[Z|Fₜ] as a conditional expectation given Fₜ

(2) integrable:
E[|E[Z|Fₜ]|]
≤ E[E[|Z||Fₜ]]
= E[|Z|]
< ∞,
which we know is finite
(you should know this!!!: convex functions and conditional expectations, conditional Jensen:
|E[Z|Fₜ]| ≤ E[|Z||Fₜ])
We also know E[E[X|G]] = E[X]

(3) martingale property: for s ≤ t, by the tower property (F_s ⊆ Fₜ),
E[M(t)|F_s] = E[E[Z|Fₜ]|F_s] = E[Z|F_s] = M(s)

82
Q

E[E[X|G]]=E[X]

A

conditional expectation def:
the cond expectation is a G-measurable random variable
such that
E[1_A X] = E[1_A E[X|G]]
for all A in G.

so we choose an A s.t. this gives the claim:
A = Ω

83
Q

we started with H^STEP

A

we extended to class H
now I want to extend to a larger class S

84
Q

Definition 3.4.1.
STOPPING TIME

A

A map 𝜏 : Ω → [0, +∞] is called a stopping time with respect to a given filtration F if
for all t ∈ [0, T] we have {𝜏 > t} ∈ Fₜ

  • 𝜏 is a RV which takes non-negative values
  • it has the property that the event
    {𝜏 > t} = {ω∈Ω: 𝜏(ω) > t} ∈ Fₜ for all t ≥ 0

so the stopping times are the RVs 𝜏 as above
for which the event {𝜏 > t} is in Fₜ for all t ≥ 0.
The name is related to gambling: a strategy for getting out, i.e. when do we stop!

85
Q

stopping time diagram

meaning of conditions

A

If Fₜ is the flow of information,

F_s⊆ F_t
—-s———-t—–>

time line: increasing flow of information

recall that a RV that is Fₜ-measurable is, at time t, no longer random.

the event
{𝜏 > t} = {ω∈Ω: 𝜏(ω) > t} ∈ Fₜ for all t ≥ 0 means that
* the event is Fₜ-measurable,
* and at time t we know whether 𝜏 > t or not, as the event is ∈ Fₜ

* say tau describes the random time at which an event happens: we want to know at time t whether it has happened, and the event {𝜏 > t} means it has not happened yet up to time t!
* this describes mathematically that, only by looking at information from the past, I can determine whether the event corresponding to the time tau has happened or not

86
Q

Filtrations measurable before or after

A

if something is F_tᵢ-measurable, then it is also F_tⱼ-measurable for tᵢ ≤ tⱼ.

When we say X is F_tᵢ-measurable, it means that
X is determined by the information available at time tᵢ. So, X requires no information that becomes available after time tᵢ.

Now, if tᵢ ≤ tⱼ, then the sigma-algebra
F_tᵢ is a subset of F_tⱼ,
meaning that the information available at time
tᵢ is also available at time tⱼ. Therefore, any random variable that is F_tᵢ-measurable is also F_tⱼ-measurable for tᵢ ≤ tⱼ.

In other words, if X depends only on information available up to time tᵢ, then it automatically depends only on information available up to time tⱼ whenever tᵢ ≤ tⱼ.

.

87
Q

Suppose we have a continuous process (X(t))_{t∈[0,T]} adapted to the filtration (F_t)_{t∈[0,T]}:

Exercise 3.4.2. Let X : Ω × [0, T] → R be a continuous process adapted to the filtration F. Show that
τ_b := inf{t ∈ [0, T] : X(t) ≥ b},
where recall that inf ∅ = +∞.

Show that τ_b is a stopping time

A

diagram: a horizontal level y = b; the path of the stochastic process X may cross b;
map t ↦ X(t) as a function of time on [0, ∞).

I look at the set of times for which X(t) ≥ b,
{t ∈ [0, T] : X(t) ≥ b},
and take the infimum: the first time the stochastic process hits the level b is τ_b(ω₁),
τ_b := inf{t ∈ [0, T] : X(t) ≥ b}.

Now this is random: for a different path ω₂ the value τ_b(ω₂) will be different.
I CLAIM IT IS A STOPPING TIME (I suggest you check on your own, don’t want to spend too much time):
in general, if you have an ADAPTED PROCESS and a random time of this form (first time the process hits a level), then it is a STOPPING TIME:

We need to show for all t ∈ [0, T] that {𝜏 > t} ∈ Fₜ.
(𝜏 > t means the largest value X takes on [0,t] is something strictly less than b)
{𝜏 > t} = {sup_{s≤t} X(s) < b};
since X is continuous, the supremum can be taken over rationals,
= {sup_{s∈[0,t]∩Q} X(s) < b};
the rationals are countable, so we can use a countable intersection,
= ∩_{s∈[0,t]∩Q} {X(s) < b};
my process is adapted, so these sets are ⊂ F_s ⊂ F_t.

Since X is adapted, X(s) is F_s-measurable. Since F_s ⊂ F_t for s ≤ t, we get that for each
s ∈ [0, t] ∩ Q, X(s) is F_t-measurable. Since [0, t] ∩ Q is a countable set, by Exercise 1.1.13 we get that sup_{s∈[0,t]∩Q} X(s) is F_t-measurable, which implies that {sup_{s∈[0,t]∩Q} X(s) < b} ∈ F_t.
Consequently, {τ_b > t} ∈ F_t. Hence τ_b is indeed an F-stopping time.

An example of why in gambling you might tell yourself “the first time I hit £100 I get out of here”: most probably you lose all your money first (depending on how much you start with); tau can take the value +infinity, i.e. you may wait forever.
(a simulation sketch follows below)
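A simulation sketch of τ_b (my own illustration; b = 1, T = 1 and the Euler grid are arbitrary choices). For X = W, the fraction of paths hitting b by time T can be compared with the reflection-principle value P(τ_b ≤ T) = 2P(W(T) ≥ b):

```python
import random, math

rng = random.Random(4)
b, T, n, trials = 1.0, 1.0, 1000, 20_000
dt = T / n

hits = 0
for _ in range(trials):
    w = 0.0
    for _ in range(n):
        w += rng.gauss(0.0, math.sqrt(dt))
        if w >= b:          # first time the discretised path reaches the level b
            hits += 1
            break

# Reflection principle: P(tau_b <= T) = 2 * P(W(T) >= b) = 2 * (1 - Phi(b / sqrt(T)))
exact = 2 * (1 - 0.5 * (1 + math.erf(b / math.sqrt(2 * T))))
print(hits / trials, exact)   # the discrete grid slightly undercounts crossings
```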

88
Q

Exercise 3.4.3. Let τ1, τ2 be stopping times with respect to the filtration F. Show that τ1 ∧ τ2 is
also a stopping time with respect to the filtration F.

A

MINIMUM OF TWO STOPPING TIMES IS A STOPPING TIME

not discussed:
Let 𝜏₁ and 𝜏₂ be stopping times w.r.t. the filtration F;
then
{𝜏₁ > t}, {𝜏₂ > t} ∈ F_t ∀t ∈ [0, T].

{𝜏₁ ∧ 𝜏₂ > t} = {𝜏₁ > t and 𝜏₂ > t}
= {𝜏₁ > t} ∩ {𝜏₂ > t},
which is in F_t, because σ-algebras are closed under (countable, and thus also finite) intersection.

We have shown that {𝜏₁ ∧ 𝜏₂ > t} ∈ F_t for all t ∈ [0, T]. Therefore 𝜏₁ ∧ 𝜏₂ is indeed an
F-stopping time.

89
Q

Exercise 3.4.4. Let τ be a stopping time. Show that for each t ≥ 0 we have {𝜏 ≥ t} ∈ F_t. Show that for all t ∈ [0, T], the random variables
1_[0,𝜏)(t) and 1_[0,𝜏](t) are F_t-measurable.

Conclude that they are both F-adapted stochastic processes

(to show that they are F ⊗ B([0, T])-measurable, recall Remark 2.1.2)

A

SPECIFICALLY GONE THROUGH?????
1_[0,𝜏(ω))(t) is a stochastic process (depends on ω and on t);
it looks like: 1 for t < 𝜏(ω), 0 for t ≥ 𝜏(ω).

I claim it is adapted, i.e. F_t-measurable for all t ≥ 0.

fix t ≥ 0:
1_[0,𝜏(ω))(t) =
{1 if t < 𝜏(ω)
{0 if t ≥ 𝜏(ω)

The indicator can only take the values 0 and 1, so for any Borel set B its inverse image is

(1_[0,𝜏)(t))⁻¹(B)
= {ω ∈ Ω : 1_[0,τ(ω))(t) ∈ B}
=
{Ω if 0, 1 ∈ B
{∅ if 0, 1 ∉ B
{{t < τ} if 1 ∈ B, 0 ∉ B
{{t < τ}^c if 0 ∈ B, 1 ∉ B

Since Fₜ is a σ-algebra: Ω, ∅ ∈ Fₜ.
Furthermore, since τ is a stopping time, {t < τ} = {τ > t} ∈ Fₜ. So, since σ-algebras are closed under complementation, we also have {τ > t}^c ∈ Fₜ. Therefore (1_[0,𝜏)(t))⁻¹(B) ∈ Fₜ ∀B ∈ B([0, ∞]),

hence it is F_t-measurable for each t, i.e. adapted.

some more info in the exercise notes

90
Q

Exercise 3.4.5. Let X : Ω × [0, T] → R be a continuous and F-adapted process and take real
numbers a < b. Let τ := inf{t ∈ [0, T] : X(t) ∉ (a, b)} (inf ∅ := ∞). Prove that τ is an
F-stopping time.
The stopping time from the exercise above is called the first exit time of X(t) from (a, b).

A

Solution: combine Exercise 3.4.2 and Exercise 3.4.3 (τ is the minimum of the first times X leaves (a, b) through either level).

missed?

The stopping time from the exercise above is called the first exit time of X(t) from (a, b).

91
Q

Definition 3.5.1
Extending the stochastic integral

We extend to a class of processes

A

We know the classes
Hₜˢᵗᵉᵖ ⊆ H_T ⊆ S_T.

We denote by S_T the class of all F-adapted processes Y : Ω × [0, T] → R such
that, almost surely,
∫₀ᵀ |Y(s)|² ds < ∞,

i.e.
S_T = {(Y(t))_{t∈[0,T]} : Y adapted and P(∫₀ᵀ |Y(s)|² ds < ∞) = 1}.

This is a larger class than H_T: instead of requiring the expectation of the integral to be finite, we only require the integral to be finite with probability one.

We want to extend the stochastic integral to integrands from the class S_T.

92
Q

Why do we have inclusion?
H_T ⊆S_T

A

a process in H_T has E[∫₀ᵀ Y(s)² ds] < ∞, which implies the integral is finite almost surely, i.e. with probability 1;

thus it is also a process in S_T

93
Q

take any RV X>0
E[X]< infinity implies P(X<infinity)=0?
TRUE OR FALSE

A

FALSE

for any RV X ≥ 0,
E[X] < ∞
implies
P(X < ∞) = 1.

If the probability that X is infinite were positive, the RV would attain infinity with positive probability, which would force the expectation to be infinite.

So if the expectation is finite, the RV is finite almost surely.

94
Q

Theorem 3.5.2. Let Y ∈ H_T and let τ be a stopping time bounded by T. Then, almost surely

A

Let Y ∈ H_T and let 𝜏 be a stopping time bounded by T. Then, almost surely,
I(Y, τ) = I(1_[0,𝜏]Y) = I(1_[0,𝜏)Y)

95
Q

We want to extend the stochastic integral to integrands from the class S_T .

A

Take Y ∈ S_T and n ∈ N, and

define
𝜏_n = inf {t ≥ 0 : ∫₀ᵗ (Y(s))² ds ≥ n},

the infimum of the times at which the integral crosses the level n.
The map t ↦ ∫₀ᵗ Y(s)² ds starts from 0 and is non-negative and non-decreasing
(sketch: not jagged, increasing).

Y is adapted, thus the integral process is adapted, thus 𝜏_n is a stopping time.

Now truncate: set Yₙ(t) := 1_[0,𝜏_n](t) Y(t). I claim Yₙ belongs to H_T:
we started with something in S_T and truncated using the indicator function,
Yₙ ∈ H_T.
We have to check 2 things:
1) adaptedness: checked;
we know Y by defn is adapted, 𝜏_n is a stopping time, and so Yₙ is adapted.

2) the expectation of the integral of Yₙ² is finite,
which we check:
E[∫₀ᵀ |Yₙ(s)|² ds]
= E[∫₀ᵀ 1_[0,𝜏_n](s) |Y(s)|² ds]
= E[∫₀^{T∧𝜏_n} Y(s)² ds]
≤ n,
so finite, by the definition of 𝜏_n.

So we have shown Yₙ belongs to H_T, and we can take its stochastic integral.

One then shows the integrals of the Yₙ converge in probability and defines the integral of Y as the limit;
more details in the notes, I won’t insist on that.
The result is not a martingale in general: for Y in S_T it is a local martingale.

96
Q

1_[0,𝜏)(s)Y(s)???

If we take a process in H_T
and multiply it by the indicator function of a stopping-time interval, then the product is also in H_T
A

(Y ∈ H_T means Y is adapted and the expectation E[∫₀ᵀ |Y(s)|² ds] is finite;

𝜏 a stopping time means {𝜏 > t} ∈ F_t.)

We checked already that the indicator 1_[0,𝜏)(s) is adapted;
for the new process
1_[0,𝜏)(s)Y(s):
Y is adapted by defn,

thus
1_[0,𝜏)(s)Y(s) is adapted, i.e.
1_[0,𝜏)(s)Y(s) is F_s-measurable for all s ≥ 0.

E[∫₀ᵀ |1_[0,𝜏)(s)Y(s)|² ds] =
E[∫₀^{𝜏∧T} (Y(s))² ds].
Note the upper limit 𝜏∧T = min{𝜏, T} is random (𝜏 is random): we can’t just pass the expectation through the integral as for a deterministic time.
But we are integrating something positive,
≤ E[∫₀ᵀ (Y(s))² ds] < ∞,
since we integrate over a region contained in [0,T]

97
Q

TRUE OR FALSE
y∈ H_T
then
1_[0,𝜏] Y ∈ H_T

for stopping time 𝜏

A

true, shown prev.

Since 1_[0,𝜏] Y ∈ H_T,
it means I can take the STOCHASTIC INTEGRAL OF IT.

So set M(t) = ∫₀ᵗ Y(s) dW(s).
Then we have
M(𝜏) = ∫₀ᵀ 1_[0,𝜏](s) Y(s) dW(s)

DANGER:
this is not obvious: it is trivial when the upper limit is a deterministic t, but since 𝜏 is random we wouldn’t a priori expect it.

This also equals
∫₀^𝜏 Y(s) dW(s)

(not in notes)

98
Q

Theorem 3.5.3

A

Let Y ∈ H_T. Then, for all δ, ε > 0 we have

P(sup_{t∈[0,T]} |∫₀ᵗ Y(s) dW(s)| ≥ ε)
≤ P(∫₀ᵀ |Y(s)|² ds ≥ δ)
+ (1/ε²) E[δ ∧ ∫₀ᵀ |Y(s)|² ds].

From this, one can prove Davis’s inequality.

99
Q

Lemma 3.5.4

A


For Y ∈ H_T we have

E[sup_{t∈[0,T]} |∫₀ᵗ Y(s) dW(s)|]
≤ 3E[(∫₀ᵀ |Y(s)|² ds)^{1/2}].

100
Q

Exercise 3.5.5. Use Theorem 3.5.3 in order to prove Lemma 3.5.4.
Hint: Use that E[X] = ∫₀^∞ P(X > λ) dλ, for X ≥ 0.

A
102
Q

Definition 3.5.7.
local martingale
localising sequence

A

A process X : Ω × [0, T] → R is a local martingale if there exists a
sequence of stopping times (τ_n)ₙ₌₁^∞ satisfying
(i) for any n ∈ N, P(τ_n ≤ τ_{n+1}) = 1;
(ii) for all ω ∈ Ω, τ_n(ω) = T for all n large enough;
(iii) for each n ∈ N, the process (X(t ∧ τ_n))_{t∈[0,T]} is a martingale.
Such a sequence of stopping times is called a localising sequence for (X(t))_{t∈[0,T]}.

For Y ∈ S_T it is easy to see that the sequence of stopping times defined in (3.10) is a
localising sequence for (I(Y, t))_{t∈[0,T]}.

103
Q

Exercise 3.5.8. Let Yₙ, Y ∈ S_T be such that
∫₀ᵀ |Yₙ(s) − Y(s)|² ds → 0,
as n → ∞, in probability. Show that the stochastic integrals of Yₙ converge to the stochastic
integral of Y, uniformly in time, in probability. That is, show that

sup_{t∈[0,T]} |∫₀ᵗ Yₙ(s) dW(s) − ∫₀ᵗ Y(s) dW(s)| → 0

as n → ∞, in probability.

A

only briefly discussed
We want to show that for any ε > 0, we have

lim_{n→∞}
P(sup_{t≤T} |∫₀ᵗ Yₙ(s) dW(s) − ∫₀ᵗ Y(s) dW(s)| ≥ ε) = 0.

For this it suffices to show that for any ε > 0 and for any δ > 0,

limsup_{n→∞}
P(sup_{t≤T} |∫₀ᵗ Yₙ(s) dW(s) − ∫₀ᵗ Y(s) dW(s)| ≥ ε) ≤ δ.

the proof carries on by applying the earlier estimate…. see notes

104
Q

recap:class H^STEP_T
properties
H_T

A

THM 3.2.15 (properties, proved on Hₜˢᵗᵉᵖ, contained within H_T)

H_T: all square integrable adapted processes.

Y ∈ Hₜˢᵗᵉᵖ has the property that
∫₀ᵗ Y(s) dW(s) is, for each t, a random variable
with
1) E[∫₀ᵗ Y(s) dW(s)] = 0

2) E[(∫₀ᵗ Y(s) dW(s))²] = E[∫₀ᵗ (Y(s))² ds] < ∞ (finite by defn for Y in H_T)

also it’s linear: the integral of a sum of elements of H_T can be split up as a sum of integrals;
3) the process (∫₀ᵗ Y(s) dW(s))_{t∈[0,T]} is a martingale.

We also extended this to a class S_T of adapted processes s.t. the RV ∫₀ᵀ Y(s)² ds is finite with probability one; there the integral is a local martingale.

Setting m(t) = ∫₀ᵗ Y(s) dW(s):

105
Q

y in H_T

continuity property

A

E[sup_{t≤T} |∫₀ᵗ Y(s) dW(s)|²]
≤ 4E[∫₀ᵀ Y(s)² ds]

(if Y weren’t in H_T the right-hand side would be infinite).
This implies that if Yⁿ converges to Y in the sense that E[∫₀ᵀ (Yⁿ(s) − Y(s))² ds] converges to 0 as n tends to infinity,
THEN
the stochastic integrals converge:
∫₀ᵗ Yⁿ(s) dW(s) converges to ∫₀ᵗ Y(s) dW(s),
in the sense that
E[sup_{t≤T} |∫₀ᵗ Y(s) dW(s) − ∫₀ᵗ Yⁿ(s) dW(s)|²] converges to 0 as n tends to infinity

106
Q

if Y in S_T:
P(sup_{t≤T} |∫₀ᵗ Y(s) dW(s)| ≥ ε) ≤ ?

and: if Yⁿ converges to Y in the sense that ∫₀ᵀ (Yⁿ(s) − Y(s))² ds converges to 0 in probability as n tends to infinity, then?

A

if Y in S_T:
P(sup_{t≤T} |∫₀ᵗ Y(s) dW(s)| ≥ ε) ≤ P(∫₀ᵀ (Y(s))² ds ≥ δ) + δ/ε²,

which relates the stochastic integral and the integral of the square.

And so, if
Yⁿ converges to Y in the sense that ∫₀ᵀ (Yⁿ(s) − Y(s))² ds converges to 0 in probability as n tends to infinity, then

∫₀ᵗ Yⁿ(s) dW(s) converges to ∫₀ᵗ Y(s) dW(s)

in the sense that sup_{t≤T} |∫₀ᵗ Yⁿ(s) dW(s) − ∫₀ᵗ Y(s) dW(s)| tends to 0 in probability

107
Q

recap
Y in H_T
martingale?

A

If Y in H_T
then the process
m(t)= ∫₀ᵗ Y(s).dW(s)
is an F martingale

Y in S_T (m(t))_{t in [0,T]} is not a martingale in general

If E[( ∫₀T (Y(s))^2.ds)^0.5]< infinity
(finite) then (m(t))_{t in [0,T]} is a martingale

In general, for Y in S_T, (m(t))_{t in [0,T]} is a local martingale: there exists a localising sequence of stopping times.
We used
Yⁿ(t) = 1_[0,𝜏_n](t) Y(t),
𝜏_n = inf{t ≥ 0 : ∫₀ᵗ (Y(s))² ds > n},
the first time the integral process crosses the level n,
which gives Yⁿ in H_T;
also Yⁿ(t) = Y(t) for t in [0,𝜏_n],
so
m(t ∧ 𝜏_n) = ∫₀^{t∧𝜏_n} Y(s) dW(s)
= ∫₀^{t∧𝜏_n} Yⁿ(s) dW(s),
but as this is the stochastic integral of something that belongs to H_T, it is a martingale for each n;

thus m(t)
is a local martingale

108
Q

Y in H_T vs Y in S

A

If Y is in S_T ONLY, NOT IN H_T: the martingale property is not true in general.

If Y in H_T: we know that the expectation of the absolute value is finite,

E[|∫₀ᵗ Y(s) dW(s)|] < ∞,

so given Y in H_T we can use this integrability condition

109
Q

two differentiable functions f and g, then the following chain rule

A

d/dt (g(f(t)))
=
g’(f(t)) f’(t)

IF
f has the differential form
f(t) = f(0) + ∫₀ᵗ a(s) ds,

then
g(f(t)) = g(f(0)) + ∫₀ᵗ g’(f(s)) a(s) ds,
written without the derivative of f appearing explicitly;

in differential notation, dg(f(t)) = g’(f(t)) a(t) dt

110
Q

g(t) = g(0) + ∫_[0,t] a(s) ds

A

iff
g’(t) =a(t)

notation dg(t) =a(t).dt
dg/dt=a

111
Q

Itô’s process

A

probability space
(Ω,F,P),
filtration F = {F_t}_{t in [0,T]},
BM w.r.t. this filtration (F-Wiener process)
(W(t))_{t in [0,T]}.

A stochastic process (X(t))_{t in [0,T]} is an Itô process if there exist processes
(A(t))_{t in [0,T]} in D_T
and
(B(t))_{t in [0,T]} in S_T (so the stochastic integral is well defined)
s.t.

X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s)

when B ≡ 0 the trajectories are differentiable w.r.t. t

112
Q

when we have an Ito process of this form

ito’s differential

A

X has an Itô differential, or stochastic differential,

dX(t) = A(t) dt + B(t) dW(t):

just notation, shorthand for the integral equation

113
Q

Theorem 3.6.1 (Itô’s formula in d = 1 (notes compare))

summarised next lecture too!!

COMPOSITION
OF ITO PROCESS AND
function f

A

Let (X(t))_{t in [0,T]} be an Itô process,
A in D_T,
B in S_T,

X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s).

Let f be in C² (twice continuously differentiable).

Then the composition (f(X(t)))_{t in [0,T]} is also an Itô process, which satisfies

f(X(t)) = f(X(0)) + ∫₀ᵗ f’(X(s))A(s) ds + ∫₀ᵗ f’(X(s))B(s) dW(s) + 0.5 ∫₀ᵗ f’’(X(s))(B(s))² ds.

The last term is the Itô correction term.

We can group terms together and write this in terms of stochastic differentials.

(also: if B ≡ 0 this is the usual chain rule; a numerical check follows below)
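A numerical check of Itô's formula (my own sketch, not from the notes): take X = W (so A ≡ 0, B ≡ 1) and f(x) = x². The formula gives W(T)² = 2∫₀ᵀ W(s) dW(s) + T, and the left-endpoint (Itô) Riemann sums confirm it:

```python
import random, math

rng = random.Random(5)
T, n = 1.0, 100_000
dt = T / n

w, ito_integral = 0.0, 0.0
for _ in range(n):
    dw = rng.gauss(0.0, math.sqrt(dt))
    ito_integral += 2 * w * dw          # left endpoint: the Itô convention
    w += dw

# Itô's formula with f(x) = x², A = 0, B = 1:  W(T)² = 2 ∫₀ᵀ W dW + T
print(w * w, ito_integral + T)          # the two sides agree up to discretisation error
```

Note the extra "+T": the Itô correction term that the classical chain rule would miss.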

114
Q

Theorem 3.6.1 (Itô’s formula in d = 1): a CONVENIENT WAY TO WRITE THE EQUALITIES in terms of stochastic differentials

A

The integral equation
X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s)
can be written as:

dX = A(t) dt + B(t) dW(t);

then
f(X(t)) = f(X(0)) + ∫₀ᵗ f’(X(s))A(s) ds + ∫₀ᵗ f’(X(s))B(s) dW(s) + 0.5 ∫₀ᵗ f’’(X(s))(B(s))² ds

becomes
du(X(t)) = [u’(X(t))A(t) + 0.5u’’(X(t))(B(t))²] dt + u’(X(t))B(t) dW(t);

grouping, we have a dt term and a dW term, showing it is an Itô process

115
Q

stochastic integrals for integrals in
H_T
S_T
are martingales?

A

H_T martingales
S_T local martingales

116
Q

Summary: is the class S_T contained in H_T, or the other way around?

A

S_T contains the stochastic processes that are F-adapted
with the property that

P(∫₀ᵀ X(s)² ds < ∞) = 1, i.e. the integral is finite with probability 1.

This means H_T is contained in S_T, as processes in H_T
are F-adapted and have E[∫₀ᵀ X(s)² ds] < ∞,

because if the probability that the integral is infinite weren’t 0, then the expectation would be infinite

117
Q

summary ito process
The process X is an ito process if

A

there exists a process A in class D_T
and a process B in class S_T s.t.

X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s);

when B ≡ 0 the trajectories are differentiable w.r.t. t.

DETERMINISTIC INTEGRAL + STOCHASTIC INTEGRAL

118
Q

A way to remember the formula for an Itô process:
f(X(t)) = f(X(0)) + ∫₀ᵗ f’(X(s))A(s) ds + ∫₀ᵗ f’(X(s))B(s) dW(s) + 0.5 ∫₀ᵗ f’’(X(s))(B(s))² ds,

considering…

an Itô process
dX = A(t) dt + B(t) dW(t),
with u in C²;
then
du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))²

A

consider

X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s),
dX = A(t) dt + B(t) dW(t)

write the table:
(dt)² = 0
dW(t)·dt = 0
(dW(t))² = dt
and substitute these whenever we see them.

YOU ONLY NEED TO REMEMBER THIS:
du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))²

links to Taylor expansion:
first deriv · dX + 0.5 · second deriv · (dX)²

119
Q

I will go through some exercises…..

A

these will be VERY VERY important for the exam

please also check the coursework and FEEDBACK and check the solutions, this is important for the exam too <3

already I gave you three exercises, cannot complain

120
Q

REMEMBERING
= u’(X(t))A(t) dt + u’(X(t))B(t) dW(t) + 0.5u’’(X(t))(B(t))² dt

write the table:
(dt)² = 0
dW(t)·dt = 0
(dW(t))² = dt
and substitute these whenever we see them.

YOU ONLY NEED TO REMEMBER THIS:
du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))²

links to Taylor expansion:
first deriv · dX + 0.5 · second deriv · (dX)²

A

recovering the formula:
du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))²
= u’(X(t))(A(t) dt + B(t) dW(t)) + 0.5u’’(X(t))(A(t) dt + B(t) dW(t))²

= u’(X(t))(A(t) dt + B(t) dW(t)) + 0.5u’’(X(t))((A(t) dt)² + (B(t) dW(t))² + 2A(t)B(t) dt dW(t))

using the table:
= u’(X(t))A(t) dt + u’(X(t))B(t) dW(t) + 0.5u’’(X(t))(B(t))² dt,
as we needed

121
Q

REMEMBERING
Suppose two continuously differentiable functions f and g then
fg product

in differentiable form
integral form

PRODUCT RULE

A

the product is also differentiable;
differential form:
(fg)’ = f’g + g’f

integral form:
f(t)g(t) = f(0)g(0) + ∫₀ᵗ f’(s)g(s) ds + ∫₀ᵗ f(s)g’(s) ds

(integration by parts / the product rule)

122
Q

now consider ito processes:
Suppose (X(t))(t in [0,T]) and (Y(t))(t in [0,T]) are ito processes
Then the product X(t)Y(t)

ITO PRODUCT RULE

A

is an Itô process, and we have the equality

d(X(t)Y(t)) = X(t)dY(t) + Y(t)dX(t) + dX(t)dY(t).

It is convenient to write this in terms of differentials:
the differential for X is
dX(t) = Aₓ(t) dt + Bₓ(t) dW(t),
the differential for Y is
dY(t) = Aᵧ(t) dt + Bᵧ(t) dW(t);
replacing,
d(X(t)Y(t)) = X(t)(Aᵧ(t) dt + Bᵧ(t) dW(t)) + Y(t)(Aₓ(t) dt + Bₓ(t) dW(t)) + (Aₓ(t) dt + Bₓ(t) dW(t))(Aᵧ(t) dt + Bᵧ(t) dW(t)).

Using the table
(dt)² = 0,
dW(t)·dt = 0,
(dW(t))² = dt:
= (Bₓ(t)Bᵧ(t) + X(t)Aᵧ(t) + Y(t)Aₓ(t)) dt + (X(t)Bᵧ(t) + Y(t)Bₓ(t)) dW(t);
without the BₓBᵧ term, this is classical integration by parts

123
Q

IMPORTANT EXERCISE!!!
PART 1
3.7.06
Let W be a one dimensional F-Wiener processes and let
σ > 0. Set
Z(t) := exp (σW(t) −0.5σ²t)

  1. Show that Z_t satisfies the equation
    dZ(t) = σZₜ dW(t),
    Z_0 = 1.
  2. Conclude that (Z(t))_{t∈[0,T]} is a martingale.

Hint: Recall that if X ∼ N (0, 1) then E [exp(aX)] = exp(0.5a²)

This is an eg of a stochastic differential eq

A

1) This equality means that I want to show that Z is an Itô process and has the given stochastic differential.

Recall that an Itô process has the form
dX(t) = A(t) dt + B(t) dW(t);

dZ(t) = σZ(t) dW(t) means A = 0, B = σZ(t).

Consider the process X(t) = σW(t) − 0.5σ²t
(the argument of the exp).
Is X an Itô process?
Can I write it in the form above, or equivalently in the form
X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s)?

= 0 + ∫₀ᵗ (−0.5σ²) ds + ∫₀ᵗ σ dW(s) (YESSS),
so X is an Itô process,

written as:
dX(t) = (−0.5σ²) dt + σ dW(t).

Then using Itô’s formula with u(x) = exp(x), so that
Z(t) = u(X(t)),
we get
dZ = du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))².
Substituting,
dZ = u’(X(t))((−0.5σ²) dt + σ dW(t)) + 0.5u’’(X(t))((−0.5σ²) dt + σ dW(t))²;
from the table, (dX(t))² = σ² dt, so
= −0.5σ²u’(X(t)) dt + σu’(X(t)) dW(t) + 0.5σ²u’’(X(t)) dt;
as u’’ = u’ = u for the exponential, the dt terms cancel:
= σu(X(t)) dW(t)
= σZ(t) dW(t).

We also want to check
the IC Z(0) = 1:
Z(0) = exp(σW(0) − 0.5σ²·0) = exp(0) = 1, as required.
(a Monte Carlo check follows below)
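A Monte Carlo check of the martingale claim (my own sketch; σ = 1, T = 1 are arbitrary choices): sampling W(T) ~ N(0,T) directly, E[Z(T)] should equal Z(0) = 1 if (Z(t)) is a martingale:

```python
import random, math

rng = random.Random(6)
sigma, T, trials = 1.0, 1.0, 500_000

total = 0.0
for _ in range(trials):
    wT = rng.gauss(0.0, math.sqrt(T))                  # W(T) ~ N(0, T)
    total += math.exp(sigma * wT - 0.5 * sigma**2 * T)

print(total / trials)   # ≈ 1 = Z(0), consistent with (Z(t)) being a martingale
```

This is the same fact as the hint: E[exp(aX)] = exp(0.5a²) for X ~ N(0,1), with a = σ√T.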

124
Q

IMPORTANT EXERCISE!!!
PART 2
3.7.06
Let W be a one dimensional F-Wiener processes and let
σ > 0. Set
Z(t) := exp (σW(t) −0.5σ²t)

  1. Show that Z_t satisfies the equation
    dZ(t) = σZₜ dW(t),
    Z_0 = 1.
  2. Conclude that (Z(t))_{t∈[0,T]} is a martingale.

Hint: Recall that if X ∼ N (0, 1) then E [exp(aX)] = exp(0.5a²)

This is an eg of a stochastic differential eq

A

2)not gone through?
seems standard enough look through notes

next one similar? important next one

125
Q

what helps to remember Itô’s formula

A

dZ = du(X(t)) = u’(X(t)) dX(t) + 0.5u’’(X(t)) (dX(t))²

replacing from the table

126
Q

IMPORTANT EXERCISE

Exercise 3.7.7.
Show that Z(t) = exp(t/2)cos(W(t)) is a martingale.

A

firstly Z(0) = 1.

It is sufficient to show Z is an Itô process with no dt part, and that the integrand of the dW(t) part is in H_T, to conclude it is an F-martingale.
We use Itô’s formula:
in eᵗ/²cos(W(t)), the cos part is a smooth function of Brownian motion and thus expected to be an Itô process, and eᵗ/² is an Itô process, so the product will also be one.
ITO PROCESSES ARE THOSE WHICH ARE
DETERMINISTIC INTEGRAL + STOCHASTIC INTEGRAL.

We also know that the stochastic integral is a martingale if its integrand is in the class H_T.

dX(t) = A(t) dt + B(t) dW(t),
X(t) = X(0) + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s).
STEP: show these are Itô processes:
Y(t) = eᵗ/²,
X(t) = cos(W(t)).

*dY(t) = 0.5eᵗ/² dt

(we use Itô’s formula for dX:
u(x) = cos x,
X(t) = u(W(t)),
du(W(t)) = u’(W(t)) dW(t) + 0.5u’’(W(t)) (dW(t))²)

*dX(t) = −sin(W(t)) dW(t) − 0.5cos(W(t)) (dW(t))²
= −sin(W(t)) dW(t) − 0.5cos(W(t)) dt

STEP: with the differentials of X and Y we find, using Itô’s product rule,
d(XY) = X(t)dY(t) + Y(t)dX(t) + dX(t)dY(t)
= 0.5cos(W(t))eᵗ/² dt + eᵗ/²[−sin(W(t)) dW(t) − 0.5cos(W(t)) dt] + 0.5eᵗ/² dt·[−sin(W(t)) dW(t) − 0.5cos(W(t)) dt]

= 0.5cos(W(t))eᵗ/² dt − eᵗ/²sin(W(t)) dW(t) − 0.5eᵗ/²cos(W(t)) dt + 0,
using the table,
= −eᵗ/²sin(W(t)) dW(t),

so dZ(t) = −eᵗ/²sin(W(t)) dW(t),

so Z(t) = 1 − ∫₀ᵗ eˢ/²sin(W(s)) dW(s):
we have shown Z(t) = constant + stochastic integral.

We know the stochastic integral: if the integrand is in H_T, it is a martingale.

STEP: show this.
1) The process (eᵗ/²sin(W(t)))_{t in [0,T]} is F-adapted, i.e. F_t-measurable for all t:
by defn W(t) is F_t-measurable for all t in [0,T], since W is adapted,
and composing with a continuous function preserves measurability:
for fixed t, x ↦ −eᵗ/²sin(x) is continuous R → R, and therefore −eᵗ/²sin(W(t)) is F_t-measurable (Borel-measurable composed with measurable).

2) Check the expectation is finite:
E[∫₀ᵀ (eˢ/²sin(W(s)))² ds] = E[∫₀ᵀ eˢ sin²(W(s)) ds]
≤ E[∫₀ᵀ e^T · 1 ds]
(a deterministic integral)
= e^T · T < ∞.

So (−eᵗ/²sin(W(t)))_{t in [0,T]} is in H_T,
so −∫₀ᵗ eˢ/²sin(W(s)) dW(s) is an F-martingale,
so Z(t) = 1 − ∫₀ᵗ eˢ/²sin(W(s)) dW(s) is also an F-martingale (adding a constant preserves this).
THIS WAS A SPECIFIC EXERCISE know very well!!
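(Not from the notes: the same kind of Monte Carlo sanity check for this exercise, numpy assumed and parameters arbitrary; E[e^{t/2}cos(W(t))] should stay at Z(0) = 1.)

import numpy as np

rng = np.random.default_rng(2)
T, n_steps, n_paths = 1.0, 100, 100_000
dt = T / n_steps
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), (n_paths, n_steps)), axis=1)
t = dt * np.arange(1, n_steps + 1)
Z = np.exp(t / 2) * np.cos(W)                 # Z(t) = e^{t/2} cos(W(t))
print(Z[:, 0].mean(), Z[:, n_steps // 2].mean(), Z[:, -1].mean())  # all ~1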

127
Q

I will discuss higher dims ito formula

A

The higher-dimensional Itô formula:
you should DEFINITELY know Itô's formula itself,
the product rule,
and how to use them,
as in the previous exercise.

The higher-dimensional version will not be asked for in the exam, but it is worth knowing: it is easy to remember, and it makes it easier to derive the product rule.

128
Q

Exercise 3.7.8. Let W be a d-dimensional F-Wiener process and let a₁, …, a_d ∈ R be fixed. Find c ∈ R such that the process

Z(t) = exp(ct + Σ_{i=1}^d aᵢWᵢ(t))

is a martingale.

A

skipped here; worked through in full in a later card

129
Q

up to now, 1D: Itô's formula

A

dX(t) = A(t)dt + B(t)dW(t)

with W(t): Ω → R
and X(t): Ω → R

130
Q

multi-dimensional Itô formula

d₀-dimensional Wiener process:

A

We now want to consider stochastic processes
W(t): Ω → R^{d₀}
X(t): Ω → R^d

A d₀-dimensional Wiener process is
W(t) = (W₁(t), W₂(t), …, W_{d₀}(t))ᵀ
where each component Wᵢ(t) is an R-valued F-Wiener process,
and W₁, …, W_{d₀} are independent.
Stacking independent Brownian motions in a vector gives, at each time, a d₀-dimensional random variable; as time evolves, this is a stochastic process with values in R^{d₀}.

131
Q

d-dimensional Itô process

A

We now want to consider stochastic processes
W(t): Ω → R^{d₀}
X(t): Ω → R^d

A d-dimensional Itô process is a process X(t) = (X₁(t), …, X_d(t))ᵀ
(in general d₀ ≠ d), driven by W = (W₁, …, W_{d₀}),
such that each component is adapted, and there exist
Aᵢ ∈ D_T
Bᵢⱼ ∈ S_T, for i = 1, …, d, j = 1, …, d₀,
s.t. each component has the form
Xᵢ(t) = Xᵢ(0) + ∫₀ᵗ Aᵢ(s) ds + Σ_{j=1}^{d₀} ∫₀ᵗ Bᵢⱼ(s) dWⱼ(s)
for all i = 1, …, d.
For fixed i this resembles a 1D Itô process, but with d₀ Brownian motions:
the system might have multiple sources of noise,
e.g. a derivative in finance depending on multiple stocks.

132
Q

vector notation of a d-dimensional Itô process

Process X(t) = (X₁(t), …, X_d(t))ᵀ
(in general d₀ ≠ d),

each component adapted, and there exist
Aᵢ ∈ D_T
Bᵢⱼ ∈ S_T, for i = 1, …, d, j = 1, …, d₀,
s.t.
Xᵢ(t) = Xᵢ(0) + ∫₀ᵗ Aᵢ(s) ds + Σ_{j=1}^{d₀} ∫₀ᵗ Bᵢⱼ(s) dWⱼ(s)
for all i = 1, …, d.

A

Written out in components:

(X₁(t))   (X₁(0))        (A₁(s))          (B₁₁(s)  …  B₁,d₀(s))   (dW₁(s))
(  ⋮  ) = (  ⋮  ) + ∫₀ᵗ (  ⋮  ) ds + ∫₀ᵗ (  ⋮           ⋮    ) · (  ⋮   )
(X_d(t))  (X_d(0))       (A_d(s))         (B_d,1(s) … B_d,d₀(s))  (dW_d₀(s))

vectors and matrices, or compactly
X(t) = x + ∫₀ᵗ A(s) ds + ∫₀ᵗ B(s) dW(s)

the noise takes values in R^{d₀},

the output X takes values in R^d
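(Not from the notes: a minimal Euler–Maruyama sketch of a d-dimensional Itô process driven by d₀ independent Brownian motions; for illustration A and B are hypothetical constants, though in general they are adapted processes.)

import numpy as np

rng = np.random.default_rng(3)
d, d0 = 2, 3                          # process dimension d, noise dimension d0
T, n_steps = 1.0, 1000
dt = T / n_steps
A = np.array([0.1, -0.2])             # drift vector A(s) in R^d (constant here)
B = np.array([[1.0, 0.5, 0.0],        # diffusion matrix B(s) in R^{d x d0}
              [0.0, 0.3, 2.0]])
X = np.zeros(d)                       # X(0) = 0
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), d0)   # increment of the d0-dim Wiener process
    X = X + A * dt + B @ dW                 # X(t+dt) = X(t) + A dt + B dW
print(X)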

133
Q

n-dimensional Itô formula

A

Let X be a d-dimensional Itô process:
Aᵢ ∈ D_T
Bᵢⱼ ∈ S_T, for i = 1, …, d, j = 1, …, d₀,
s.t.
Xᵢ(t) = Xᵢ(0) + ∫₀ᵗ Aᵢ(s) ds + Σ_{j=1}^{d₀} ∫₀ᵗ Bᵢⱼ(s) dWⱼ(s)
for all i = 1, …, d.

Then, for a (smooth enough) function u: R^d → R,

du(X(t)) = Σ_{i=1}^d u_{xᵢ}(X(t)) dXᵢ(t) + 0.5 Σ_{i,j=1}^d u_{xᵢxⱼ}(X(t)) dXᵢ(t)dXⱼ(t)

where u_{xᵢ} is the partial derivative of u w.r.t. the i-th component xᵢ.

(if d = 1 there is only one partial derivative, as expected)

This is the non-rigorous version, the one you SHOULD REMEMBER, where the products dXᵢ(t)dXⱼ(t) are expanded using the multiplication table (next card).

134
Q

given an n-dimensional Itô process, if we take a function
f: R^d → R
then
f(X(t)) is also an Itô process

A

df(X(t)) = Σ_{i=1}^d ∂_{xᵢ}f(X(t)) dXᵢ(t) + 0.5 Σ_{i,j=1}^d ∂²_{xᵢxⱼ}f(X(t)) dXᵢ(t)dXⱼ(t)
(partial derivatives)

which helps to remember the multi-dimensional formula.
ALSO, the multiplication table:
dt·dWⱼ(t) = (dt)² = 0, and dWᵢ(t)dWⱼ(t) = 0 for
i ≠ j;

if i = j:
(dWᵢ(t))² = dt

I will not ask you in the exam for the multidimensional formula but there is an exercise we will look at..
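(Not from the notes: a minimal numpy check of the multiplication table; summing the products of increments over [0, T] approximates the corresponding "differential" accumulated over [0, T].)

import numpy as np

rng = np.random.default_rng(4)
T, n = 1.0, 1_000_000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), (2, n))   # increments of two independent BMs
print(np.sum(dW[0] * dW[0]))   # ~T:  (dW_i)^2 behaves like dt, so the sum ~ T
print(np.sum(dW[0] * dW[1]))   # ~0:  dW_i dW_j with i != j contributes nothing
print(np.sum(dW[0] * dt))      # ~0:  dt dW_j is negligible
print(n * dt**2)               # ~0:  (dt)^2 summed n times is T*dt -> 0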

135
Q

EXERCISE 3.7.8
Let W be a d-dimensional F-Wiener process and let
a₁, …, a_d ∈ R be fixed.

Find c ∈ R such that the process
Z(t) = exp(ct +Σ_{i=1 to d} aᵢWᵢ(t))

is a martingale.

A

(We saw previously that Itô processes with a purely stochastic part and no deterministic part are martingales, because stochastic integrals are martingales.)

(First look: the exponent has a dt term and dW terms, so Z looks like a function of an Itô process; here the process dimension is 1, driven by d Brownian motions.)

Step 1:
take u(x) = exp(x) (the lecturer used f(x) = exp(x); all derivatives of f equal f, being the exponential)
X(t) = ct + Σ_{j=1}^d aⱼWⱼ(t)
dX(t) = c dt + Σ_{j=1}^d aⱼ dWⱼ(t)
This is exactly the n-dimensional Itô-process form with a single component (i = 1):
A₁ = c, B₁ⱼ = aⱼ for j = 1, …, d, and
X(t) = X(0) + ∫₀ᵗ c ds + Σ_{j=1}^d ∫₀ᵗ aⱼ dWⱼ(s)

Z(t) = exp(ct + Σ_{j=1}^d aⱼWⱼ(t)) = f(X(t))
Thus
dZ(t) = df(X(t))
Then by Itô's formula
= f′(X(t))dX(t) + 0.5f″(X(t))(dX(t))²
(as the derivatives of f at X(t) all equal f(X(t)) = Z(t))
= Z(t)dX(t) + 0.5Z(t)(dX(t))²

subbing in dX(t)
= Z(t)(c dt + Σⱼ₌₁ᵈ aⱼ dWⱼ(t)) + 0.5Z(t)(c dt + Σⱼ₌₁ᵈ aⱼ dWⱼ(t))²
using the table values, BUT WITH THE MULTI-DIMENSIONAL RULE: only the i = j products survive,
= Z(t)(c dt + Σⱼ₌₁ᵈ aⱼ dWⱼ(t)) + 0.5Z(t)Σᵢ₌₁ᵈ aᵢ² dt
= [Z(t)(c + 0.5Σᵢ₌₁ᵈ aᵢ²)] dt + Z(t)Σⱼ₌₁ᵈ aⱼ dWⱼ(t)
So choose c to make the deterministic part 0:
c + 0.5Σᵢ₌₁ᵈ aᵢ² = 0
c = -0.5Σᵢ₌₁ᵈ aᵢ²

(To show the sum of stochastic integrals is a martingale, compare the previous exercise.)
Then:
dZ(t) = Z(t)Σⱼ₌₁ᵈ aⱼ dWⱼ(t)

If aⱼZ(t) is in the class H_T then each integral
∫₀ᵗ aⱼZ(s) dWⱼ(s)
is a martingale,
thus the sum
Σⱼ₌₁ᵈ ∫₀ᵗ aⱼZ(s) dWⱼ(s)
is a martingale.
THUS
it suffices to show (Z(t))_{t∈[0,T]} ∈ H_T.
For this we first check it is adapted:
Z(t) = exp(ct + Σ_{i=1}^d aᵢWᵢ(t))
is F_t-measurable,
since for fixed t each Wⱼ(t) is F_t-measurable,
and applying a continuous function (sum, add a constant, exponentiate) preserves F_t-measurability.

2) Need to check also that
E[∫₀ᵀ Z(s)² ds] < ∞:
E[∫₀ᵀ (exp(cs + Σ_{i=1}^d aᵢWᵢ(s)))² ds]
= E[∫₀ᵀ exp(2cs + 2Σ_{i=1}^d aᵢWᵢ(s)) ds]
by properties of the exponential
= E[∫₀ᵀ exp(2cs) Π_{i=1}^d exp(2aᵢWᵢ(s)) ds]
≤ exp(2|c|T) ∫₀ᵀ E[Π_{i=1}^d exp(2aᵢWᵢ(s))] ds
by independence of the Wᵢ
= exp(2|c|T) ∫₀ᵀ Π_{i=1}^d E[exp(2aᵢWᵢ(s))] ds
= exp(2|c|T) ∫₀ᵀ Π_{i=1}^d E[exp(2aᵢ√s · Wᵢ(s)/√s)] ds
we know Wᵢ(s)/√s has distribution N(0,1), so using E[exp(aZ)] = exp(a²/2):
E[exp(2aᵢ√s · Wᵢ(s)/√s)] = exp((2aᵢ√s)²/2) = exp(2aᵢ²s)
= exp(2|c|T) ∫₀ᵀ exp(2s Σᵢ₌₁ᵈ aᵢ²) ds
≤ exp(2|c|T) exp(2T Σᵢ₌₁ᵈ aᵢ²) T < ∞

Thus if we choose c as above,
dZ(t) = Z(t)Σⱼ₌₁ᵈ aⱼ dWⱼ(t)
with Z(t)aⱼ ∈ H_T,
and thus Z is a martingale.

In the case of level 3:
the multi-dimensional case is not in the exam.
But if I only delivered what will be asked in the exam, I would have finished the module earlier!
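(Not from the notes: a minimal Monte Carlo check, numpy assumed and the aᵢ hypothetical, that with c = -0.5Σaᵢ² the terminal expectation equals Z(0) = 1; only the terminal values are needed, since Wᵢ(T) ~ N(0, T).)

import numpy as np

rng = np.random.default_rng(5)
a = np.array([0.5, -1.0, 0.3])            # fixed a_1, ..., a_d (here d = 3)
c = -0.5 * np.sum(a**2)                   # the c found above
T, n_paths = 1.0, 200_000
W_T = rng.normal(0.0, np.sqrt(T), (n_paths, a.size))  # d independent BMs at time T
Z_T = np.exp(c * T + W_T @ a)             # Z(T) = exp(cT + sum_i a_i W_i(T))
print(Z_T.mean())                         # ~1.0 = Z(0), consistent with a martingale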

136
Q

YOU SHOULD KNOW THIS:
if Z ~ N(0,1), what is E[exp(aZ)]?

if W_t ~ N(0,t), what is the distribution of W_t/√t?

A

E[exp(aZ)] = exp(a²/2)

W_t/√t ~ N(0,1)

137
Q

EXERCISE gone through
Exercise 3.7.9. Let u : [0, ∞) × R → R be a smooth bounded function satisfying the backward
heat equation
uₜ + 0.5uₓₓ = 0.
Assume that there exists a constant C > 0 such that |uₓ(t, y)| ≤ C for all (t, y) ∈ [0, ∞)×R. Show
that if W is a one dimensional Wiener process, then for all t ≥ 0, we have
E[u(t, W(t))] = u(0, 0).

A

(uₜ is the first derivative w.r.t. time, uₓₓ the second w.r.t. space)

We use Itô's formula (the time-dependent version):
du(t,W(t)) =
∂ₜu(t,W(t))dt + ∂ₓu(t,W(t))dW(t) + 0.5∂²ₓₓu(t,W(t))dt
=
[∂ₜu(t,W(t)) + 0.5∂²ₓₓu(t,W(t))]dt + ∂ₓu(t,W(t))dW(t)
the first bracket is 0 by the partial differential equation,

so du(t,W(t))
= ∂ₓu(t,W(t))dW(t)
which in integral form means

u(t,W(t)) = u(0,W(0)) + ∫₀ᵗ ∂ₓu(s,W(s))dW(s)

because W is a Brownian motion, W(0) = 0:
u(t,W(t)) = u(0,0) + ∫₀ᵗ ∂ₓu(s,W(s))dW(s)

taking expectations gives the required
E[u(t,W(t))] = u(0,0),
as the first term is deterministic,
and the expectation of a stochastic integral is 0 PROVIDED the integrand is in the class H_T:

1) W(t) is F_t-measurable; u is smooth, so ∂ₓu is continuous in x and hence Borel in x; thus the composition ∂ₓu(t,W(t)) is F_t-measurable.

2) E[∫₀ᵀ (∂ₓu(t,W(t)))² dt]
≤ E[∫₀ᵀ C² dt] = TC² < ∞
by the assumption that |uₓ| is bounded by C.
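(Not from the notes: a quick numpy check using the concrete solution u(t,x) = e^{t/2}cos(x) of the backward heat equation, which ties back to Exercise 3.7.7; its uₓ is bounded on [0,T]×R, and E[u(t,W(t))] should equal u(0,0) = 1.)

import numpy as np

rng = np.random.default_rng(6)
u = lambda t, x: np.exp(t / 2) * np.cos(x)   # satisfies u_t + 0.5 u_xx = 0
t, n_paths = 1.0, 200_000
W_t = rng.normal(0.0, np.sqrt(t), n_paths)   # W(t) ~ N(0, t), sampled directly
print(u(t, W_t).mean(), u(0.0, 0.0))         # both ~1.0: E[u(t, W(t))] = u(0, 0)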

138
Q

Note about prev exercise

A

In general there is a connection between PDEs and Itô processes:

apply Itô's formula and you get derivatives in space and time;
if the function satisfies a PDE involving exactly these terms, we can use the PDE to kill the dt part.

We will also see that we can represent solutions of PDEs using Itô processes,
via expectations:

instead of solving the PDE, approximate (simulate) the underlying process.

139
Q

In the case of level 3:
the multi-dimensional case is not in the exam.
But if I only delivered what will be asked in the exam, I would have finished the module earlier!

WHY MENTION HIGHER DIMENSIONS?

to handle f(t, X(t))

A

Suppose we have an Itô process dX(t) = A(t)dt + B(t)dW(t).
Itô's formula tells you how u(X(t)) evolves when the function depends only on x;
suppose the function also depends on time:
f(t, X(t)).

From the higher-dimensional point of view this is actually a two-dimensional setting, with components Y¹(t) = t and Y²(t) = X(t).

I have a function of two variables:
f(t, X(t)) = g(Y(t)),
g(y) := g(y₁, y₂) = f(y₁, y₂)

Then I know from the multi-dimensional formula, for the differentials,
df(t, X(t)) = dg(Y(t))
from Itô's formula using partial derivatives
= ∂_{y₁}g(Y(t))dY¹(t) + ∂_{y₂}g(Y(t))dY²(t) + 0.5∂²_{y₂y₂}g(Y(t))(dY²(t))²
the partial derivative of g w.r.t. y₁ is the partial derivative of f w.r.t. its first variable, here the time variable;
the second derivative of g w.r.t. its second variable is the second derivative of f w.r.t. the second variable, called x.

(of the second-order products, any term containing dY¹ = dt vanishes by the table, since dt·dt = dt·dX(t) = 0; only the (dY²)² case survives)

= ∂ₜf(t,X(t))dt + ∂ₓf(t,X(t))dX(t) + 0.5∂²ₓₓf(t,X(t))B²(t)dt

so
df(t,X(t)) =
[∂ₜf(t,X(t)) + ∂ₓf(t,X(t))A(t) + 0.5∂²ₓₓf(t,X(t))B²(t)]dt + ∂ₓf(t,X(t))B(t)dW(t)

if you don't have time dependence you get the same without the first term, and in that case the dt part is the usual Itô formula without t.

So time dependence gives one extra term: the partial derivative of f w.r.t. t.
USUAL ITÔ FORMULA + EXTRA TERM
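(Not from the notes: a pathwise numpy check of the time-dependent formula with the hypothetical test function f(t,x) = t·x² and X = W, i.e. A = 0, B = 1; then ∂ₜf = x², ∂ₓf = 2tx, 0.5∂²ₓₓf = t, so df = (W² + t)dt + 2tW dW.)

import numpy as np

rng = np.random.default_rng(7)
T, n = 1.0, 200_000
dt = T / n
dW = rng.normal(0.0, np.sqrt(dt), n)
W = np.concatenate(([0.0], np.cumsum(dW)))
t = dt * np.arange(n + 1)
lhs = T * W[-1] ** 2                          # f(T, W(T)) - f(0, W(0)) for f = t x^2
rhs = np.sum((W[:-1] ** 2 + t[:-1]) * dt) + np.sum(2 * t[:-1] * W[:-1] * dW)
print(lhs, rhs)                               # agree up to discretisation error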

140
Q

FROM LECTURES

Filtration (F_t)_{t∈[0,T]},
with F_t ⊆ F for all t;
we assume the full σ-algebra coincides with the filtration at the final time:
F = F_T.

Two probability measures P and Q on (Ω, F) are called equivalent if…

A

for all A ∈ F_T, P(A) = 0 if and only if Q(A) = 0

i.e. the two probability measures assign probability 0 to exactly the same events;

on events of non-zero probability the two measures may well differ.

(Thm 5.1.1)

141
Q

FROM LECTURES

Filtration (F_t)_{t∈[0,T]},
with F_t ⊆ F;
we assume
F = F_T (the σ-algebra coincides with the filtration at the last time T).

Suppose we have a stochastic process (B(t))_{t∈[0,T]} in S_T.

Define the stochastic process
Z(t) = exp(-∫₀ᵗ B(s) dW(s) - 0.5∫₀ᵗ B²(s) ds)

A

B in S_T means it is an admissible integrand for stochastic integrals, i.e. we can integrate it w.r.t. W;
Z(t) exists because of this fact.
Note Z(T) ≥ 0.

Since it is non-negative we can take its expectation; we don't know yet that it is finite, but for non-negative RVs the expectation is always defined.
(The question will be: w.r.t. which measure is a given process a Brownian motion?)

Define
P*(A) := E[1_A Z(T)],
with the expectation taken w.r.t. the measure P,
which defines a function

P*: F → [0, +∞].
Girsanov's theorem tells us the following…

Assume that (Z(t))_{t∈[0,T]} is a martingale w.r.t. the filtration (F_t)_{t∈[0,T]}.

142
Q

Girsanov's theorem tells us the following.

Setup: B ∈ S_T (so we can integrate it w.r.t. W), and
Z(t) = exp(-∫₀ᵗ B(s) dW(s) - 0.5∫₀ᵗ B²(s) ds), with Z(T) ≥ 0.

Define
P*(A) := E[1_A Z(T)]
(expectation w.r.t. the measure P), which defines a function
P*: F → [0, +∞].

Also define a new R^d-valued Itô process
W*(t) = ∫₀ᵗ B(s) ds + W(t), t ∈ [0, T].

What does Girsanov's theorem say?

A

Assuming (Z(t))_{t∈[0,T]} is a martingale w.r.t. the filtration (F_t)_{t∈[0,T]}, THEN:

(i) P* is a probability measure on (Ω, F) and it is equivalent to P.

(ii) The process (W*(t))_{t∈[0,T]} defined above is an F-Wiener process on (Ω, F, P*).

143
Q

meaning behind Girsanov’s theorem

A

Equivalent probability measures.

The statistical properties of W* depend on the measure: in general
P(W*(t) ∈ A) ≠ P*(W*(t) ∈ A),

the probabilities are not necessarily the same, so expectations and variances are not the same either.

But the theorem tells us that under the P* measure, W* is a Brownian motion:
independent increments,
Gaussian distribution,
it satisfies the BM properties!
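(Not from the notes: a Monte Carlo illustration of Girsanov with a constant B(s) = b, numpy assumed; then W*(T) = bT + W(T) and Z(T) = exp(-bW(T) - 0.5b²T), and reweighting by Z(T) should make W*(T) look like N(0, T), i.e. Brownian at time T under P*.)

import numpy as np

rng = np.random.default_rng(8)
b, T, n_paths = 1.0, 1.0, 500_000
W_T = rng.normal(0.0, np.sqrt(T), n_paths)      # W(T) under P
W_star = b * T + W_T                            # W*(T) = int_0^T b ds + W(T)
Z_T = np.exp(-b * W_T - 0.5 * b**2 * T)         # Girsanov density Z(T)
# under P, W*(T) has mean bT; weighted by Z(T) (i.e. under P*) it should be N(0, T)
print(W_star.mean())                 # ~bT = 1.0  (mean under P)
print(np.mean(Z_T * W_star))         # ~0.0       (mean under P*)
print(np.mean(Z_T * W_star**2))      # ~T = 1.0   (second moment under P*)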

144
Q

might be a good exercise to check:
Girsanov assumes Z(t) is a martingale; when does this hold?
Novikov's condition
A

says that if
E[exp(0.5∫₀ᵀ |B(s)|² ds)] < ∞,

then (Z(t))_{t∈[0,T]}
is a martingale with respect to F.
———————–

For example, if the process B makes this exponential moment finite, then Z(t) is a martingale.
In particular, if B is a stochastic process bounded by a constant C, the expectation is of course finite.
(Idea of proof: Z is the exponential of an Itô process, so by Itô's formula Z is itself an Itô process; the dt parts cancel, leaving a stochastic integral, which is a martingale if the integrand is in H_T.)
———–
notes (d-dimensional version): if
E[exp(0.5∫₀ᵀ Σ_{i=1}^d |Bᵢ(s)|² ds)] < ∞.

145
Q

skipped but loosely mentioned
corollary: bounded B (the exponential condition holds automatically)

A

Corollary 3.8.3. Assume that B ∈ S^{d×1}_T is bounded, that is, there exists a constant C such that for all i = 1, …, d and all (ω, t) ∈ Ω × [0, T] we have |Bᵢ(ω, t)| ≤ C. Then the conclusions of Theorem 3.8.1 hold.