8 Random Processes: Continuous-time Markov Chains Flashcards

1
Q

continuous-time Markov chains

A

In many applications time flows continuously and events (births, deaths, …) can happen at any random time t ∈ [0, ∞)
(time increments are random, have a distribution)

consider finite/countable state space
X(t)∈ S = {0, 1, 2, . . . }

CTMC

2
Q

CTMC memoryless

A

CTMCs satisfy the Markov property:

Prob{X(tₙ₊₁) = iₙ₊₁ | X(t₀)= i₀,…,X(tₙ)=iₙ}
= Prob{X(tₙ₊₁)=iₙ₊₁|X(tₙ)=iₙ}

the probability of being in state iₙ₊₁ at tₙ₊₁ depends ONLY on being in state iₙ at tₙ

not on entire history

3
Q

CTMC probability distribution

A

A CTMC
X(t) ∈ S = {0, 1, 2, . . . }
is characterized by its probability distribution
{p₀(t), p₁(t), p₂(t), . . .}

where
pᵢ(t) = Prob{X(t) = i ∈ S}
(process at time t has value i)

and the probability vector is
p(t) = (p₀(t), p₁(t), p₂(t), . . .)^T

4
Q

pⱼᵢ

A

Transition probability
pⱼᵢ(t,s)

gives the probability to move from state i at time s to state j at time t>s

pⱼᵢ(t,s) = Prob{X(t)=j | X(s)=i}
= Prob{X(t−s)=j | X(0)=i} = pⱼᵢ(t−s), with t>s

Writing
pⱼᵢ(t,s) = pⱼᵢ(t−s) = pⱼᵢ(∆t)
with ∆t = t−s > 0,
we assume the process is time-homogeneous:

only the time difference t−s > 0 matters, i.e. the transition probability depends only on the time difference.

5
Q

stochastic transition matrix

A

As for DTMCs, the transition probability from state i at time s to state j at time t>s is given by
the element pⱼᵢ(t,s) = pⱼᵢ(t−s) = pⱼᵢ(∆t) of the stochastic transition matrix

P(∆t) = (pⱼᵢ(∆t))

Probability conservation:
Σ_{k∈S} pₖⱼ(∆t) = 1

i.e. each column of P(∆t) adds up to 1.

6
Q

Relation between transition probability and probability distribution is

A

The product:

p(∆t) = P(∆t) p(0)

(probability vector after ∆t = transition probability matrix × probability vector at t = 0)

If the initial state is X(0) = i ∈ S,
then pₖ(0) = δₖ,ᵢ,
i.e. p(0) is the column vector with a 1 in entry i and 0 everywhere else, so

pⱼ(∆t) = Σ_{k∈S} pⱼₖ(∆t) δₖ,ᵢ
= pⱼᵢ(∆t) for j ∈ S

7
Q

δₖ,ᵢ =

A

δₖ,ᵢ =
{1 if i = k
{0 if i ≠ k

8
Q

The Chapman-Kolmogorov equation for CTMCs in matrix form

A

P(s)P(t) = P(t + s)

All this is similar to what we have seen for discrete-time Markov chains

However, in CTMCs:
- Events can occur at random times; time flows continuously (there is no fixed, discrete time increment)
=> the time between two events is randomly distributed
- CTMCs are therefore analyzed in terms of transition rates
9
Q

8.2
transition rates

A

qⱼᵢ
are defined in terms of the transition probabilities as
qⱼᵢ = lim_{∆t→0} [pⱼᵢ(∆t) − pⱼᵢ(0)] / ∆t
=
{lim_{∆t→0} pⱼᵢ(∆t)/∆t if i ≠ j
{lim_{∆t→0} (pⱼⱼ(∆t) − 1)/∆t if i = j

Since pⱼᵢ(∆t) ≥ 0 and ∆t > 0
⇒ qⱼᵢ ≥ 0 if i ≠ j

where pⱼᵢ(0) = δⱼ,ᵢ
(P(0) = I, the identity matrix)

This means
pⱼᵢ(∆t) = δⱼ,ᵢ + qⱼᵢ∆t + o(∆t)
where the higher-order term o(∆t) is infinitesimally small, such that lim_{∆t→0} o(∆t)/∆t = 0

10
Q

When ∆t → 0 (with o(∆t) negligible):

A

Since (using the Kronecker delta and summing over j)

Σ_{j∈S} pⱼᵢ(∆t) = 1
= 1 + ∆t Σ_{j∈S} qⱼᵢ
= 1 + ∆t [Σ_{j∈S\{i}} qⱼᵢ + qᵢᵢ]

the bracket must vanish, giving

qᵢᵢ = − Σ_{j∈S\{i}} qⱼᵢ

11
Q

GENERATOR MATRIX

A

Q = (qⱼᵢ)

whose columns add up to zero and whose off-diagonal elements are non-negative: qⱼᵢ (i ≠ j) is the rate of the transition i → j.

Since P(0) = I, for a CTMC with state space S = {0,1,2,…}, the generator matrix explicitly reads:

(qⱼᵢ) = Q = lim_{∆t→0} [P(∆t) − I]/∆t =

[ q₀₀ q₀₁ q₀₂ … ]
[ q₁₀ q₁₁ q₁₂ … ]
[ q₂₀ q₂₁ q₂₂ … ]
[ …             ]

column i = "from state i", row j = "to state j"; all off-diagonal entries are ≥ 0, and the diagonal entries are
q₀₀ = −Σ_{k∈S\{0}} qₖ₀, q₁₁ = −Σ_{k∈S\{1}} qₖ₁, q₂₂ = −Σ_{k∈S\{2}} qₖ₂, …

Notes:
- each column adds up to 0
- except for the diagonal, all elements correspond to transition rates
- the generator matrix does not depend on ∆t

12
Q

The Chapman-Kolmogorov equation says

A

P(t + ∆t) = P(∆t)P(t)

pⱼᵢ(t + ∆t) = Σ_{k∈S} pⱼₖ(∆t) pₖᵢ(t)
= Σ_{k∈S} pₖᵢ(t) [δⱼₖ + qⱼₖ∆t + o(∆t)]

(δⱼₖ is 1 if j = k and 0 otherwise)

Rearranging and taking the limit ∆t → 0, we obtain the master equation:

lim_{∆t→0} [pⱼᵢ(t + ∆t) − pⱼᵢ(t)] / ∆t
= dpⱼᵢ(t)/dt
= Σ_{k∈S} qⱼₖ pₖᵢ(t)

13
Q

matrix form, the master equation

A

Formal solution:

d/dt P(t) = Q P(t)

giving
P(t) = exp(Qt) P(0) = exp(Qt)

where P(0) = I,
and we clearly have
P(t₁ + t₂) = P(t₁)P(t₂)

(the exponential of a matrix that is generally nonsymmetric and large, so it can be hard to compute explicitly)

14
Q

using
p(t + ∆t) = P(∆t)p(t)

A

Writing
[p(t + ∆t) − p(t)]/∆t = [P(∆t) − I] p(t)/∆t
and taking the limit ∆t → 0,

we obtain the
MASTER EQUATION FOR THE PROBABILITY VECTOR
(d/dt) p(t) = Q p(t)

which gives the equation of motion of the probability vector, with solution
p(t) = exp(Qt) p(0)
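A minimal numerical sketch (my own, not from the lecture): evaluating p(t) = exp(Qt) p(0) with scipy.linalg.expm for a small, made-up 3-state generator matrix.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state generator: columns sum to 0, off-diagonal q_ji = rate i -> j
Q = np.array([[-1.0,  0.5,  0.0],
              [ 1.0, -1.5,  2.0],
              [ 0.0,  1.0, -2.0]])

p0 = np.array([1.0, 0.0, 0.0])   # X(0) = 0, i.e. p_k(0) = delta_{k,0}

t = 0.7
p_t = expm(Q * t) @ p0           # p(t) = exp(Qt) p(0)
print(p_t, p_t.sum())            # probabilities at time t; the sum stays 1
```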

15
Q

relationship between probability vector and transition matrix

when given IC….

A

Often the initial distribution p(0) is known and the relationship between transition probabilities and probability distribution is simple.

For instance, if initially
X(0) = k ∈ S,
then
pᵢ(0) = δᵢ,ₖ

and
pᵢₖ(t) = Prob{X(t) = i | X(0) = k}
= Prob{X(t) = i} = pᵢ(t)

So given the IC we work out Q (which depends on the model of the process) and then use the solution p(t) = exp(Qt) p(0).

16
Q

continuous-time Markov chains with discrete state space

A

S={0,1,2,…}
states

continuous time t ∈ [0, ∞)
with randomly distributed time between events

When time is continuous, ∆t is infinitesimally small and the times between events are randomly distributed; the probability of each state is given by the master equation, which gives the evolution of the probability distribution.

17
Q

master equation
continuous-time MC

A

d/dt p(t) =Qp(t)

Q= (qⱼᵢ)

where qⱼᵢ ≥ 0 is the rate of the transition i → j for i ≠ j

(one equation for each state of the system)

Q is the generator matrix of transition rates; each column sums to 0.

18
Q

A Poisson process is

discrete state
continuous time

e.g. number of goals in a football game

A

a CTMC {X(t) ∈ S : t ∈ [0, ∞)} with S = {0, 1, 2, . . .} and rate λ > 0, defined (as ∆t → 0) by:

* At t = 0: X(0) = 0
* pᵢ₊₁ ᵢ(∆t) = Prob{X(t + ∆t) = i + 1 | X(t) = i} = λ∆t + o(∆t)
(∆t so small we can only move by +1 or 0 in ∆t)
* pᵢᵢ(∆t) = Prob{X(t + ∆t) = i | X(t) = i} = 1 − λ∆t + o(∆t)
* pⱼᵢ(∆t) = Prob{X(t + ∆t) = j ≥ i + 2 | X(t) = i} = o(∆t) if j ≥ i + 2
* pⱼᵢ(∆t) = 0 if j < i

where o(∆t) is infinitesimally small, such that lim_{∆t→0} o(∆t)/∆t = 0, i.e. it decreases to zero faster than ∆t

19
Q

In the Poisson process, there are only transitions into higher states (never down): summary

A

i → i + 1 at rate λ,
i → j > i + 1 with probability o(∆t),
and i → j < i with probability 0

20
Q

For the Poisson process , the transition prob. in is ∆t
pⱼₖ(∆t)
related to
qⱼₖ

A

pⱼₖ(∆t) = δⱼ,ₖ + qⱼₖ∆t + o(∆t)
when ∆t → 0
qⱼₖ= λ(δⱼ,ₖ₊₁ - δⱼ,ₖ)

21
Q

Generator matrix Poisson process

A

Q= (qⱼₖ) =
[−λ 0 0 0 0 . . .]
[λ −λ 0 0 0 . . .]
[0 λ −λ 0 0 . . .]
[0 0 λ −λ 0 . . .]
[. … … …]

qⱼₖ = 0 except in 2 cases:
j = k + 1 (entry λ)
or
j = k (entry −λ)

Each column sums to 0,
which guarantees probability conservation.
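A quick sketch (truncated state space, made-up λ) of building this generator numerically and checking the column sums.

```python
import numpy as np

lam, n_states = 2.0, 6                 # truncate the Poisson generator to 6 states
Q = np.zeros((n_states, n_states))
for k in range(n_states):
    Q[k, k] = -lam                     # q_kk = -lambda
    if k + 1 < n_states:
        Q[k + 1, k] = lam              # q_{k+1,k} = lambda (jump k -> k+1)

print(Q.sum(axis=0))                   # every column sums to 0, except the last
                                       # (the truncation cuts off its outgoing rate)
```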

22
Q

Given IC X(0)=0
pᵢ(0)=

A

pᵢ(0)=δᵢ,₀ = pᵢ,₀(0) and

pᵢ(t)= pᵢ,₀(t)

23
Q

master eq

A

d/dt p(t) =Qp(t)

with probability vector
p(t)= (p₀(t),p₁(t),…)^T

24
Q

system of coupled linear ODES:

d/dt p₀(t)

master equation

A

d/dt p₀(t)= −λp₀(t)
with p₀(0)=1

giving
p₀(t)=exp(−λt)

In matrix form the master equation reads

d/dt (p₀, p₁, …)^T =
[−λ 0 0 0 0 . . .]
[λ −λ 0 0 0 . . .]
[0 λ −λ 0 0 . . .]
[0 0 λ −λ 0 . . .]
[. … … …]
× (p₀, p₁, …)^T

with pᵢ(0) =
{1 if i = 0
{0 if i ≠ 0

Taking the time derivative of each component gives one ODE per state.

25
Q

d/dt p₁(t)

system of coupled linear ODES:

master equation

A

d/dt p₁(t)
= λp₀ − λp₁ = −λp₁(t) + λexp(−λt)

with p₁(0) = 0

p₁(t) = λt exp(−λt)

so this probability also vanishes exponentially as t → ∞

26
Q

d/dt p₂(t)

system of coupled linear ODES:

master equation

A

d/dt p₂(t)
= λp₁ −λp₂
= −λp₂(t) +λ²texp(−λt),
with p₂(0) = 0 ⇒
p₂(t) = ((λt)²/2)exp(−λt)

d/dt pᵢ(t)
= λ(pᵢ₋₁ − pᵢ)
= −λpᵢ(t) + λ(λt)ᶦ⁻¹/(i − 1)! exp(−λt)
with IC
pᵢ(0) = 0 for i > 0
⇒ pᵢ(t) = ((λt)ᶦ/i!) exp(−λt)

We recognize the Poisson distribution of parameter λt

mean E[X(t)] = Var(X(t)) = λt
standard deviation = √(λt)

so as time goes on, deviations from the mean become more pronounced
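A numerical cross-check (my own; the state space is truncated well above λt so the cut-off is harmless): the matrix-exponential solution of the master equation should reproduce the Poisson pmf (λt)ᶦ exp(−λt)/i!.

```python
import numpy as np
from scipy.linalg import expm
from scipy.stats import poisson

lam, t, n_states = 1.5, 2.0, 40                  # lambda*t = 3, truncate at 40 states
Q = -lam * np.eye(n_states) + lam * np.eye(n_states, k=-1)  # Poisson generator matrix

p0 = np.zeros(n_states); p0[0] = 1.0             # X(0) = 0
p_t = expm(Q * t) @ p0                           # master-equation solution p(t) = exp(Qt) p(0)

print(np.allclose(p_t[:10], poisson.pmf(np.arange(10), lam * t)))  # True
```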

27
Q

probability to remain in state 0

This is an example of an important general results: in CTMCs, the time between 2 events is

A

The probability to remain in state 0 vanishes exponentially in t:

p₀(t) = exp(−λt)

Prob{i = 0 → i = 1 at time > t} = exp(−λt)

Prob{i = 0 → i = 1 at time ≤ t}
= 1 − exp(−λt)

This is an example of an important general result: in CTMCs, the time between 2 events is exponentially distributed.

28
Q

Stationary distribution

continuous time MC has unique SD

A

Under generic conditions the CTMC
{X(t) ∈ S : t ∈ [0, ∞)}

with state space
S={0,1,2,…}

has a unique stationary distribution

lim_{t→∞} p(t)= π
= (π₀,π₁,π₂,…)^T
with
Σ_{i∈S} πᵢ = 1
(probability conservation)

29
Q

Stationarity

A

At stationarity d/dt π = 0, i.e.

Qπ = 0 and P(t)π = π

(recall Q = lim_{∆t→0} [P(∆t) − I]/∆t, with I the identity matrix)

Hence the stationary distribution is the right eigenvector of the generator matrix Q with eigenvalue 0.
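A small sketch (made-up 3-state generator) of finding π numerically as the eigenvector of Q with eigenvalue 0, normalised so the probabilities sum to 1.

```python
import numpy as np

# Hypothetical 3-state generator matrix (each column sums to 0)
Q = np.array([[-1.0,  0.5,  0.2],
              [ 1.0, -1.5,  0.8],
              [ 0.0,  1.0, -1.0]])

evals, evecs = np.linalg.eig(Q)
k = np.argmin(np.abs(evals))           # the eigenvalue (numerically) equal to 0
pi = np.real(evecs[:, k])
pi = pi / pi.sum()                     # normalise: probability conservation

print(pi, np.allclose(Q @ pi, 0))      # stationary distribution, and Q pi = 0
```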

30
Q

In general:
At stationarity

A

P(t)π = π
with lim_{t→∞} p(t) = π

⇒ π is also the right eigenvector of P(t) with eigenvalue 1

31
Q

birth and death processes

A

Often the number of individuals in an evolutionary process changes at most by one in each small time increment; these changes are called birth (+1) or death (−1) events.
The class of CTMCs with this property is called "birth-and-death" processes (BDPs).

Time is continuous, states are discrete.

32
Q

birth and death
rates

A

{X(t) ∈ S : t ∈ [0, ∞)} with state space S = {0, 1, 2, . . . }

birth: i → i + 1 at rate λᵢ
death: i → i − 1 at rate µᵢ

e.g. starting at 2, in ∆t we can move to 1 or 3 (or stay at 2)

33
Q

birth and death derive probability of transitions:
{X(t) ∈ S : t ∈ [0, ∞)} with state space S = {0, 1, 2, . . . }

birth: i → i + 1 at rate λᵢ
death: i → i − 1 at rate µᵢ

A

Probability of the transition i → i + k in ∆t:

pᵢ₊ₖ ᵢ(∆t) = Prob{∆X(t) = k | X(t) = i} =

{λᵢ∆t + o(∆t) if k = 1 (birth in state i)
{µᵢ∆t + o(∆t) if k = −1 (death in state i)
{1 − (λᵢ + µᵢ)∆t + o(∆t) if k = 0 (no birth and no death)
{o(∆t) otherwise (negligible)

34
Q

Since there are no deaths in state 0 (no individuals), we have µ₀

A

µ₀ = 0

35
Q

If the state space is
S = {0, 1, . . . , N − 1, N}
λₙ=

A

λₙ = 0
since there cannot be a birth N → N + 1.

Here the population cannot exceed N, so state N can only be reached from N−1.
At state N we either stay there (if N is absorbing, µₙ = 0), or, since we cannot go to N+1, we can only go back down to N−1 (if µₙ > 0).

The corresponding column of the generator matrix, (qₖₙ), tells us which.

36
Q

When the state space is bounded and

S = {0, 1, . . . , N − 1, N} =⇒
Generator matrix:

A

qⱼ₊ₖ ⱼ =
lim_{∆t→0}
[pⱼ₊ₖ ⱼ (∆t) − pⱼ₊ₖ ⱼ(0)]/∆t
=
{λⱼ if k = 1
{µⱼ if k = −1
{−(λⱼ + µⱼ) if k = 0
{0 otherwise

37
Q

Boundaries:
We need to consider the cases j+k=0 and j+k=N separately, since µ₀ = 0 and λ_N = 0

p₀ⱼ(∆t)=
pₙⱼ(∆t)=

A

p₀ⱼ(∆t)=
{µ₁∆t + o(∆t) if j = 1
{1 − λ₀∆t + o(∆t) if j = 0
{o(∆t) otherwise

pₙⱼ(∆t)=
{λₙ₋₁∆t + o(∆t) if j = N − 1
{1 − µₙ ∆t + o(∆t) if j = N
{o(∆t) otherwise.

38
Q

p₀ⱼ(∆t)=
{µ₁∆t + o(∆t) if j = 1
{1 − λ₀∆t + o(∆t) if j = 0
{o(∆t) otherwise

pₙⱼ(∆t)=
{λₙ₋₁∆t + o(∆t) if j = N − 1
{1 − µₙ ∆t + o(∆t) if j = N
{o(∆t) otherwise.

giving q₀ⱼ=

A

q₀ⱼ=
lim_{∆t→0}
[p₀ⱼ (∆t) − p₀ⱼ (0)]/∆t =

{µ₁ if j = 1
{−λ₀ if j = 0
{0 otherwise.

qₙⱼ=
lim_{∆t→0}
[pₙⱼ (∆t) − pₙⱼ (0)]/∆t =

{λₙ₋₁ if j = N − 1
{−µₙ if j = N
{0 otherwise.

39
Q

SUMMARY:
class of continuous time MCs

A

We looked at birth-and-death processes.

Here the only changes of state in a time increment ∆t are up by one unit (BIRTH) or down by one unit (DEATH).

The rates are λᵢ and µᵢ; the index i depends on the state we are in.

We wrote down the master equation for the probabilities and derived the generator matrix Q.

In general each column has three non-zero entries (birth, death, neither); the sum of the elements of a column of Q has to add to zero.

40
Q

we have to take care about what happens at boundaries.

A

e.g. for state 0:
we can reach state 0 only if there is a death in state 1 (i=1 → j=0).
The corresponding rates differ at the boundary, e.g. µ₀ = 0 at state 0,

meaning
p₀ⱼ(∆t)=
{µ₁∆t + o(∆t) if j = 1
{1 − λ₀∆t + o(∆t) if j = 0
{o(∆t) otherwise

If state 0 is absorbing, λ₀ = 0 and we stay there;
otherwise we may assume λ₀ > 0.

41
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}

RATES
(BIRTH AND DEATH)

A

This means the state space is bounded.

birth
i → i + 1 at rate λᵢ
death
i → i − 1 at rate µᵢ

from 0 to 1: λ₀
from 1 to 0: µ₁
from 1 to 2: λ₁
etc.
In general the birth and death rates vary with i.

Only death from j = 1 into j = 0;
no birth from j = 0.

Only birth from j = N − 1 into j = N.

42
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}
TRANSITION RATES

qⱼ₊ₖ,ⱼ

A

for j + k ≠ 0, N

qⱼ₊ₖ,ⱼ =
{λⱼ if k = 1
{µⱼ if k = −1
{−(λⱼ + µⱼ) if k = 0
{0 otherwise

(Note the index convention: the second index is the "from" state, the first index the "to" state.)

43
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}
TRANSITION RATES

q₀ⱼ

A

q₀ⱼ=
{µ₁ if j = 1
{−λ₀ if j = 0
{0 otherwise.
only death from j = 1 into j = 0
no birth from j = 0

(these are the rates INTO state 0)

The birth rate λ₀ out of state 0 might be 0 or not, depending on the model; it appears as −λ₀ on the diagonal.

44
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}
TRANSITION RATES

qₙⱼ

A

capital N here

qₙⱼ=
{λₙ₋₁ if j = N − 1
{−µₙ if j = N
{0 otherwise.

only birth from j = N − 1 into j = N

INTO N

45
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}
GENERATOR MATRIX

A

The generator matrix is tridiagonal: each column/row has 3 non-zero entries, except the first and last (from the boundary transition rates).

Each column i lists where we can transition from state i: the birth rate sits below the diagonal, the death rate above the diagonal, and a negative entry (minus the total outflow) sits on the diagonal; each column sums to 0.

Q =

[ −λ₀   µ₁         0          0    …               ]
[  λ₀  −(λ₁+µ₁)    µ₂         0    …               ]
[  0     λ₁       −(λ₂+µ₂)    µ₃   …               ]
[  0     0          λ₂        …    …               ]
[  …                    λₙ₋₂  −(λₙ₋₁+µₙ₋₁)   µₙ    ]
[  0     …          0          λₙ₋₁         −µₙ    ]

The first and last columns/rows differ due to the boundary conditions.
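A sketch (my own helper, arbitrary rates) that assembles this tridiagonal generator from the vectors of birth and death rates and checks that every column adds up to 0.

```python
import numpy as np

def bd_generator(birth, death):
    """Generator Q of a birth-death process on {0,...,N}:
    birth[i] = lambda_i (i -> i+1), death[i] = mu_i (i -> i-1)."""
    N = len(birth) - 1
    Q = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            Q[i + 1, i] = birth[i]            # birth rate, below the diagonal
        if i > 0:
            Q[i - 1, i] = death[i]            # death rate, above the diagonal
        Q[i, i] = -(birth[i] + death[i])      # diagonal: minus the total outflow
    return Q

# Hypothetical rates on S = {0,...,4}, with mu_0 = 0 and lambda_N = 0
lam = np.array([1.0, 1.0, 1.0, 1.0, 0.0])
mu  = np.array([0.0, 0.5, 0.5, 0.5, 0.5])
Q = bd_generator(lam, mu)
print(np.allclose(Q.sum(axis=0), 0))          # each column adds up to 0
```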

46
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}

MASTER EQUATION
for probability vector

A

probability vector
p(t) = (p₀(t), p₁(t), p₂(t), …, pₙ(t))^T

d/dt p(t) = Q p(t)

giving

p(t) = exp(Qt) p(0)

(generator matrix times p)

If we can compute the matrix exponential we are done, but this can be complicated for this matrix.
What we find instead is that for each pⱼ we have a differential-difference equation coupled to the others.

47
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}

for each p_i we have a differential-difference equation coupled with other such equations
for j ∈ {1, 2, . . . , N − 1}

A

From the entries of the matrix product Qp:

(d/dt)pⱼ = λⱼ₋₁pⱼ₋₁(t) + µⱼ₊₁pⱼ₊₁(t) − (λⱼ + µⱼ)pⱼ(t)
GAIN TERMS − LOSS TERMS

for j ∈ {1, 2, …, N − 1}

This is a "balance equation", with gain terms for the birth/death reactions flowing into j and loss terms for those flowing out of j:
ENTER j from j−1 (birth), ENTER j from j+1 (death);
EXIT j to j+1 (birth), EXIT j to j−1 (death).

(Solving this is non-trivial: each pⱼ depends on the others!)

48
Q

RECAP:
Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space
{0, 1, 2, …, N}

for each p_i we have a differential-difference equation coupled with other such equations

The equations at the boundaries

A

Equations at the boundaries j=0 and j=N:

(d/dt)p₀(t) = µ₁p₁(t) − λ₀p₀(t)
(no death from j=0, no birth from j=−1)

(d/dt)pₙ(t) = λₙ₋₁pₙ₋₁(t) − µₙpₙ(t)
(no birth from j=N, no death from j=N+1: we enter N only by a birth in state N−1 and exit it only by a death in state N)

In total: N+1 coupled linear ODEs.
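A sketch (constant made-up rates) that integrates these coupled balance equations, boundaries included, directly with scipy's solve_ivp.

```python
import numpy as np
from scipy.integrate import solve_ivp

N, lam, mu = 5, 1.0, 0.5
birth = np.array([lam] * N + [0.0])            # lambda_j, with lambda_N = 0
death = np.array([0.0] + [mu] * N)             # mu_j, with mu_0 = 0

def master_rhs(t, p):
    dp = np.empty_like(p)
    for j in range(N + 1):
        gain = (birth[j - 1] * p[j - 1] if j > 0 else 0.0) \
             + (death[j + 1] * p[j + 1] if j < N else 0.0)
        loss = (birth[j] + death[j]) * p[j]
        dp[j] = gain - loss                    # gain terms - loss terms
    return dp

p0 = np.zeros(N + 1); p0[2] = 1.0              # start in state 2
sol = solve_ivp(master_rhs, (0.0, 4.0), p0)
print(sol.y[:, -1], sol.y[:, -1].sum())        # p(4); probabilities still sum to 1
```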

49
Q

Let X(t) be a birth and death process
{X(t) ∈ S : t ∈ [0, ∞)}
on state space

state space is unbounded

A

If S = {0, 1, 2, …, ∞}, the special equation above for j=N is dropped: the general balance equation is then valid for

j ∈ {1, 2, …, ∞}

and is coupled to the special equation for j=0.

50
Q

How to solve these equations and find p_j∈S (t)

A

Generating functions (when possible), otherwise get useful info from the moments of X(t)

51
Q

Simple birth process (Yule process) intro

A

This is a process with only birth events, occurring at a LINEAR rate:

"simple birth process" with rate λ
⇒ λ_j = jλ

and no deaths:
µ_j = 0

52
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

A

unbounded since the number of individuals can only grow without limit from a given IC N

53
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

TRANSITION RATES

A

q_{j+k, j} =

{jλ if k = 1
{−jλ if k = 0
{0 otherwise

initially
X(0) = N ⇒
p_j (0) = δ_{j,N}
=
{1 if j = N
{ 0 otherwise

54
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

GENERATOR MATRIX

A

FROM
q_{j+k, j} =

{jλ if k = 1
{−jλ if k = 0
{0 otherwise

initially
X(0) = N ⇒
p_j(0) = δ_{j,N} =
{1 if j = N
{0 otherwise

Q =
[ 0   0    0    0   … ]
[ 0  −λ    0    0   … ]
[ 0   λ  −2λ    0   … ]
[ 0   0   2λ  −3λ   … ]
[ …   …   …    …    … ]

All columns add to 0; the death rates are 0; the birth rate out of state j is jλ (it appears just below the diagonal entry −jλ).

55
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0
MASTER EQUATION

(d/dt)p_j

A

(d/dt)pⱼ(t)
= λ[(j − 1)pⱼ₋₁(t) − jpⱼ(t)],

for j = N, N + 1, . . .

(d/dt)pⱼ(t) = 0
for j = 0, 1, …, N−1

Since X(t) = j ∈ {N, N + 1, . . .},
j can never be < N.
Equivalently: pⱼ(t) = 0 for j < N.

56
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

FINDING pⱼ using the generating function

A

pⱼ is found using the generating function

G(z, t) = E(zˣ⁽ᵗ⁾)
= Σ_{j=0,∞} pⱼ(t) zʲ
= Σ_{j=N,∞} pⱼ(t) zʲ

Idea: use the ME to find an equation (PDE) for G, solve that equation, and find a way to obtain pⱼ(t) from G.

57
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

Idea: use the ME to find an equation (PDE) for G, solve that equation, and find a way to obtain pⱼ(t) from G

A

∂G(z, t)/∂t
= (d/dt) Σ_{j=0,∞} pⱼ(t) zʲ

from the ME:
= λ Σⱼ₌ₙ₊₁,∞ (j−1)pⱼ₋₁ zʲ − λ Σⱼ₌ₙ,∞ jpⱼ zʲ
(the first sum can start at j=N+1 since the j=N term gives 0)

= λz² Σⱼ₌ₙ,∞ jpⱼ zʲ⁻¹ − λz Σⱼ₌ₙ,∞ jpⱼ zʲ⁻¹
(shifting j → j−1 in the first sum; each remaining sum is ∂G/∂z)

= λz(z−1) ∂G(z,t)/∂z

with G(z,0) = Σⱼ₌ₙ,∞ pⱼ(0) zʲ = z^N
(pⱼ(0) = 1 if j=N, 0 otherwise)

58
Q

with G(z,0) = Σⱼ₌ₙ,∞ pⱼ(0)zʲ

A

G(z,0) = Σⱼ₌ₙ,∞ pⱼ(0)zʲ = z^N
(pⱼ(0) = 1 if j=N, 0 otherwise)

59
Q

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

The linear PDE for G
∂G(z, t)/∂t
=
λz(z−1) ∂G(z,t)/∂z

with G(z,0) = Σⱼ₌ₙ,∞ pⱼ(0)zʲ = z^N

derived from the master equation

A

G(z,t)

The PDE is simple enough to be solvable using the method of characteristics (outside the scope of this module) ⇒

G(z,t) = z^N exp(−λNt) / [1 − z(1 − exp(−λt))]^N

We recognize the generating function of a negative binomial distribution (see App. D, Sec. 7.4/7.5).
From the properties of the negative binomial distribution, we obtain the solution of the ME:

p_{j+N}(t) = ( (N+j−1) choose j ) exp(−λNt) (1 − exp(−λt))ʲ

for j = 0, 1, 2, …

60
Q

G(z,t)

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0

MEAN

(z^N exp(-λNt))/
[1-z(1-exp(-λt))]^N

A

E(X(t)) = ∂_z G|_{z=1} = N exp(λt)

⇒ the mean population grows exponentially in time

61
Q

G(z,t)

Simple birth process (Yule process)

linear birth rate
λ_j = jλ

state space
S = {N, N + 1, . . . , ∞}
X(0)=N>0
VARIANCE

(z^N exp(-λNt))/
[1-z(1-exp(-λt))]^N

A

var(X(t)) =
N exp(2λt)(1 − exp(−λt))
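A simulation sketch (parameters are made up) of the Yule process: each waiting time is exponential with rate jλ and every event is a birth, so the empirical mean at time t should sit near N exp(λt), with a spread matching the variance above.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, N0, t_end, runs = 0.8, 3, 2.0, 5000

def yule_at(t_end):
    t, j = 0.0, N0
    while True:
        t += rng.exponential(1.0 / (j * lam))   # waiting time ~ Exp(j*lambda)
        if t > t_end:
            return j
        j += 1                                  # birth: j -> j + 1

samples = np.array([yule_at(t_end) for _ in range(runs)])
print(samples.mean(), N0 * np.exp(lam * t_end))            # ~ E(X(t)) = N exp(lambda t)
print(samples.var(), N0 * np.exp(2 * lam * t_end) * (1 - np.exp(-lam * t_end)))
```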

62
Q

It is often difficult to obtain an expression for G and then to get p_j(t) from it, so we
COMPUTE THE FIRST MOMENTS OF X(t)

mean

A

1st (raw) moment is the mean
E(X(t)) = ∂_z G|_z=1

m(t) ≡ E(X(t)) = Σ_j∈S jpⱼ(t)

63
Q

It is often difficult to obtain an expression for G and then to get p_j(t) from it, so we
COMPUTE THE FIRST MOMENTS OF X(t)

from the master equation:

Variance

A

var(X(t)) = m_2(t) − m^2(t)
SECOND MOMENT - [FIRST MOMENT]^2

second moment is
m_2(t) ≡ E(X²(t)) = Σ_j∈S j²pⱼ(t)

Hence, when t is large the standard deviation from the mean grows exponentially as exp(λt)

64
Q

from the master equation:
find the first moment

(d/dt)pⱼ(t)
= λ[(j − 1)pⱼ₋₁(t) − jpⱼ ],

for j = N, N + 1, . . . ,

(d/dt)pⱼ(t)=0
for j=0,1,…,N-1

Since X(t) = j ∈ {N, N + 1, . . . },
j can never be < N
Equivalently: p_j<N = 0

A

(alternatively E(X(t)) = ∂_z G|_{z=1})

dm/dt
= Σⱼ₌₀,∞ j dpⱼ/dt

Substituting dpⱼ/dt from the ME:

= λ Σⱼ₌₀,∞ j(j − 1)pⱼ₋₁ − λ Σⱼ₌₀,∞ j²pⱼ

shifting j−1 to j in the first sum:
= λ Σⱼ₌ₙ,∞ j(j + 1)pⱼ − λ E[X²]
= [λ − λ] m_2(t) + λ m(t)

thus
m^dot = λm

so the mean is
m(t) = N exp(λt)

i.e. on average, the population grows exponentially in time

65
Q

from the master equation:
find the second moment

A

m_2(t) ≡ E(X²(t)) = Σ_{j∈S} j²pⱼ(t)

m_2^dot
= Σ_{j∈S} j² (d/dt)pⱼ(t)
from the ME
= λ Σⱼ₌ₙ,∞ j²(j − 1)pⱼ₋₁ − λ Σⱼ₌ₙ,∞ j³pⱼ

shifting j−1 to j:
= λ Σⱼ₌ₙ,∞ (j+1)² j pⱼ − λ E[X³]

= (λ − λ)m_3 + 2λ m_2 + λ m

thus
m_2^dot = 2λ m_2(t) + λ m(t)
= λ[2 m_2(t) + N exp(λt)]

with
m_2(0) = N²

giving
m_2(t) = (N exp(λt))² + λN exp(2λt) ∫_[0,t] exp(−λs) ds
= m(t)² + N(exp(2λt) − exp(λt))

Hence, when t is large the standard deviation from the mean grows exponentially as exp(λt).

66
Q

overview
why are boundary conditions important?

A

conditions at boundaries are important and can lead to population
extinction or to its unbounded growth.

67
Q

Goal of this lecture: simple birth and death process of Question 3 of Example Sheet 4

A

missing recording :(

68
Q

Q3 EXample sheet 4

We consider a population of size given by a birth-and-death process
X(t) ∈ S = {0, 1, 2, . . . , ∞}
where t ∈ [0, ∞) is the continuous time. The population evolves through simple birth and death events with
rates
λ_j =jλ
BIRTH RATE λ>0

µ_j=jµ
death rate µ>0

IC X(0)=N>0
p_j(0)= δ_{j,N}

The state space is unbounded (semi-infinite) as there is no upper bound in the number of individuals
in the population. However, since λ_0 = µ_0 = 0
there can be no birth or death from state j=0, the
state j=0 is here an absorbing state;
j=0 is the only absorbing state of this birth-and-death process:

What is the master equation (ME) for this simple birth-and-death process?
INCLUDING CASE AT 0
(d/dt)pⱼ
= (λⱼ₋₁pⱼ₋₁(t) + µⱼ₊₁pⱼ₊₁(t) )-(λⱼ +µⱼ)pⱼ
GAIN TERMS - LOSS TERMS

for j ∈ {1, 2, . . . , N − 1}

A

if it reaches the state j=0, the population is extinct and does not recover. However, since the state space is
semi-infinite, absorption and extinction are not guaranteed!

As seen in lecture 8.II and Section 8.5 of the lecture notes, the ME here is the set of balance equations

(d/dt)pⱼ
= (j−1)λ pⱼ₋₁(t) + (j+1)µ pⱼ₊₁(t) − j(λ + µ)pⱼ(t)
GAIN TERMS − LOSS TERMS
(birth from j−1, death from j+1, minus the loss terms)

for j ∈ {1, 2, …}

(obtained by substituting in the linear rates)

coupled to a special equation at j=0 (absorbing boundary):

(d/dt)p_0(t) = µ p_1(t)

Here λ₀ = 0·λ = 0 and pⱼ(t) = 0 for j < 0.

The state j=0 is ABSORBING, so there is only one possible transition into it: 1 → 0 at rate µ_1 = µ.

69
Q

Q3 EXample sheet 4
Can we solve the ME and find the p_j(t) , alternatively what can we do?

A

Finding the p_j(t) explicitly is complicated, even though it is possible to find the generating function

G(z, t) = E(zˣ⁽ᵗ⁾) = Σ_[j=0,∞] zʲ p_j(t)

which turns out to be a rather complicated expression.

70
Q

Q3 EXample sheet 4

for the mean
compute the first and second moments of the process from the ME.

(d/dt)pⱼ
= (j-1)λpⱼ₋₁(t) + (j+1)µpⱼ₊₁(t) -j(λ +µ)pⱼ
GAIN TERMS - LOSS TERMS
birth from j-1 death from j+1 - loss term

for j ∈ {1, 2, …}

A

m(t) ≡ E(X(t)) = Σ_{j∈S} jpⱼ(t)

Multiply both sides of the ME by j and sum over j ∈ S:

m^dot(t)
= Σⱼ₌₀,∞ j(d/dt)pⱼ
by the ME
= λ Σⱼ₌₀,∞ j(j−1)pⱼ₋₁(t) + µ Σⱼ₌₀,∞ j(j+1)pⱼ₊₁(t) − (λ + µ) Σⱼ₌₀,∞ j²pⱼ(t)

shifting the summation indices (j−1 → j in the first sum, j+1 → j in the second):
= λ Σⱼ₌₀,∞ (j+1)jpⱼ + µ Σⱼ₌₀,∞ (j−1)jpⱼ − (λ+µ)m_2(t)

= λ(m_2(t) + m(t)) + µ(m_2(t) − m(t)) − (λ+µ)m_2(t)
(the m_2 terms cancel out)

so m^dot(t) = (λ − µ)m(t)

with m(0) = N

giving m(t) = N exp((λ − µ)t)

As t tends to infinity this tends to:

∞ if λ > µ
0 if λ < µ
N if λ = µ

depending on λ and µ.

71
Q

Q3 EXample sheet 4
2nd moment

A

m_2(t) = E[X²(t)] = Σ_{j∈S} j²pⱼ(t)

Working this out in a similar way (see notes):

m_2^dot(t) = 2(λ−µ)m_2(t) + (λ+µ)m(t)
using the solution for m(t):
= 2(λ−µ)m_2(t) + (λ+µ)N exp((λ−µ)t)
with m_2(0) = N²

Solve this (inhomogeneous) linear ODE with a standard method,
e.g. the "variation of the constant" method or a suitable integrating factor.

72
Q

m_2^dot (t) = 2(λ-µ)m_2(t) + (λ+µ)Nexp((λ-µ)t)
m_2(0)=N^2

Solving this (inhomogeneous) linear ODE with standard method
E.g, using the “variation of the constant method” or a suitable integrating factors

A

The notes show
m_2(t) =

N² exp(2(λ−µ)t) + exp(2(λ−µ)t) ∫_[0,t] exp(−2(λ−µ)s) N(λ+µ) exp((λ−µ)s) ds

(the first exponential inside the integral comes from the integrating factor)

=
{ m(t)² + N[(λ+µ)/(λ−µ)] exp((λ−µ)t)(exp((λ−µ)t) − 1) if λ ≠ µ
{ N² + 2Nλt if λ = µ

73
Q

Q3 EXample sheet 4 variance

A

var(X(t))
= m_2 − m²
=
{N[(λ+µ)/(λ−µ)] exp((λ−µ)t)(exp((λ−µ)t) − 1) if λ ≠ µ
{2Nλt if λ = µ

This means that as t tends to infinity, Var(X(t)) tends to:
∞ when λ > µ
0 when λ < µ

74
Q

q3 ps4 summary behaviour
λ<µ:

λ>µ:

λ=µ:

A

Cases:

λ > µ: both the average population size and the fluctuations grow in time; the mean and the standard deviation (= √variance) grow exponentially as ∼ exp((λ−µ)t) ⇒ extinction is possible but not certain; in general the population grows, and extinction is unlikely (it occurs with a very small probability) when N is finite but large.
(Sample paths fluctuate about an exponentially growing mean; some paths may still go extinct.)

λ < µ: mean and variance both tend to 0; this confirms that the population eventually goes extinct.

λ = µ: the mean is constant but the standard deviation grows ∼ √(λt) ⇒ extinction is certain (occurs with probability 1).
(Sample paths are noisy about the constant m(t) and eventually decrease to 0 as time goes on.)

75
Q

LEVEL 5 INTEREVENT TIMES
INTRO

A

In continuous-time Markov chains, the time between two successive events (the interevent or waiting time) is a random variable: how are the interevent times distributed? In the simple birth-and-death process the state j=0 is absorbing ⇒ does this imply that extinction is guaranteed?

Goal of this lecture: general results on the interevent times in CTMCs and a theorem on the
extinction probability in birth-and-death processes

76
Q

LEVEL 5 INTEREVENT TIMES

T_i

A

In continuous-time Markov chains (CTMCs), the time between 2 successive events (birth, death, …),
i.e. between events i and i+1, is a random variable
Tᵢ

{Tᵢ}_{i≥0}
is the set of interevent, or waiting, times between events i and i+1

T₀ = time to the first event

77
Q

LEVEL 5 INTEREVENT TIMES

W_i

A

Wᵢ = time at which event (transition) i occurs,
i.e. a birth or death happens at precisely these times

Tᵢ = W_{i+1} − Wᵢ

(Figure: sample path of a BD process showing the times between events; steps of varying length, ascending for births and descending for deaths, e.g. T_1 = W_2 − W_1.)

78
Q

L5
What is the intervent times distribution ?

A

General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

For the Poisson process (Sec. 8.3) we found that the interevent times were exponentially distributed:
p_0(t) = exp(−λt) when starting in state 0,
meaning the probability of staying in state 0 vanishes exponentially with time.

79
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

PROBABILITY OF TRANISTION OF X moving
n → j ̸= n in ∆t is
Prob{Tᵢ ≤ ∆t}

A

Consider a CTMC X(t) of range S with
X(Wᵢ) = n ∈ S.

THE PROBABILITY THAT X moves
n → j ≠ n in ∆t is

Prob{Tᵢ ≤ ∆t} = Σ_{j≠n, j∈S} pⱼₙ(∆t)

(the sum of the probabilities of all the moves away from n)

We have
pⱼₙ(∆t) = qⱼₙ∆t + o(∆t) (the o(∆t) term is vanishingly small and can be neglected)
qₙₙ = − Σ_{j≠n, j∈S} qⱼₙ
(property of the generator matrix: the elements of each column add up to 0)

so this becomes
Prob{Tᵢ ≤ ∆t} = Σ_{j≠n, j∈S} qⱼₙ∆t + o(∆t) = |qₙₙ|∆t + o(∆t)

80
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

PROBABILITY OF TRANISTION OF X not moving in ∆t
Prob{Tᵢ > ∆t}

A

Prob{Tᵢ > ∆t} = 1 − Prob{Tᵢ ≤ ∆t}
= 1 − |qₙₙ|∆t + o(∆t)

(the o(∆t) term is negligible as ∆t → 0)

81
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

pₙₙ(2∆t)
probability of___

A

pₙₙ(2∆t)
probability of being in state n still after 2∆t
with ∆t → 0 is

pₙₙ(2∆t) = pₙₙ(∆t)pₙₙ(∆t)= p²ₙₙ(∆t)

( prob. to be in state n after ∆t multiplied by the probability
of staying in that state for another ∆t)
AS TIME IS HOMOGENEOUS

82
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

slice time t
into t/∆t

pₙₙ(t)=exp(-|qₙₙ|t)

A

The waiting time in CTMCs is an exponentially distributed random variable:
the probability of remaining in state n decays exponentially with time.

82
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

slice time t
into t/∆t

intervals ∆t

A

pₙₙ(t) = pₙₙ((t/∆t)·∆t)

= pₙₙ(∆t)·pₙₙ(∆t)·…·pₙₙ(∆t)
(product of t/∆t factors)

= (pₙₙ(∆t))^{t/∆t}

= (1 − |qₙₙ|∆t + o(∆t))^{t/∆t}

As ∆t → 0 the limit converges to exp(−|qₙₙ|t),

so
pₙₙ(t) = exp(−|qₙₙ|t)

82
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

if pₙₙ(t)=exp(-|qₙₙ|t)

then

cumulative waiting times distribution
Fᵢ(t) ≡ Prob{Tᵢ ≤ t}

A

Fᵢ(t) ≡ Prob{Tᵢ ≤ t}
= 1 − Prob{Tᵢ = Wᵢ₊₁ − Wᵢ > t}
= 1 - exp(-|qₙₙ|t)

because
Prob{Tᵢ = Wᵢ₊₁ − Wᵢ > t}=
pₙₙ(t)=e−|qnn|t

Fᵢ is the cumulative waiting times distribution, with
Fᵢ(0)=0
Fᵢ(∞) = 1

83
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

cumulative waiting times distribution
Fᵢ(t) ≡ Prob{Tᵢ ≤ t}
= 1 - exp(-|qₙₙ|t)

OBTAIN THE PROBABILITY DENSITY

A

By definition, differentiate Fᵢ with respect to t to get the
PROBABILITY DENSITY:

dFᵢ/dt
= (d/dt) Prob{Tᵢ ≤ t}
= (d/dt) [1 − exp(−|qₙₙ|t)]
= |qₙₙ| exp(−|qₙₙ|t)

⇒ The probability to remain in state n, if X(0)=n, after time t decays exponentially with t.
⇒ The probability to have moved into another state j ≠ n, if X(0)=n, by time t is exponentially close to 1.

(Eventually the process will move away.)

84
Q

L5
General result: the interevent (waiting) times in CTMCs are exponentially distributed
random variables

probability density function
= |qₙₙ| exp(-|qₙₙ|t)

cumulative waiting times distribution
Fᵢ(t) ≡ Prob{Tᵢ ≤ t}
= 1 - exp(-|qₙₙ|t)

what is

Prob{t₁ ≤ Tᵢ ≤ t₂}

A

Prob{t₁ ≤ Tᵢ ≤ t₂}=
Fᵢ(t₂)- Fᵢ(t₁)
=
exp(-|qₙₙ|t₁) - exp(-|qₙₙ|t₂)

(the difference between two exponentials, from the cumulative distribution)

85
Q

L5 our findings are summarised in this thm 8.1

SHOWN in lectures and in notes

Exponential distribution of waiting times).

A

Let {X(t) ∈ S : t ∈ [0,∞)}
be a CTMC with transition matrix P(t) = (pⱼₖ(t)) and generator matrix Q = (qⱼₖ), with
j, k ∈ S, and

Σ_{j≠n} pⱼₙ(∆t)
= |qₙₙ|∆t + o(∆t),

∀n ∈ S.
Given X(Wᵢ) = n, the waiting time Tᵢ = W_{i+1} − Wᵢ (i ≥ 0) is an exponential random variable with parameter |qₙₙ| and

cumulative distribution function
Fᵢ(t) = Prob{Tᵢ ≤ t}
= 1 − exp(−|qₙₙ|t)

The mean and variance of Tᵢ are
E(Tᵢ) = |qₙₙ|⁻¹
and
var(Tᵢ) = |qₙₙ|⁻²
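A quick numerical check of the theorem (|qₙₙ| is a made-up value): sampling exponential waiting times with parameter |qₙₙ| reproduces the stated mean, variance and cumulative distribution.

```python
import numpy as np

rng = np.random.default_rng(1)
q_nn = 2.5                                           # |q_nn|: total rate out of state n

T = rng.exponential(scale=1.0 / q_nn, size=200_000)  # waiting times T_i ~ Exp(|q_nn|)
print(T.mean(), 1 / q_nn)                            # E(T_i)   = |q_nn|^-1
print(T.var(),  1 / q_nn**2)                         # var(T_i) = |q_nn|^-2
print((T <= 0.4).mean(), 1 - np.exp(-q_nn * 0.4))    # F_i(t) = 1 - exp(-|q_nn| t)
```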

86
Q

difference between transition matrix and generator matrix

A

Transition matrix P(t) = (pⱼₖ(t)) and generator matrix Q = (qⱼₖ), with j, k ∈ S.

Generator matrix Q:
- contains the transition rates between the different states of the Markov process
- qⱼₖ is the rate of transitioning from state k to state j (the convention used in these notes)
- the diagonal entries are defined so that each column sums to 0: qₖₖ = − Σ_{j≠k} qⱼₖ
- it gives the rate of change of the probability vector: d p(t) = Q p(t) dt

Transition matrix P(t):
- describes the probabilities of moving between states over a time t
- pⱼₖ(t) is the probability of being in state j at time t given the process starts in state k
- it satisfies (d/dt)P(t) = Q P(t), with solution

P(t) = exp(Qt)
a matrix exponential which can be hard to find

87
Q

L5 our findings are summarised in this thm 8.1

applied to poisson process seen earlier

A

For the Poisson process, Sec. 8.3, we have
|q_nn| = λ ⇒

E(T_i) = 1/λ,
var(T_i) = 1/λ²,

F_i(t) = 1 − exp(−λt)

88
Q

L5:
A theorem on the probability of extinction in birth-and-death processes
Is the absorbing state X=0 always reached and the population always goes extinct?

what happens when λ ≥ µ > 0

A

We have seen that for the simple birth-and-death process
{X(t) ∈ {0, 1, . . . , ∞} : t ∈ [0, ∞)}
with birth rate λ_j = jλ
and death rate µ_j = jµ:

as t → ∞,
E[X(t)] → ∞ (for λ > µ; it stays at N for λ = µ) and
Var(X(t)) → ∞
when λ ≥ µ > 0

(the per-capita birth rate exceeds, or equals, the per-capita death rate)

Is the absorbing state X=0 always reached?
Possible but not certain: due to the fluctuations from random birth and death events, the standard deviation grows in time and can exceed the mean of X(t), causing extinction, but unbounded growth is also possible.
(THE MEAN GROWS, BUT DEVIATIONS ABOUT IT ALSO GROW.)

89
Q

L5:
in simple birth and death process
p₀(∞) CASES

also as t → ∞ what happens to X(∞) ?

A

Probability of extinction:

it depends on whether the state space is bounded and on X(0).

When t → ∞:
X(∞) = 0 with probability p₀(∞)
(EXTINCT)
and
X(∞) → ∞ with probability 1 − p₀(∞)
(GROWS WITHOUT BOUND)

i.e. the population either goes extinct or grows without bound;
these probabilities need not be 0 and 1, they can be in between.

90
Q

L5:
General result stated as
p₀(∞)

EXTINCTION certain?impossible? possible? cases

A

If S = {0, 1, 2, . . . , ∞} is UNBOUNDED,
µ₀ = λ₀ = 0 (the state 0 is absorbing),
and X(0) = n ≥ 1, we have:

EXTINCTION CERTAIN, p₀(∞) = 1,
if Σᵢ₌₁∞ [µ₁µ₂…µᵢ]/[λ₁λ₂…λᵢ] = ∞ (DIVERGES)

EXTINCTION NOT CERTAIN but possible,
p₀(∞) =
[Σᵢ₌₁∞ [µ₁µ₂…µᵢ]/[λ₁λ₂…λᵢ]] /
[1 + Σᵢ₌₁∞ [µ₁µ₂…µᵢ]/[λ₁λ₂…λᵢ]],
if Σᵢ₌₁∞ [µ₁µ₂…µᵢ]/[λ₁λ₂…λᵢ] < ∞ (FINITE)

EXTINCTION ALMOST IMPOSSIBLE, p₀(∞) ≈ 0,
if Σᵢ₌₁∞ [µ₁µ₂…µᵢ]/[λ₁λ₂…λᵢ] < ∞ AND the initial population is large, n ≫ 1
(FINITE sum AND LARGE initial population)

91
Q

L5:Theorem 8.2:
birth and death process

FOR THE SIMPLE BIRTH AND DEATH PROCESSES ON
S= {0, 1, 2, . . . , ∞}

A

p₀(∞)=
{1 if µ ≥ λ
{(µ/λ)ⁿ if µ < λ

(cf. Q3 of Example Sheet 4; you should know how to sum geometric progressions)
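A Monte Carlo sketch (my own, made-up λ, µ, n): since the per-capita factors cancel, each event of the simple BDP is a birth with probability λ/(λ+µ), so the embedded jump chain is enough to estimate the extinction probability and compare it with (µ/λ)ⁿ.

```python
import numpy as np

rng = np.random.default_rng(2)
lam, mu, n0, runs = 1.0, 0.6, 2, 5000       # lambda > mu, X(0) = n0

def goes_extinct(x_cap=200):
    # Embedded jump chain: each event is a birth w.p. lambda/(lambda+mu), else a death.
    x = n0
    while 0 < x < x_cap:                    # stop at 0, or once survival is essentially certain
        x += 1 if rng.random() < lam / (lam + mu) else -1
    return x == 0

est = np.mean([goes_extinct() for _ in range(runs)])
print(est, (mu / lam) ** n0)                # Theorem 8.2: p_0(infinity) = (mu/lambda)^n
```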

92
Q

L5:Theorem 8.2:
birth and death process
what happens on finite state space? with absorbing state 0?

A

S = {0, 1, 2, . . . , N}
µ₀ = λ₀ = 0

If λₙ = 0 and µₙ > 0 (⇒ state N is not absorbing) and X(0) = n ≥ 1, we have

p₀(∞) = 1,

meaning EXTINCTION IS ALWAYS CERTAIN:
the process will end up reaching the absorbing state 0
(possibly only after a long time).
