Continuous-time Markov Chains Flashcards

1
Q

Continuous-time Markov chains
▪ Definition

A

Let (X(t))ₜ⩾₀ be a continuous-time stochastic process with countable state space S. If:
▪ for all n⩾0,
▪ all times 0⩽s₁<s₂<…<sₙ<s and all t⩾0,
▪ all states i₁,i₂,…,iₙ,i,j ∈ S,
it holds that

P(X(t+s)=j | X(s)=i, X(sₙ)=iₙ, …, X(s₁)=i₁) = P(X(t+s)=j | X(s)=i)

then the stochastic process is a continuous-time Markov chain

2
Q

Continuous-time Markov chains
▪ Components

A

▪ Jump chain
A discrete-time Markov chain that gives the transition probabilities between states
▪ Holding times
For each state there is an exponentially distributed holding time that controls the amount of time the process spends in that state

3
Q

Continuous-time Markov chains
▪ Construction

A

Consider
▪ a discrete-time Markov chain (jump chain)
▪ on a countable state space S
▪ having transition matrix P
▪ with pᵢᵢ=0 for every non-absorbing state i ∈ S
▪ and let (vᵢ)ᵢ∈S be a collection of positive and finite real numbers (the holding time parameters)

A continuous-time stochastic process (X(t))ₜ⩾₀ such that:

① if X(t) = i, the time until the state changes has distribution Exp(vᵢ)
② if X(t) = i, the next state will be j with probability pᵢⱼ

is a continuous-time Markov chain (i.e. it satisfies the Markov property (MP))

(Holding times relative to different states are independent)
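
To make the construction concrete, here is a minimal simulation sketch in Python; the jump matrix P, the rates v and the 3-state example are illustrative assumptions, not taken from the cards.

import numpy as np

def simulate_ctmc(P, v, i0, t_max, rng=None):
    """Simulate a CTMC path up to time t_max via the jump-chain construction.

    P  : jump-chain transition matrix (p_ii = 0 for non-absorbing states)
    v  : holding time parameters (v[i] > 0)
    i0 : initial state
    Returns the jump times and the states entered at those times.
    """
    rng = np.random.default_rng() if rng is None else rng
    times, states = [0.0], [i0]
    t, i = 0.0, i0
    while True:
        t += rng.exponential(1.0 / v[i])   # holding time ~ Exp(v_i), mean 1/v_i
        if t > t_max:
            break
        i = rng.choice(len(v), p=P[i])     # next state drawn from the jump chain
        times.append(t)
        states.append(i)
    return times, states

# Illustrative 3-state example
P = np.array([[0.0, 0.7, 0.3],
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])
v = np.array([2.0, 1.0, 0.5])
print(simulate_ctmc(P, v, i0=0, t_max=5.0))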

4
Q

Continuous-time Markov chains
▪ Transition probability functions

A

Let (X(t))ₜ⩾₀ be a continuous-time Markov chain on a countable state space S
For s,t⩾0 let
pᵢⱼ(t) = P(X(t+s)=j | X(s) = i)
be the transition probability function; by time-homogeneity it equals
pᵢⱼ(t) = P(X(t)=j | X(0) = i) for all s,t⩾0
and denotes the probability that a process presently in state i will be in state j a time t later
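
As a sanity check, the sketch below estimates p₀₁(t) by Monte Carlo for a hypothetical two-state chain (rate lam out of state 0, rate mu out of state 1) and compares it with the standard closed form p₀₁(t) = λ/(λ+μ)·(1 − e^(−(λ+μ)t)), a textbook result assumed here rather than derived in these cards.

import numpy as np

lam, mu, t = 1.5, 0.8, 2.0
rng = np.random.default_rng(0)

def state_at(t_end, i, rng):
    """State at time t_end, starting from state i at time 0."""
    clock = 0.0
    while True:
        rate = lam if i == 0 else mu
        clock += rng.exponential(1.0 / rate)  # holding time in the current state
        if clock > t_end:
            return i
        i = 1 - i                              # with two states we always swap

n = 100_000
est = sum(state_at(t, 0, rng) == 1 for _ in range(n)) / n
exact = lam / (lam + mu) * (1.0 - np.exp(-(lam + mu) * t))
print(f"Monte Carlo p01({t}) ≈ {est:.4f}, closed form = {exact:.4f}")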

5
Q

Continuous-time Markov chains
▪ Transition matrix

A

For any t⩾0 we can define the transition matrix

P(t) = (pᵢⱼ(t))ᵢ,ⱼ∈S

Properties
i) for every t⩾0, P(t) is a stochastic matrix
ii) P(0) = I (the identity matrix)
iii) for every s,t⩾0 (semigroup property),
P(t+s) = P(t)P(s) = P(s)P(t)
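
The sketch below checks the three properties numerically for the hypothetical two-state chain, using its standard closed-form transition matrix (again an illustrative assumption, not something derived in the cards).

import numpy as np

lam, mu = 1.5, 0.8

def P(t):
    r = lam + mu
    e = np.exp(-r * t)
    return np.array([[mu + lam * e, lam * (1 - e)],
                     [mu * (1 - e), lam + mu * e]]) / r

s, t = 0.7, 2.0
print(P(t).sum(axis=1))                    # i)   each row sums to 1
print(np.allclose(P(0.0), np.eye(2)))      # ii)  P(0) = I
print(np.allclose(P(s + t), P(s) @ P(t)))  # iii) P(t+s) = P(t)P(s)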

6
Q

Continuous-time Markov chains
▪ Stationary and limiting distribution

A

(As in discrete-time chains, for “nice” chains a unique stationary distribution exists and it is equal to the limiting distribution)

Let (X(t))ₜ⩾₀ be a continuous-time Markov chain on a countable state space S and with transition matrix P(t)
A probability distribution
π = (πᵢ)ᵢ∈S (row vector)
is a stationary distribution for (X(t))ₜ⩾₀ if
π = πP(t) for all t⩾0
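
For the two-state example, π = (μ, λ)/(λ+μ) is the standard stationary distribution (assumed, not derived in the cards); a minimal check that π = πP(t) for several t:

import numpy as np

lam, mu = 1.5, 0.8

def P(t):
    r = lam + mu
    e = np.exp(-r * t)
    return np.array([[mu + lam * e, lam * (1 - e)],
                     [mu * (1 - e), lam + mu * e]]) / r

pi = np.array([mu, lam]) / (lam + mu)
print(all(np.allclose(pi @ P(t), pi) for t in (0.1, 1.0, 5.0)))  # True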

7
Q

Continuous-time Markov chains
▪ Definition of limiting distribution

A

A probability distribution
π = (πᵢ)ᵢ∈S
is the limiting distribution of the continuous-time Markov chain
(X(t))ₜ⩾₀
if, for every initial state i ∈ S,

πⱼ = lim P(X(t)=j | X(0)=i) as t → ∞
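
Continuing the two-state example: both rows of P(t) approach π = (μ, λ)/(λ+μ) as t grows, regardless of the start state. A short numerical illustration:

import numpy as np

lam, mu = 1.5, 0.8

def P(t):
    r = lam + mu
    e = np.exp(-r * t)
    return np.array([[mu + lam * e, lam * (1 - e)],
                     [mu * (1 - e), lam + mu * e]]) / r

print("pi =", np.array([mu, lam]) / (lam + mu))
for t in (0.5, 2.0, 10.0):
    print(f"P({t}) =\n{P(t)}")   # rows converge to pi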

8
Q

Continuous-time Markov chains
▪ The generator matrix
Transition rate - Definition

A

Consider a continuous-time Markov chain (X(t))ₜ⩾₀
Assume X(0) = i
The chain will jump to the next state at time T₁ where:
▪ T₁ ~ Exp(vᵢ)
▪ vᵢ → holding time parameter of state i

For a very small h we have:

P(T₁⩽h) = 1 - e^(-vᵢh) = vᵢh + o(h)
(Taylor expansion)

Thus, in a short time interval of length h, the probability of leaving state i is approximately vᵢh.
For this reason vᵢ is often called the transition rate out of state i
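
A quick numeric illustration of the Taylor approximation (v = 2.0 is an arbitrary rate): the gap between 1 − e^(−vh) and vh shrinks like h², i.e. it is o(h).

import numpy as np

v = 2.0
for h in (1e-1, 1e-2, 1e-3):
    exact = 1.0 - np.exp(-v * h)   # P(T1 <= h)
    print(f"h={h:g}: exact={exact:.6f}, v*h={v * h:.6f}, gap={abs(exact - v * h):.2e}")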

9
Q

Continuous-time Markov chains
▪ The generator matrix
Transition rate - Formally

A

Formally we can write the transition rate as

vᵢ = lim P(X(h)≠i | X(0)=i) / h as h → 0

10
Q

Continuous-time Markov chains
▪ The generator matrix
Transition rate from state i to state j

A

Since the chain goes from state i to state j with probability pᵢⱼ,
we call the quantity

gᵢⱼ = vᵢ pᵢⱼ

the transition rate from state i to state j

11
Q

Continuous-time Markov chains
▪ The generator matrix
Definition

A

Let (X(t))ₜ⩾₀ be
- a continuous-time Markov chain on a countable state space S
- having embedded (jump) chain with transition matrix P
- and holding time parameters vᵢ, with i ∈ S

The generator matrix
G = (gᵢⱼ), with i,j ∈ S
has entries:

gᵢⱼ =
▪ vᵢ pᵢⱼ if i≠j
▪ -vᵢ if i=j

Remark: the rows of the generator matrix G sum to 0

The generator matrix is useful for analyzing continuous-time Markov chains (see the sketch below)
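
A minimal sketch that builds G from a jump matrix P and rates v and checks that the rows sum to 0 (the 3-state P and v are the same illustrative assumptions used earlier):

import numpy as np

def generator(P, v):
    """Generator with g_ij = v_i * p_ij for i != j and g_ii = -v_i."""
    G = v[:, None] * P          # off-diagonal entries (recall p_ii = 0)
    np.fill_diagonal(G, -v)     # diagonal entries
    return G

P = np.array([[0.0, 0.7, 0.3],
              [0.5, 0.0, 0.5],
              [1.0, 0.0, 0.0]])
v = np.array([2.0, 1.0, 0.5])
G = generator(P, v)
print(G)
print(G.sum(axis=1))  # each row sums to 0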

12
Q

Continuous-time Markov chains
▪ The generator matrix
Lemma (preliminary proposition)

A

Let (X(t))ₜ⩾₀ be
- a continuous-time Markov chain with transition matrix P(t)
- with holding time parameters vᵢ, i ∈ S
Then we have (for i≠j):
▪ lim (pᵢᵢ(h) - 1) / h = gᵢᵢ and
lim pᵢⱼ(h) / h = gᵢⱼ as h → 0
▪ Equivalently
pᵢᵢ(h) = 1 - vᵢh + o(h) and
pᵢⱼ(h) = gᵢⱼh + o(h)
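
The lemma says P(h) ≈ I + Gh for small h. The sketch below checks this numerically, using the standard fact that P(t) = e^(tG) (stated here as an assumption, since the cards do not prove it) and the illustrative 3-state generator from the previous sketch.

import numpy as np
from scipy.linalg import expm

G = np.array([[-2.0,  1.4,  0.6],
              [ 0.5, -1.0,  0.5],
              [ 0.5,  0.0, -0.5]])
for h in (1e-1, 1e-2, 1e-3):
    diff = (expm(h * G) - np.eye(3)) / h - G   # should vanish as h -> 0
    print(f"h={h:g}: max |(P(h) - I)/h - G| = {np.abs(diff).max():.2e}")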

13
Q

Continuous-time Markov chains
▪ The generator matrix
Finding the stationary distribution using the generator matrix

A

Consider a continuous-time Markov chain (X(t))ₜ⩾₀ with
- state space S
- generator matrix G
A probability distribution π on S is a stationary distribution
for (X(t))ₜ⩾₀ if and only if πG = 0
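
For a finite state space, πG = 0 together with ∑πᵢ = 1 is a linear system; a minimal least-squares sketch for the illustrative 3-state generator:

import numpy as np

G = np.array([[-2.0,  1.4,  0.6],
              [ 0.5, -1.0,  0.5],
              [ 0.5,  0.0, -0.5]])
n = G.shape[0]
A = np.vstack([G.T, np.ones(n)])           # pi @ G = 0  <=>  G.T @ pi = 0
b = np.concatenate([np.zeros(n), [1.0]])   # plus the normalization row
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)       # ≈ [0.20, 0.28, 0.52]
print(pi @ G)   # ≈ 0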

14
Q

Continuous-time Markov chains
▪ The generator matrix
Interpretation of πG = 0

A

In the long run, the rate at which transitions into state j occur must equal the rate at which transitions out of state j occur.

Indeed, writing ∑ for the sum over k≠j, for every j ∈ S we have:

πG = 0 ↔
∑ πₖ gₖⱼ - vⱼπⱼ = 0 ↔
vⱼπⱼ = ∑ πₖ gₖⱼ

Where
▪ vⱼπⱼ = rate at which the process leaves state j
as:
→ πⱼ = the long-run proportion of time the process spends in state j
→ vⱼ = the rate at which the process leaves state j
▪ ∑ πₖ gₖⱼ = rate at which the process enters state j
as:
→ πₖ = the long-run proportion of time the process spends in state k
→ gₖⱼ = the rate at which transitions from state k to state j occur
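
A quick check of the balance equations for the illustrative 3-state generator and the stationary distribution π ≈ (0.20, 0.28, 0.52) computed on the previous card:

import numpy as np

# Balance check: v_j * pi_j  ==  sum over k != j of pi_k * g_kj
G = np.array([[-2.0,  1.4,  0.6],
              [ 0.5, -1.0,  0.5],
              [ 0.5,  0.0, -0.5]])
pi = np.array([0.20, 0.28, 0.52])
v = -np.diag(G)                    # v_j = -g_jj, the rate out of state j
for j in range(3):
    rate_out = v[j] * pi[j]
    rate_in = sum(pi[k] * G[k, j] for k in range(3) if k != j)
    print(f"state {j}: rate out = {rate_out:.4f}, rate in = {rate_in:.4f}")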
