CHAPTER 5: Continuous time Markov chain models Flashcards
A continuous time stochastic process
A continuous time stochastic process is a set of random variables X_t for real numbers t; often we assume time starts at 0, so that t ∈ R+
given known state at time 0
A continuous time stochastic process is a Markov chain
A continuous time stochastic process is a Markov chain if it has discrete states
and satisfies the condition: for
t_n > t_{n−1} > … > t_1 ≥ 0,
P(X_{t_n} = i_n | X_{t_{n−1}} = i_{n−1}, …, X_{t_1} = i_1) = P(X_{t_n} = i_n | X_{t_{n−1}} = i_{n−1})
for each n ≥ 2
- maintains its current value for a random length of time and then jumps to another value
TIME HOMOGENEOUS
P(X_{t+τ} = j | X_t = i) = p_{ij}(τ)
not depending on t
For a continuous-time chain the Chapman-Kolmogorov equations are,
p_{ij}(t + τ) = Σ_k p_{ik}(t) p_{kj}(τ)
which may be written in matrix terms as
P(t + τ ) = P(t)P(τ )
where P(t) = (p_{ij}(t))
For a continuous-time chain :
matrix terms of chapman kolmogorov
P(t + τ ) = P(t)P(τ )
Extending Chapman-Kolmogorov: we can apply the equations repeatedly, i.e. we only need to know how the transition matrix behaves over very small time intervals.
Because of this, if we knew the value of P(t) for all t in a short
interval [0, δt], we could find it for any larger t too, because the C-K equations allow successive extensions to [0, 2δt], [0, 4δt], etc.
we should be able to parametrize the chain in terms of the transition
probabilities over arbitrarily short times. As δt → 0 it is reasonable to insist that
P(δt) → I
(the unit matrix)
we focus on how pij → 0 (i ≠ j) and pii → 1.
*the zero-step transition matrix should be the identity: over a zero-length interval, every state stays at itself with probability 1
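The successive-extension idea can be sketched numerically. This is a minimal illustration, assuming an invented two-state transition matrix over a short interval dt; the entries are made up for the example.

```python
import numpy as np

# Suppose we only know the transition matrix over a short interval dt
# (these entries are invented for illustration).
dt = 1.0 / 1024
P_small = np.array([[1.0 - 1.0 * dt, 1.0 * dt],
                    [2.0 * dt, 1.0 - 2.0 * dt]])

# Chapman-Kolmogorov: P(2t) = P(t) P(t), so repeated squaring extends
# [0, dt] -> [0, 2dt] -> [0, 4dt] -> ... -> [0, 1024*dt] = [0, 1].
P = P_small.copy()
for _ in range(10):
    P = P @ P

# The result is still a stochastic matrix: each row sums to 1.
print(np.allclose(P.sum(axis=1), 1.0))   # True
```

Knowing P on [0, δt] thus determines P(t) for all larger t, as the card states.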
Matrices for a continuous-time chain:
in discrete time we could describe the chain by nth powers of the one-step transition matrix, but here there is no one-step transition matrix to use;
we can only apply the Chapman-Kolmogorov equations
Suppose that for
δt very small the probability of a transition from i to j is
p_{ij}(δt) = g_{ij}δt + o(δt)
i ≠ j,
(infinitesimal transition scheme)
i.e. for a small time step, if currently in state i, the transition probability tends to 0 linearly as the interval length tends to 0:
a constant (the derivative) times δt,
plus an o(δt) term that goes to 0 faster than δt does
Suppose that for
δt very small
p_{ii}(δt) = 1 + g_{ii}δt + o(δt)
(infinitesimal transition scheme)
the transition probability from i to itself:
a linear term plus a term going to 0 faster than δt
(g_{ii} will be negative, so that the probability tends to 1)
where the gij are constants.
DEF: generator matrix of the continuous Markov chain
The matrix G with elements g_{ij} is the (infinitesimal) generator of the Markov chain,
such that p_{ij}(δt) = g_{ij}δt + o(δt) for i ≠ j, and p_{ii}(δt) = 1 + g_{ii}δt + o(δt)
(so P(δt) → I as δt → 0);
the off-diagonal elements give the rates of transitions to other states, and the diagonal elements the rate of leaving
(When a continuous time Markov chain is used for modelling, it is usually not the transition probabilities pij (t) that are specified, but the generator.)
Properties of generator matrix of the continuous Markov chain
Since Σ_j p_{ij}(δt) = 1 (rows of a transition matrix sum to 1), we expect
p_{ii}(δt) = 1 − Σ_{j≠i} [g_{ij}δt + o(δt)]
= 1 − Σ_{j≠i} g_{ij} δt + o(δt),
so g_{ii} = −Σ_{j≠i} g_{ij}
(valid if the sum is finite; it is possible to find chains which break this)
This will certainly hold for chains with a finite state space. In general it is
not guaranteed. However, most chains used in modelling satisfy it. They are called
conservative chains
A natural way to think about gij for j ≠ i is as the instantaneous rate at which, given
that the chain is in state i, a transition to state j is expected to occur.
Conservative chains
Satisfy g_{ii} = −Σ_{j≠i} g_{ij}
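A quick numerical check of the conservative property, using a made-up 3-state generator (the rates here are illustrative only):

```python
import numpy as np

# Hypothetical 3-state generator; the rates are invented for illustration.
G = np.array([[-3.0, 2.0, 1.0],
              [0.5, -0.5, 0.0],
              [1.0, 4.0, -5.0]])

# Conservative: each diagonal entry is minus the sum of the off-diagonal
# entries in its row, i.e. every row of G sums to zero.
print(np.allclose(G.sum(axis=1), 0.0))   # True

# Off-diagonal entries are non-negative, diagonal entries non-positive.
off_diag = G - np.diag(np.diag(G))
print((off_diag >= 0).all() and (np.diag(G) <= 0).all())   # True
```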
EXAMPLE 22 A two state chain (continuous)
Let S = {1, 2} and set
G =
[ −α    α ]
[  β   −β ]
α, β > 0
(no constraint that entries be less than one, but the off-diagonal entries must be non-negative and the diagonal entries negative)
G is the generator matrix for a continuous time Markov chain with two
states. (As previously, we could think of this as a weather model, with the states being “wet”
and “dry”.)
i.e. it is raining now: will it continue to rain for some time?
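For this two-state chain the transition probabilities have a closed form, which we can sanity-check numerically. This is a sketch with illustrative values α = 1, β = 2 (not from the notes); the closed form follows from the matrix exponential introduced at the end of the chapter.

```python
import numpy as np

# Illustrative rates for the two-state "wet"/"dry" chain of Example 22.
alpha, beta = 1.0, 2.0

def P(t):
    """Closed-form P(t) = exp(Gt) for G = [[-alpha, alpha], [beta, -beta]]."""
    e = np.exp(-(alpha + beta) * t)
    return np.array([[beta + alpha * e, alpha - alpha * e],
                     [beta - beta * e, alpha + beta * e]]) / (alpha + beta)

print(np.allclose(P(0.0), np.eye(2)))        # P(0) = I
print(np.allclose(P(3.0), P(1.0) @ P(2.0)))  # Chapman-Kolmogorov holds
print(np.round(P(50.0), 4))                  # rows approach (beta, alpha)/(alpha+beta)
```

For large t both rows converge to the same distribution (β, α)/(α + β), regardless of the starting state.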
EXAMPLE 23 The Poisson process (continuous)
Consider N_t, # events in (0,t] in a Poisson process with rate λ.
P(N_{t+δt} = n + 1 | N_t = n) = λδt + o(δt)
and that
P(N_{t+δt} > n + 1 | N_t = n) = o(δt)
(rows of the generator sum to 0: λ − λ = 0)
we can consider a Poisson process as a continuous time Markov chain on N_0 with generator given by g_{n,n+1} = λ, g_{n,n} = −λ and other entries zero
The chain is not irreducible: values can only increase, so it is not possible to get from state i to state j if j < i.
EXAMPLE 24 Linear birth process
Consider a population where each member reproduces at rate λ:
each member has probability of a reproduction event between times t and t + δt equal to
λδt + o(δt)
(no other ways pop can change)
Then, letting N_t be the size of the population at time t,
P(N_{t+δt} = i + 1 | N_t = i) = λi δt + o(δt) and
P(N_{t+δt} = i | N_t = i) = 1 − λi δt + o(δt),
with P(N_{t+δt} = j | N_t = i) being o(δt) if j is neither i nor i + 1
continuous Markov chain: g_{ij} = λi if j = i + 1; −λi if j = i; 0 otherwise
(not irreducible, as the chain can't leave state 0)
poisson processes
are not irreducible
EXAMPLE 25: linear birth-death process
Each population member reproduces at rate λ and dies at rate µ.
Extend Example 24: allow deaths. Each member reproduces at rate λ:
P(N_{t+δt} = i + 1 | N_t = i) = λi δt + o(δt),
and dies at rate µ, giving:
P(N_{t+δt} = i − 1 | N_t = i) = µi δt + o(δt),
P(N_{t+δt} = i | N_t = i) = 1 − (λ + µ)i δt + o(δt)
(still not irreducible as cant leave state 0)
continuous MC: g_{ij} = λi if j = i + 1; −(λ + µ)i if j = i; µi if j = i − 1; 0 otherwise.
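The birth-death generator can be built explicitly once the state space is truncated at some maximum N. This is a sketch: λ, µ, and N are invented values, and the truncation removes the birth transition from state N.

```python
import numpy as np

# Illustrative parameters for a truncated linear birth-death chain.
lam, mu, N = 0.3, 0.2, 6

def bd_generator(lam, mu, N):
    """Generator for the linear birth-death process on states 0..N."""
    G = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i + 1 <= N:
            G[i, i + 1] = lam * i    # birth: i -> i+1 at total rate lam*i
        if i - 1 >= 0:
            G[i, i - 1] = mu * i     # death: i -> i-1 at total rate mu*i
        G[i, i] = -G[i].sum()        # conservative diagonal
    return G

G = bd_generator(lam, mu, N)
print(np.allclose(G.sum(axis=1), 0.0))   # True: rows sum to zero
print(G[0].tolist())                     # state 0 is absorbing: all zeros
```

For interior states the diagonal is −(λ + µ)i, matching the card; row 0 is all zeros because both rates vanish when the population is extinct.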
EXAMPLE 26: M/M/n queue
Imagine that we have a queue with n servers, and assume that:
• Arrivals of customers occur as a Poisson process of rate λ.
• There are n servers, who can each serve one customer at a time, and will automatically
move on to the next customer in the queue when their current customer leaves.
• Each customer currently being served completes their service at rate µ, so that the probability they leave in (t, t + δt] is µδt + o(δt).
Let N_t= number in the queue at time t.
*If i ≤ n, then in (t, t + δt] we may have an
arrival, which is an event of a Poisson process and so has probability λδt + o(δt); or we may have a
service completion, which happens with probability µiδt + o(δt) as there are i customers being served; or we may have more than one event, with probability o(δt)
g_{i,i+1} = λ, g_{i,i−1} = µi, g_{ii} = −(λ + µi).
*If i > n, then g_{i,i+1} = λ still, but only n customers are being served, so
g_{i,i−1} = µn
and
g_{ii} = −(λ + µn)
This chain is irreducible.
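A sketch of the M/M/n generator, truncated at a maximum queue length N (λ, µ, n, and N are invented values): the key point is that the service rate is µ·min(i, n), since at most n customers can be in service.

```python
import numpy as np

# Illustrative M/M/n queue parameters: arrival rate, service rate,
# number of servers, and truncation level for the state space.
lam, mu, n_servers, N = 2.0, 1.0, 3, 10

def mmn_generator(lam, mu, n, N):
    """Generator for the M/M/n queue on states 0..N."""
    G = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        if i < N:
            G[i, i + 1] = lam              # arrival
        if i > 0:
            G[i, i - 1] = mu * min(i, n)   # at most n customers in service
        G[i, i] = -G[i].sum()              # conservative diagonal
    return G

G = mmn_generator(lam, mu, n_servers, N)
# Service rate is mu*i below n servers, capped at mu*n above.
print(G[2, 1], G[5, 4])   # 2.0 3.0
```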
EXAMPLE 27: Generalized birth-death process
prev examples generalized
This has transitions:
i → i + 1 with probability λ_i δt + o(δt);
i → i with probability 1 − (λ_i + µ_i) δt + o(δt);
i → i − 1 with probability µ_i δt + o(δt),
corresponding to g_{ij} = λ_i if j = i + 1; −(λ_i + µ_i) if j = i; µ_i if j = i − 1; 0 otherwise
continuous MC and transition probabilities
obtained from G
writing the generator definitions in matrix notation:
P(δt) − I = Gδt + o(δt) as δt → 0
(the o(δt) term goes to 0 faster than δt)
i.e.
lim_{δt→0} [(P(δt) − I)/δt] = G
(one way of thinking about G is that it is the derivative of the t-step transition matrix at time 0)
G is an |S| × |S| matrix (infinite when the state space is infinite)
Considering the chain from its starting state at time 0 to its state at time t + δt, the Chapman-Kolmogorov equations express
transitions over the period t + δt in terms of those between 0 and t and those in the final short time δt:
P(t + δt) = P(t) P(δt)
Then
[P(t + δt) − P(t)]/δt = P(t)[P(δt) − I]/δt
Let δt → 0; the limit of the left-hand side is the derivative P′(t) of P at time t,
i.e. P′(t) = P(t) G,
known as the forward differential equations.
The backward differential equations come from splitting paths over [0, t + δt] into a short (δt) section followed by a long (t) section,
rather than vice versa:
P′(t) = G P(t).
note: generator matrix
*For all continuous chains with finite state space, the rows of the generator sum to 0 (and the only chains we will see in this course are ones for which this is true)
the forward differential equations.
P’(t) = P(t) G,
backward differential equations
P’(t) = G P(t).
Solutions of forwards and backward diff eqs
for finite chains:
the equations are matrix generalizations of the simple differential equation
y′(t) = g y(t) with constant g,
whose solution is y(t) = y(0)e^{gt}
With P(0) = I, the solution of both the forward and backward equations is
P(t) = exp(Gt),
where, for a matrix G, the exponential function is defined as
exp(Gt) = Σ_{i=0}^{∞} (Gt)^i / i!