CS2 - Part 1&2 Flashcards
Explain what it means for a Markov chain to be periodic with period d
- A state in a Markov chain is periodic with period d>1 if a return to that state is possible only in a number of steps that is a multiple of d
- A Markov chain has period d if all the states in the chain have period d
- If a Markov chain is irreducible, all states have the same periodicity
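The period of a state can be computed as the gcd of all step counts at which a return is possible. A minimal sketch, using a hypothetical 3-cycle chain (the matrix and step cap are illustrative choices, not from the notes):

```python
from math import gcd

# Hypothetical 3-state cycle 0 -> 1 -> 2 -> 0: returns only in multiples of 3 steps.
P = [[0, 1, 0],
     [0, 0, 1],
     [1, 0, 0]]

def period(P, state, max_steps=30):
    """Period of `state`: gcd of all n <= max_steps with P^n[state][state] > 0."""
    n = len(P)
    d = 0
    M = [row[:] for row in P]  # M holds P^k
    for k in range(1, max_steps + 1):
        if M[state][state] > 0:
            d = gcd(d, k)
        # advance to P^(k+1)
        M = [[sum(M[i][m] * P[m][j] for m in range(n)) for j in range(n)]
             for i in range(n)]
    return d
```

Since this chain is irreducible, every state returns the same value, illustrating the last bullet above.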
Irreducible Markov chain
- A Markov chain is said to be irreducible if any state j can be reached from any other state i
Mathematical definition of the Markov property
P(Xn+1 = j | X1 = x1, ..., Xn-1 = xn-1, Xn = i) = P(Xn+1 = j | Xn = i)
i.e. the future depends on the past only through the present state
Condition for unique stationary distribution
- Finite number of states (so it has at least one stationary distribution)
- Irreducible (so it has a unique stationary distribution)
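For a finite irreducible chain, the unique stationary distribution can be approximated by repeatedly applying pi -> pi P until it stops changing. A small sketch with an arbitrary two-state matrix (the exact answer here is pi = (5/6, 1/6)):

```python
# Power iteration on a small irreducible chain: pi converges to the unique
# solution of pi = pi P with entries summing to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

pi = [0.5, 0.5]  # any starting distribution works
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
```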
Stationarity of stochastic process
If the statistical properties of a process do not vary over time, the process is stationary.
Weak stationarity
- E(Xt) and var(Xt) are constant
- cov(Xt1, Xt2) depends only on the lag t2 - t1
White noise
- White noise is a stochastic process consisting of a set of independent and identically distributed random variables
- The random variables can be either discrete or continuous, and the time set can be either discrete or continuous
- White noise processes are stationary and have the Markov property
Poisson process
A counting process Nt with N0 = 0 and stationary, independent increments, where Nt - Ns ~ Poi(lambda*(t-s)) for s < t
Chapman-Kolmogorov equations
pij(m+n) = sum over k of pik(m) * pkj(n)
(in matrix form P(m+n) = P(m)P(n); for time-inhomogeneous processes, P(s,t) = P(s,u)P(u,t) for s <= u <= t)
Survival probability
tPx = P(Tx > t) = exp( -integral from 0 to t of mu(x+s) ds )
Joint distribution of waiting time Vi and Death indicator Di
f(vi, di) = mu^di * exp(-mu*vi), where di = 1 if the life is observed to die and 0 otherwise
Maximum likelihood estimate of µ
mu-hat = (sum of di) / (sum of vi) = total deaths / total waiting time
Poisson model mortality
The number of deaths D over central exposure Ex^c is modelled as D ~ Poi(mu * Ex^c)
Maximum likelihood estimator Poisson model
mu-hat = D / Ex^c = observed deaths / central exposure
Distribution of the waiting times between consecutive events of a Poisson process Nt ~ Poi (lambda*t)
Wt~Exp(lambda)
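This connection can be demonstrated by simulation: summing Exp(lambda) waiting times until the clock passes t gives a count whose mean is close to lambda*t (the rate, horizon and replication count below are illustrative):

```python
import random

random.seed(0)
lam, t = 2.0, 5.0  # arbitrary rate and time horizon, so E(Nt) = lam * t = 10

def poisson_count(lam, t):
    """Count events in [0, t] by accumulating Exp(lam) inter-arrival times."""
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    return n

counts = [poisson_count(lam, t) for _ in range(2000)]
mean_count = sum(counts) / len(counts)  # should be close to 10
```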
Chi-squared test to test goodness of fit of probability distribution
X^2 = sum of (A-E)^2 / E
degrees of freedom: n - p
where
- n = number of categories
- p = 1 + number of parameters estimated from the data
e.g.
- Exponential distribution: p = 2 (lambda)
- Normal distribution: p = 3 (mean and standard deviation)
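A sketch of the whole procedure for an exponential fit, using stdlib only (the bin edges and sample are arbitrary illustrations): estimate lambda by MLE, compare actual to expected counts per bin, and lose one degree of freedom for the constraint plus one per estimated parameter:

```python
import math
import random

random.seed(1)
data = [random.expovariate(1.5) for _ in range(1000)]

# MLE for the exponential rate: lambda-hat = n / sum(x).
lam_hat = len(data) / sum(data)

edges = [0.0, 0.5, 1.0, 2.0, float("inf")]  # 4 categories (arbitrary bins)

def exp_cdf(x):
    return 1.0 if x == float("inf") else 1 - math.exp(-lam_hat * x)

# X^2 = sum of (A - E)^2 / E over the categories.
chi2 = 0.0
for lo, hi in zip(edges, edges[1:]):
    A = sum(lo <= x < hi for x in data)          # actual count in bin
    E = len(data) * (exp_cdf(hi) - exp_cdf(lo))  # expected count in bin
    chi2 += (A - E) ** 2 / E

dof = 4 - (1 + 1)  # n categories minus p = 1 + parameters estimated
```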
Occupancy probability for Markov jump processes (time-homogeneous and -inhomogeneous)
- Time-homogeneous: P(stay in state i throughout [0, t] | in i at time 0) = exp(-lambda_i * t), where lambda_i is the total transition rate out of state i
- Time-inhomogeneous: exp( -integral from s to s+t of lambda_i(u) du )
Integrated form of the Kolmogorov backward equation (time-inhomogeneous)
check Chapter 5 Section 7 for explanation
Integrated form of the Kolmogorov forward equation (time-inhomogeneous)
check Chapter 5 Section 8 for explanation
Kolmogorov equations (forward and backward)
time-inhomogeneous:
- forward: d/dt P(s,t) = P(s,t) A(t)
- backward: d/ds P(s,t) = -A(s) P(s,t)
where A(t) is the generator matrix of transition rates at time t
Number of possible triplets in a Markov jump process
For each state X, the number of possible triplets with X as second state is
number of ways into X * number of ways out of X
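The count above can be checked by hand on a small graph. A sketch using a hypothetical healthy/sick/dead model (the transition structure is an assumed example):

```python
# Hypothetical jump process: H <-> S, and both H and S can jump to D.
transitions = {
    "H": ["S", "D"],
    "S": ["H", "D"],
    "D": [],
}

def triplets_through(state):
    """Triplets (i, state, j): ways into `state` times ways out of it."""
    ways_in = sum(state in outs for outs in transitions.values())
    ways_out = len(transitions[state])
    return ways_in * ways_out
```

For example, H can be entered only from S and left to S or D, giving 1 * 2 = 2 triplets: (S, H, S) and (S, H, D).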
Fx(t) as a function of F(t) and S(t)
(F(x+t) - F(x)) / S(x)
Sx(t) as a function of F(t) and S(t)
Sx(t) = S(x+t) / S(x)
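A quick numerical check of this identity with an exponential lifetime, S(x) = exp(-mu*x): the conditional survival probability then reduces to exp(-mu*t) for every age x, reflecting the memoryless property (the value of mu is an arbitrary illustration):

```python
import math

mu = 0.05  # constant force of mortality (illustrative value)

def S(x):
    """Survival function from birth: P(T0 > x) = exp(-mu * x)."""
    return math.exp(-mu * x)

def S_x(x, t):
    """Sx(t) = S(x+t) / S(x): probability of surviving t more years from age x."""
    return S(x + t) / S(x)

# With constant mu, S_x(x, t) = exp(-mu * t) regardless of x.
```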
Central rate of mortality
mx = qx / (integral from 0 to 1 of tPx dt)
i.e. deaths divided by person-years of exposure between ages x and x+1
Expected future lifetime after age x
- Complete: e°x = E(Tx) = integral from 0 to infinity of tPx dt
- Curtate: ex = E(Kx) = sum from k=1 to infinity of kPx