Discrete-time Markov chains Flashcards
1
Q
Markov chain
▪ Definition
A
A Markov chain is
▪ a stochastic process
▪ The past and the future are conditionally independent given the present
(the Markov property)
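The Markov property can be written as a formula. For a chain (Xₙ) with state space S, conditioning on the whole history reduces to conditioning on the current state:

```latex
P(X_{n+1}=j \mid X_n=i,\, X_{n-1}=i_{n-1},\,\dots,\, X_0=i_0)
  = P(X_{n+1}=j \mid X_n=i)
```

for all n ∈ ℕ and all states j, i, i_{n-1}, …, i_0 ∈ S for which the conditioning event has positive probability.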
2
Q
Markov chain
▪ Stochastic process
A
A sequence (Xₙ)ₙ∈ℕ of random variables with values in a set S is called a discrete-time stochastic process with state space S
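A minimal sketch of these two definitions in Python: the simulated path is a discrete-time stochastic process, and because each step depends only on the current state (via the transition matrix), it satisfies the Markov property. The two-state "weather" chain and its transition probabilities are hypothetical, chosen only for illustration.

```python
import random

def simulate_chain(P, states, start, n_steps, rng):
    """Simulate n_steps of a discrete-time Markov chain.

    P[i][j] is the probability of moving from states[i] to states[j];
    the next state is drawn using only the current state (Markov property).
    """
    x = start
    path = [x]
    for _ in range(n_steps):
        i = states.index(x)
        x = rng.choices(states, weights=P[i])[0]
        path.append(x)
    return path

# Hypothetical two-state chain: state space S = {"sunny", "rainy"}
P = [[0.9, 0.1],   # from sunny: stay sunny 0.9, turn rainy 0.1
     [0.5, 0.5]]   # from rainy: 50/50
states = ["sunny", "rainy"]
rng = random.Random(0)  # fixed seed for reproducibility
path = simulate_chain(P, states, "sunny", 10, rng)
print(path)
```

Each row of P sums to 1, mirroring the requirement that the transition probabilities out of every state form a probability distribution on S.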