CHAPTER 1: Intro Flashcards

1
Q

Markov property

A

Given the present, the future is independent of the past.
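
In symbols, a minimal restatement of the property (the explicit formula and the time index k are not part of the original card):

\[
  P(Y_k = y \mid Y_{k-1}, Y_{k-2}, \dots, Y_1) = P(Y_k = y \mid Y_{k-1})
\]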

2
Q

Discrete random variables Y_1, Y_2, ..., Y_n form a discrete Markov chain if

A

P_{y_{k-1}, y_k} = P(Y_k = y_k | Y_{k-1} = y_{k-1})

the (y_{k-1}, y_k)-th element of the transition probability matrix (the one-step transition probability)

  • Y_{k-1}, Y_k are RVs taking values in the state space S: the states of the process at times k-1 and k
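
As a concrete illustration (not part of the original card), here is a minimal Python sketch of simulating a discrete Markov chain from a one-step transition probability matrix; the state space, matrix entries, and chain length are made-up example values.

import numpy as np

rng = np.random.default_rng(seed=0)

# One-step transition probability matrix: row i gives P(Y_k = j | Y_{k-1} = i).
# Example values only; each row must sum to 1.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])

def simulate_chain(P, y0, n):
    """Simulate n steps of a discrete Markov chain started in state y0."""
    states = [y0]
    for _ in range(n - 1):
        current = states[-1]
        # The next state is drawn using only the current state (Markov property).
        states.append(rng.choice(len(P), p=P[current]))
    return np.array(states)

y = simulate_chain(P, y0=0, n=20)
print(y)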
3
Q

We can use observations of Y_1, ..., Y_n to learn about a discrete-time MC with 2 states and unknown transition matrix
[ θ_1    1-θ_1 ]
[ 1-θ_2  θ_2   ]

A

and unknown parameters bold(θ) = (θ_1, θ_2), where:

θ_i is the probability of staying in state i at the next time step, given the current state is i
1-θ_2 is the probability of moving to state 1, given the current state is 2
1-θ_1 is the probability of moving to state 2, given the current state is 1
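
As an aside (not from the original deck), a small Python sketch, assuming NumPy, of how this 2-state transition matrix is built from bold(θ) = (θ_1, θ_2); the numerical parameter values are arbitrary placeholders.

import numpy as np

def transition_matrix(theta1, theta2):
    # Row 1 (current state 1): [theta1, 1 - theta1]
    # Row 2 (current state 2): [1 - theta2, theta2]
    return np.array([[theta1, 1.0 - theta1],
                     [1.0 - theta2, theta2]])

P = transition_matrix(theta1=0.8, theta2=0.6)  # placeholder parameter values
print(P)              # rows index the current state, columns the next state
print(P.sum(axis=1))  # each row sums to 1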

4
Q

LIKELIHOOD

LOG LIKELIHOOD

A

L(bold(θ); y_1, ..., y_n)
= P(Y_1 = y_1, ..., Y_n = y_n | bold(θ))

the probability of the particular observed sequence, viewed as a function of bold(θ), used to assess how likely different parameter values are

log-likelihood:

l = log L
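
For the 2-state chain of the previous card, a hedged Python sketch of evaluating this log-likelihood at a given bold(θ), assuming (a common convention, not stated on the card) that we condition on the first observation rather than modelling its distribution; the observed sequence is a made-up example with states coded 0 and 1.

import numpy as np

def log_likelihood(theta, y):
    # l(theta) = sum over k of log P(Y_k = y_k | Y_{k-1} = y_{k-1}, theta),
    # conditioning on the first observation. States are coded 0 and 1.
    theta1, theta2 = theta
    P = np.array([[theta1, 1.0 - theta1],
                  [1.0 - theta2, theta2]])
    return sum(np.log(P[y[k - 1], y[k]]) for k in range(1, len(y)))

y = [0, 0, 1, 1, 1, 0, 0, 1]          # placeholder observed sequence
print(log_likelihood((0.7, 0.6), y))  # compare how likely different theta values make the data
print(log_likelihood((0.5, 0.5), y))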
