05 Decision and Cost-Effectiveness Analysis Flashcards

1
Q

What does a decision tree do?

A

Characterizes a one-time transition from one health state to another: events play out once along the branches, with no explicit time cycles

2
Q

What does a Markov model do?

A

Characterizes transitions among multiple health states over repeated cycles of time

3
Q

What are the Markov Modeling steps?

A

1. Delineate the set of possible health states.
2. Determine which transitions can occur during a cycle.
3. Assign a probability to each transition.
4. Assign an incremental utility to each state.

(See the sketch below.)
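
A minimal sketch of those four steps in Python; the three states echo the well/sick/dead example from the P-matrix cards, but every probability below is an illustrative assumption:

```python
# Hypothetical three-state model: well / sick / dead.
states = ["well", "sick", "dead"]            # Step 1: delineate health states

# Steps 2-3: transitions allowed during one cycle, with assumed probabilities.
# Row = state at the start of the cycle, column = state at the end;
# each row sums to 1.
P = [
    [0.80, 0.15, 0.05],   # from "well"
    [0.00, 0.70, 0.30],   # from "sick" (no recovery in this sketch)
    [0.00, 0.00, 1.00],   # from "dead": the absorbing state
]

# Step 4: incremental utility earned per cycle spent in each state.
utility = {"well": 1.0, "sick": 0.60, "dead": 0.0}
```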

4
Q

What is the terminology for the Markov Model?

A

Moving from one state to another is a “state transition”. A state that cannot be left once entered (e.g. dead) is the “absorbing state”

5
Q

How are calculations done in a P-Matrix?

A

Assume the incremental utility of “well” = 1, “sick” = 0.60, and “dead” = 0. Multiplying the incremental utility of each state by the fraction of the cohort occupying that state, then summing across all states, yields the total incremental utility generated by the cohort in a given cycle. Cycle totals are then summed over the time horizon to determine the cumulative utility of a treatment option (worked example below)
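
A worked version of that calculation: the utilities are the ones quoted above, while the transition matrix and the ten-cycle horizon are assumptions for illustration.

```python
utility = [1.0, 0.60, 0.0]        # well, sick, dead (from the card)
cohort  = [1.0, 0.0, 0.0]         # fraction of cohort in each state at start
P = [[0.80, 0.15, 0.05],          # assumed transition matrix, rows sum to 1
     [0.00, 0.70, 0.30],
     [0.00, 0.00, 1.00]]

cumulative = 0.0
for cycle in range(10):           # 10 cycles, e.g. 10 one-year cycles
    # Cycle total = sum over states of (fraction in state * state utility).
    cycle_utility = sum(f * u for f, u in zip(cohort, utility))
    cumulative += cycle_utility
    # Advance the cohort one cycle: the new fraction in state j is the
    # probability-weighted inflow from every state i.
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]

print(f"cumulative incremental utility over 10 cycles: {cumulative:.2f}")
```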

6
Q

What is a P-Matrix Table?

A

Transition probabilities sometimes change over time (e.g. the risk of arthritis increases with age). Most Markov software therefore stores one P-matrix per cycle in a database table, which allows the probabilities to vary from cycle to cycle (see the sketch below)
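
A sketch of the table idea, assuming a hypothetical risk of sickness that rises with each cycle; one matrix is generated (or looked up) per cycle:

```python
# Hypothetical P-matrix table: the probability of getting sick rises with
# each cycle (e.g. arthritis risk increasing with age). All numbers assumed.
def p_matrix(cycle):
    p_sick = min(0.10 + 0.02 * cycle, 0.50)   # assumed age-dependent risk
    return [[1.0 - p_sick - 0.05, p_sick, 0.05],
            [0.00, 0.70, 0.30],
            [0.00, 0.00, 1.00]]

cohort = [1.0, 0.0, 0.0]
for cycle in range(20):
    P = p_matrix(cycle)                        # look up this cycle's matrix
    cohort = [sum(cohort[i] * P[i][j] for i in range(3)) for j in range(3)]
```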

7
Q

How do you compute transition probabilities?

A

Transition probabilities express the risk of making a transition within one specified cycle length. It is not appropriate to divide published risk data by the number of cycles (a 20% ten-year risk is not a 2% one-year risk): risks compound rather than add, so simple division misstates the per-cycle risk and yields an unrealistic model. Convert the risk to a rate and rescale it to the cycle length instead (see below)
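
The standard correction converts the probability to a rate and rescales it to the cycle length; this assumes a constant underlying rate over the published horizon:

```python
import math

def per_cycle_prob(p_total, horizon, cycle_length=1.0):
    """Convert a risk over `horizon` time units into a per-cycle
    transition probability, assuming a constant underlying rate."""
    rate = -math.log(1.0 - p_total) / horizon    # probability -> rate
    return 1.0 - math.exp(-rate * cycle_length)  # rate -> probability

# A 20% ten-year risk is NOT a 2% one-year risk:
print(per_cycle_prob(0.20, 10))   # ~0.0221, not 0.02
```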

8
Q

What is the Markov Theorem?

A

Theorem: Let p be the current probability vector and M the transition matrix of a Markov chain. After n steps of the process, where n ≥ 1, the new probability vector is p_n = (M^n) p (computed below)
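
The same computation with numpy. Note that to match p_n = (M^n) p the matrix must be column-stochastic (entry M[i, j] is the probability of moving to state i from state j), the transpose of the row convention used in the sketches above; the numbers are illustrative:

```python
import numpy as np

# Column convention: M[i, j] = probability of moving TO state i FROM
# state j, so each column sums to 1 (assumed numbers, states well/sick/dead).
M = np.array([[0.80, 0.00, 0.00],
              [0.15, 0.70, 0.00],
              [0.05, 0.30, 1.00]])
p = np.array([1.0, 0.0, 0.0])     # current probability vector: all "well"

n = 10
p_n = np.linalg.matrix_power(M, n) @ p
print(p_n)                         # distribution across states after n steps
```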

9
Q

What is the Markov Assumption?

A

Everyone in a cohort who has reached a particular health state has the same prognosis, regardless of how or when they got there (the process is “memoryless”)
