14. Probability Theory 2 Flashcards

1
Q

What is probabilistic inference?

A

The process of computing the probability of a query given known probabilities and evidence.

2
Q

How can probabilities be updated?

A

By incorporating new evidence using conditional probabilities and Bayes’ Rule.
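
A minimal sketch of such an update (posterior proportional to likelihood times prior); the hypotheses and numbers below are invented for illustration:

```python
# Made-up prior over two hypotheses and likelihood of the observed evidence.
prior = {'H1': 0.5, 'H2': 0.5}
likelihood = {'H1': 0.8, 'H2': 0.2}   # P(evidence | hypothesis)

# Bayes' Rule: posterior is proportional to likelihood * prior, then normalize.
unnormalized = {h: likelihood[h] * prior[h] for h in prior}
z = sum(unnormalized.values())
posterior = {h: p / z for h, p in unnormalized.items()}
print(posterior)  # ≈ {'H1': 0.8, 'H2': 0.2}
```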

3
Q

What is a joint probability query?

A

A query that asks for the probability of multiple variables occurring together (e.g., P(A,B)).

4
Q

What are the three types of variables in probabilistic inference?

A

Query variables, evidence variables, and hidden variables.

5
Q

What is inference by enumeration?

A

A method for computing probabilities by summing over all possible values of the hidden variables: P(Q, E) = \sum_H P(Q, H, E).
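
A rough Python sketch of enumeration over a full joint table; the distribution and variable values below are invented, with one query variable Q, one hidden variable H, and one evidence variable E:

```python
# Hypothetical joint distribution P(Q, H, E) over three boolean variables,
# stored as a dict keyed by (q, h, e) value tuples; numbers are invented.
joint = {
    (True,  True,  True):  0.10, (True,  True,  False): 0.05,
    (True,  False, True):  0.20, (True,  False, False): 0.05,
    (False, True,  True):  0.15, (False, True,  False): 0.10,
    (False, False, True):  0.05, (False, False, False): 0.30,
}

def p_query_evidence(q, e):
    """P(Q=q, E=e) = sum over all h of P(Q=q, H=h, E=e)."""
    return sum(p for (qv, hv, ev), p in joint.items() if qv == q and ev == e)

print(p_query_evidence(True, True))  # 0.10 + 0.20 ≈ 0.30
```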

6
Q

What is the worst-case complexity of inference by enumeration?

A

Time complexity: O(d^n), space complexity: O(d^n), where d is the domain size and n is the number of variables.

7
Q

What is the chain rule of probability?

A

A formula that expresses a joint probability as a product of conditional probabilities: P(X_1, X_2, …, X_n) = P(X_1) P(X_2 | X_1) P(X_3 | X_1, X_2) … P(X_n | X_1, X_2, …, X_{n-1}).
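
A quick worked sketch with three binary events; the numbers are invented for illustration:

```python
# Illustrative (made-up) probabilities for three events A, B, C.
p_a = 0.3           # P(A)
p_b_given_a = 0.6   # P(B | A)
p_c_given_ab = 0.5  # P(C | A, B)

# Chain rule: P(A, B, C) = P(A) * P(B | A) * P(C | A, B)
p_abc = p_a * p_b_given_a * p_c_given_ab
print(p_abc)  # ≈ 0.09
```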

8
Q

What is a conditional probability query?

A

A query that asks for the probability of an event given some evidence (e.g., P(A | B)).

9
Q

How is a conditional probability computed?

A

By normalizing the joint probability: P(Q | E) = \frac{P(Q, E)}{P(E)}
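
A minimal sketch of this normalization over a small joint table; the numbers are invented for illustration:

```python
# Hypothetical joint P(Q, E) as a dict keyed by (q, e); numbers are invented.
joint = {
    (True,  True): 0.20, (True,  False): 0.10,
    (False, True): 0.30, (False, False): 0.40,
}

def conditional(q, e):
    """P(Q=q | E=e) = P(Q=q, E=e) / P(E=e)."""
    p_q_and_e = joint[(q, e)]
    p_e = sum(p for (_, ev), p in joint.items() if ev == e)
    return p_q_and_e / p_e

print(conditional(True, True))  # 0.20 / 0.50 ≈ 0.4
```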

10
Q

What is marginalization?

A

The process of summing over hidden variables to obtain the probability of a subset of variables. P(X) = \sum_Y P(X, Y)
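
For instance, summing Y out of a small joint table over X and Y; the values are invented for illustration:

```python
# Hypothetical joint P(X, Y) keyed by (x, y); numbers are invented.
joint = {
    ('sun',  'hot'): 0.4, ('sun',  'cold'): 0.2,
    ('rain', 'hot'): 0.1, ('rain', 'cold'): 0.3,
}

# Marginalization: P(X) = sum over y of P(X, Y=y)
p_x = {}
for (x, y), p in joint.items():
    p_x[x] = p_x.get(x, 0.0) + p

print(p_x)  # ≈ {'sun': 0.6, 'rain': 0.4}
```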

11
Q

What are sensitivity and specificity in probabilistic inference?

A

Sensitivity is the probability that an actual positive is correctly identified, i.e. P(effect | cause); specificity is the probability that an actual negative is correctly identified, i.e. P(¬effect | ¬cause). By the law of total probability, P(effect) = sensitivity · P(cause) + (1 − specificity) · (1 − P(cause)).
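
A quick numeric check of this total-probability formula, with made-up prevalence and test characteristics:

```python
# Made-up numbers: rare cause, reasonably accurate test.
p_cause = 0.01
sensitivity = 0.90   # P(effect | cause)
specificity = 0.95   # P(no effect | no cause)

# Law of total probability:
# P(effect) = P(effect | cause) P(cause) + P(effect | ¬cause) P(¬cause)
p_effect = sensitivity * p_cause + (1 - specificity) * (1 - p_cause)
print(p_effect)  # 0.009 + 0.0495 ≈ 0.0585
```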

12
Q

How does Bayes’ Rule help in medical diagnosis?

A

It allows calculation of the probability of a disease given a positive test result by considering both false positives and false negatives.
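
A small worked sketch with made-up numbers (rare disease, imperfect test), using the same total-probability denominator as in the previous card:

```python
# Made-up example: rare disease, imperfect test.
p_disease = 0.01     # prior P(Disease)
sensitivity = 0.90   # P(Positive | Disease)
specificity = 0.95   # P(Negative | No Disease)

# Bayes' Rule:
# P(Disease | Positive) = P(Positive | Disease) P(Disease) / P(Positive)
p_positive = sensitivity * p_disease + (1 - specificity) * (1 - p_disease)
p_disease_given_positive = sensitivity * p_disease / p_positive
print(round(p_disease_given_positive, 3))  # ≈ 0.154
```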

13
Q

Why is causal knowledge more reliable than diagnostic knowledge?

A

Causal probabilities (e.g., P(Symptom | Disease)) remain stable, while diagnostic probabilities (e.g., P(Disease | Symptom)) may change with external factors such as the prevalence of the disease.
