16. Bayesian Networks 2 Flashcards
What is the purpose of the variable elimination algorithm in Bayesian networks?
The variable elimination algorithm reduces the overall number of calculations by caching intermediate results, making exact inference more efficient.
What is the pointwise product of two factors f and g in Bayesian networks?
The pointwise product yields a new factor h whose variables are the union of f and g, with elements given by the product of corresponding elements in f and g.
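A minimal sketch of the pointwise product, assuming a hypothetical factor representation (a list of variable names plus a table mapping assignment tuples to values); the factor contents are made-up numbers for illustration only.

```python
from itertools import product

def pointwise_product(f_vars, f_table, g_vars, g_table, domains):
    """Multiply factors f and g: the result ranges over the union of their
    variables, and each entry is the product of the matching f and g entries."""
    h_vars = list(dict.fromkeys(f_vars + g_vars))  # union of variables, order-preserving
    h_table = {}
    for assignment in product(*(domains[v] for v in h_vars)):
        row = dict(zip(h_vars, assignment))
        f_key = tuple(row[v] for v in f_vars)
        g_key = tuple(row[v] for v in g_vars)
        h_table[assignment] = f_table[f_key] * g_table[g_key]
    return h_vars, h_table

# Hypothetical factors f(A, B) and g(B, C) over Boolean variables
domains = {"A": [True, False], "B": [True, False], "C": [True, False]}
f = {(True, True): 0.3, (True, False): 0.7, (False, True): 0.9, (False, False): 0.1}
g = {(True, True): 0.2, (True, False): 0.8, (False, True): 0.6, (False, False): 0.4}
h_vars, h = pointwise_product(["A", "B"], f, ["B", "C"], g, domains)
# e.g. h(A=True, B=True, C=False) = f(True, True) * g(True, False) = 0.3 * 0.8 = 0.24
```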
What is the law of total probability?
P(A) = \sum_n P(A \mid B_n)\, P(B_n).
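A quick numeric check of the formula, using made-up probabilities for a two-event partition:

```python
# Hypothetical partition {B1, B2}: P(B1) = 0.4, P(B2) = 0.6,
# with conditionals P(A | B1) = 0.9 and P(A | B2) = 0.2.
p_b = {"B1": 0.4, "B2": 0.6}
p_a_given_b = {"B1": 0.9, "B2": 0.2}

p_a = sum(p_a_given_b[b] * p_b[b] for b in p_b)  # 0.4*0.9 + 0.6*0.2
print(p_a)  # 0.48
```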
What is Bayes’ rule?
P(C \mid E) = \frac{P(E \mid C)\, P(C)}{P(E)}.
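Continuing the hypothetical numbers from the previous sketch, Bayes' rule inverts the conditional:

```python
# P(B1) = 0.4, P(A | B1) = 0.9, and P(A) = 0.48 from the law of total probability.
p_b1, p_a_given_b1, p_a = 0.4, 0.9, 0.48
p_b1_given_a = p_a_given_b1 * p_b1 / p_a
print(p_b1_given_a)  # 0.36 / 0.48 = 0.75
```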
What are the two key decisions that influence the complexity of the variable elimination algorithm?
Variable relevance: remove variables that are irrelevant to the query and the evidence. Variable ordering: the order in which variables are eliminated; finding an optimal ordering is itself an intractable problem.
How does the complexity of exact inference in Bayesian networks depend on the network topology?
Singly connected networks (polytrees): Linear time and space complexity. Multiply connected networks: Exponential complexity (NP-hard in general).
What is the purpose of approximate inference in Bayesian networks?
Approximate inference uses randomized sampling (e.g., Monte Carlo algorithms) to estimate probabilities when exact inference is computationally infeasible.
How does the direct sampling algorithm work?
1. Sample the variables in topological order, so each variable is sampled after its parents. 2. Draw each value from P(X_i \mid parents(X_i)), using the parent values already sampled. 3. Estimate probabilities by counting how often the generated samples match the event of interest.
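A minimal direct-sampling sketch for a hypothetical two-node network Rain → WetGrass; the CPT numbers are invented for illustration:

```python
import random

def sample_once():
    """Draw one joint sample in topological order: parents before children."""
    sample = {}
    sample["Rain"] = random.random() < 0.2                 # P(Rain) = 0.2
    p_wet = 0.9 if sample["Rain"] else 0.1                 # P(WetGrass | Rain)
    sample["WetGrass"] = random.random() < p_wet
    return sample

def estimate(query, n=100_000):
    """Approximate a joint probability by the fraction of samples matching `query`."""
    hits = sum(all(s[v] == val for v, val in query.items())
               for s in (sample_once() for _ in range(n)))
    return hits / n

print(estimate({"Rain": True, "WetGrass": True}))  # ≈ 0.2 * 0.9 = 0.18
```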
What distinguishes causal Bayesian networks from general Bayesian networks?
Causal Bayesian networks enforce causally compatible relationships, making them more intuitive, simpler to represent, and better for predicting interventions.
What is the do-operator in causal Bayesian networks?
The do-operator fixes a variable to a chosen value, removing its dependence on its parents and allowing the network to predict the effect of interventions (e.g., after intervening on S, P(C, R, s, J, T) = P(C)\, P(R \mid C)\, P(J \mid R)\, P(T \mid R, s)).
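A small sketch of the idea on a hypothetical chain C → S → J with made-up CPTs: intervening with do(S = s) deletes the factor P(S | C) and clamps S to s.

```python
p_c = {True: 0.3, False: 0.7}
p_s_given_c = {(True, True): 0.8, (True, False): 0.1,    # P(S=s | C=c), keyed by (s, c)
               (False, True): 0.2, (False, False): 0.9}
p_j_given_s = {(True, True): 0.7, (True, False): 0.05,   # P(J=j | S=s), keyed by (j, s)
               (False, True): 0.3, (False, False): 0.95}

def observational_joint(c, s, j):
    # Ordinary factorisation: P(C) P(S | C) P(J | S)
    return p_c[c] * p_s_given_c[(s, c)] * p_j_given_s[(j, s)]

def interventional_joint(c, j, s):
    # After do(S = s): the factor P(S | C) is removed and S is fixed to s,
    # leaving P(C) P(J | S = s).
    return p_c[c] * p_j_given_s[(j, s)]

print(observational_joint(True, True, True))     # 0.3 * 0.8 * 0.7 = 0.168
print(interventional_joint(True, True, s=True))  # 0.3 * 0.7 = 0.21
```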
Why are causal Bayesian networks important?
They allow prediction of how interventions will affect the model and are more intuitive for representing expert knowledge.
What is the main idea behind Monte Carlo algorithms in approximate inference?
They approximate the joint distribution by drawing multiple samples, with accuracy improving as the number of samples increases.
What is the Markov Condition in Bayesian Networks?
A node is conditionally independent of its non-descendants given its parents.
What is Conditional Probability in Bayesian Networks?
The probability of a variable given its parent nodes in the network.
How do we compute the joint probability distribution from a Bayesian Network?
By multiplying the conditional probabilities of each variable given its parents.
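A minimal sketch of this factorisation for a hypothetical chain Cloudy → Rain → WetGrass, with invented CPT values:

```python
p_cloudy = {True: 0.5, False: 0.5}
p_rain_given_cloudy = {(True, True): 0.8, (True, False): 0.2,
                       (False, True): 0.2, (False, False): 0.8}  # keyed by (rain, cloudy)
p_wet_given_rain = {(True, True): 0.9, (True, False): 0.1,
                    (False, True): 0.1, (False, False): 0.9}     # keyed by (wet, rain)

def joint(cloudy, rain, wet):
    """P(Cloudy, Rain, WetGrass) = P(Cloudy) P(Rain | Cloudy) P(WetGrass | Rain)."""
    return (p_cloudy[cloudy]
            * p_rain_given_cloudy[(rain, cloudy)]
            * p_wet_given_rain[(wet, rain)])

print(joint(True, True, True))  # 0.5 * 0.8 * 0.9 = 0.36
```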
What are the three types of connections in d-separation?
Chain, common cause (diverging), and common effect (collider).
What is a collider in Bayesian Networks?
A node where two edges converge, which blocks information flow unless it is observed or a descendant is observed.
What are two main types of inference in Bayesian Networks?
Exact inference and approximate inference.
What is the Variable Elimination algorithm used for?
Performing exact inference by summing out variables to compute marginal probabilities.
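A sketch of the "summing out" step, using the same hypothetical factor representation as the pointwise-product sketch above: eliminating a variable adds together the entries that differ only in that variable.

```python
def sum_out(var, factor_vars, factor_table):
    """Eliminate `var` from a factor by summing over all of its values."""
    kept = [v for v in factor_vars if v != var]
    new_table = {}
    for assignment, value in factor_table.items():
        row = dict(zip(factor_vars, assignment))
        key = tuple(row[v] for v in kept)
        new_table[key] = new_table.get(key, 0.0) + value
    return kept, new_table

# Using the pointwise-product output h(A, B, C) from the earlier sketch,
# summing out B would leave a factor over (A, C):
# vars_ac, table_ac = sum_out("B", h_vars, h)
```

In full variable elimination, the factors mentioning a variable are first combined with the pointwise product and the combined factor is then summed out, one variable at a time.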
What is the Junction Tree algorithm?
A method for exact inference that clusters nodes into tree structures to simplify computations.
What are common approximate inference methods?
Monte Carlo sampling methods such as direct sampling and Gibbs sampling.
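A minimal Gibbs-sampling sketch for a hypothetical chain A → B → C with C = True observed; all CPT numbers are made up. Each step resamples one non-evidence variable from its distribution given its Markov blanket.

```python
import random

p_a = {True: 0.3, False: 0.7}
p_b_given_a = {(True, True): 0.8, (True, False): 0.2,
               (False, True): 0.2, (False, False): 0.8}   # keyed by (b, a)
p_c_given_b = {(True, True): 0.9, (True, False): 0.1,
               (False, True): 0.1, (False, False): 0.9}   # keyed by (c, b)

def bernoulli(weight_true, weight_false):
    """Sample True with probability proportional to weight_true."""
    return random.random() < weight_true / (weight_true + weight_false)

def gibbs_estimate_a(n=50_000, c_obs=True):
    a, b = True, True                                      # arbitrary initial state
    count_a = 0
    for _ in range(n):
        # Resample A from its Markov blanket: P(a | b) ∝ P(a) P(b | a)
        a = bernoulli(p_a[True] * p_b_given_a[(b, True)],
                      p_a[False] * p_b_given_a[(b, False)])
        # Resample B from its Markov blanket: P(b | a, c) ∝ P(b | a) P(c | b)
        b = bernoulli(p_b_given_a[(True, a)] * p_c_given_b[(c_obs, True)],
                      p_b_given_a[(False, a)] * p_c_given_b[(c_obs, False)])
        count_a += a
    return count_a / n   # approximates P(A = True | C = True)

print(gibbs_estimate_a())
```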
What is the role of Conditional Independence in Bayesian Networks?
It allows simplification of probability calculations by reducing dependencies.