13. Probability Theory 1 Flashcards
What is uncertainty in AI?
The inability to predict an outcome due to incomplete, noisy, or complex data.
Why is reasoning under uncertainty important?
AI agents must make decisions with incomplete or uncertain knowledge, such as predicting traffic delays or medical diagnoses.
What is decision theory?
A framework that combines probability theory (beliefs) and utility theory (desires) to determine the best action.
What is Bayesian probability?
A framework that assigns probabilities to propositions based on an agent’s state of knowledge rather than absolute truth.
What are random variables?
Variables that represent uncertain aspects of the world and have a domain of possible values.
What are examples of random variables?
Boolean variables (e.g., “Is it raining?”), Categorical variables (e.g., “Temperature: Hot, Cold”), Continuous variables (e.g., “Wind speed in knots”).
What are the fundamental properties of probability?
(1) ∀x: 0 ≤ P(A = x) ≤ 1
(2) P(True) = 1, P(False) = 0
(3) ∑_x P(A = x) = 1
(4) P(¬a) = 1 − P(a)
(5) ∀ a, b ∈ D: P(a ∨ b) = P(a) + P(b) − P(a, b)
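These properties can be checked mechanically on a small discrete distribution. A minimal Python sketch, assuming an illustrative Weather variable that is not taken from the cards:

```python
# Checking the basic probability properties on a hand-made discrete
# distribution. The variable (Weather) and its values are illustrative
# assumptions, not from the flashcards.
P_weather = {"sunny": 0.6, "rain": 0.3, "snow": 0.1}

# (1) every probability lies in [0, 1]
assert all(0.0 <= p <= 1.0 for p in P_weather.values())

# (3) the probabilities of one variable's values sum to 1
assert abs(sum(P_weather.values()) - 1.0) < 1e-9

# (4) P(not a) = 1 - P(a)
p_not_rain = 1.0 - P_weather["rain"]  # ≈ 0.7
```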
What is a probability distribution?
A function describing the likelihood of all possible values a random variable can take.
What is an atomic event?
A complete specification of the state of all random variables in a given world.
What is a joint probability distribution?
A table listing the probability of every possible combination of random variable values.
What is marginal probability?
The probability of a single event occurring, computed by summing over relevant rows of a joint probability table.
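The last few cards fit together in one sketch: a joint distribution stored as a table of atomic events, with a marginal obtained by summing the relevant rows. The Cavity/Toothache variables and the numbers are illustrative assumptions:

```python
# A joint probability distribution stored as a table of atomic events,
# one entry per complete assignment to both Boolean variables.
joint = {
    ("cavity", "toothache"): 0.12,
    ("cavity", "no_toothache"): 0.08,
    ("no_cavity", "toothache"): 0.08,
    ("no_cavity", "no_toothache"): 0.72,
}

# Sanity check: the atomic events cover all possible worlds, so they sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Marginal P(Cavity = cavity): sum every row in which cavity holds.
p_cavity = sum(p for (c, _), p in joint.items() if c == "cavity")  # ≈ 0.20
```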
What is conditional probability?
The probability that one event occurs given that another has occurred, denoted as P(X = a | Y = b) or P(a | b)
What is the product rule of probability?
P(a, b) = P(a | b) P(b) = P(b | a) P(a)
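A short sketch of the product rule with illustrative numbers (assumptions, not card data): the conditional probability is the joint divided by the marginal, so multiplying back recovers the joint.

```python
# Product rule: P(a, b) = P(a | b) P(b).
# Illustrative numbers for P(cavity, toothache) and P(toothache).
p_ab = 0.12
p_b = 0.20
p_a_given_b = p_ab / p_b        # definition of conditional probability, ≈ 0.6
assert abs(p_a_given_b * p_b - p_ab) < 1e-9  # product rule recovers the joint
```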
What is the Law of Total Probability?
A rule that computes the probability of an event a by summing over a set of mutually exclusive, exhaustive events b_n: P(a) = \sum_{n} P(a | b_n) P(b_n)
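A minimal sketch of the law of total probability with two partitioning events; all numbers are illustrative assumptions:

```python
# Law of total probability with two mutually exclusive, exhaustive
# events b_1 and b_2. All numbers are illustrative assumptions.
p_b = [0.7, 0.3]                 # P(b_1), P(b_2); must sum to 1
p_a_given_b = [0.9, 0.2]         # P(a | b_1), P(a | b_2)

# P(a) = P(a | b_1) P(b_1) + P(a | b_2) P(b_2)
p_a = sum(pa * pb for pa, pb in zip(p_a_given_b, p_b))  # ≈ 0.69
```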
What is Bayes’ Rule?
A fundamental theorem stating: P(a | b) = \frac{P(b | a) P(a)}{P(b)}
Why is Bayes’ Rule useful?
It allows reasoning from effects to causes, such as diagnosing a disease from observed symptoms.
What is an example application of Bayes’ Rule?
Determining the probability of having meningitis given that a patient has a stiff neck.
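The meningitis example can be worked through numerically. The figures below are illustrative assumptions (a rare disease, a relatively common symptom), not data from the cards:

```python
# Bayes' rule on the stiff-neck example: P(m | s) = P(s | m) P(m) / P(s).
# The numbers are illustrative assumptions, not card data.
p_s_given_m = 0.7        # P(stiff neck | meningitis), causal knowledge
p_m = 1 / 50000          # P(meningitis), the prior
p_s = 0.01               # P(stiff neck)

p_m_given_s = p_s_given_m * p_m / p_s   # ≈ 0.0014
# Even given the symptom, meningitis remains very unlikely,
# because the prior P(meningitis) is tiny.
```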
Why are causal models better than diagnostic models?
Because diagnostic knowledge, e.g. P(cause | effect), is often more fragile than causal knowledge, P(effect | cause): the diagnostic direction shifts whenever the prior prevalence of the cause changes, while the causal relationship stays stable.
How does normalization help with probability calculations?
Instead of computing P(effect) directly, we compute P(effect | cause) P(cause) for each possible cause and divide each by their sum, which equals P(effect) by the law of total probability.
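This normalization step can be sketched directly. The cause names and numbers below are illustrative assumptions:

```python
# Normalization: compute the unnormalized score P(effect | cause) P(cause)
# for each cause, then divide by the sum of the scores, which plays the
# role of P(effect). Names and numbers are illustrative assumptions.
scores = {
    "cause_a": 0.9 * 0.1,   # P(effect | cause_a) * P(cause_a)
    "cause_b": 0.2 * 0.9,   # P(effect | cause_b) * P(cause_b)
}

z = sum(scores.values())                      # the normalizing constant
posterior = {c: s / z for c, s in scores.items()}
# The posterior now sums to 1 without ever computing P(effect) directly.
```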