UNCERTAINTY Flashcards
can be represented as a set of possible events and the likelihood, or probability, of each of them happening.
Uncertainty
Axioms in Probability
0 ≤ P(ω) ≤ 1: every value representing a probability must range between 0 and 1, inclusive.
∑ P(ω) = 1: the probabilities of every possible event, when summed together, are equal to 1.
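A minimal sketch checking both axioms numerically, assuming a fair six-sided die as the sample space:

```python
# Hypothetical example: a fair six-sided die as the sample space.
die = {outcome: 1 / 6 for outcome in range(1, 7)}

# Axiom 1: every probability lies between 0 and 1 (inclusive).
assert all(0 <= p <= 1 for p in die.values())

# Axiom 2: the probabilities of all possible outcomes sum to 1.
assert abs(sum(die.values()) - 1) < 1e-9
```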
the degree of belief in a proposition in the absence of any other evidence.
Unconditional probability
the degree of belief in a proposition given some evidence that has already been revealed
Conditional probability
a variable in probability theory with a domain of possible values that it can take on
Random variable
the knowledge that the occurrence of one event does not affect the probability of another event.
Independence
P(a ∧ b) = P(a)P(b)
Independence
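A quick numeric check of the independence formula, using two independent fair coin flips as an assumed toy example:

```python
# Toy example (assumed): two independent fair coin flips.
p_a = 0.5         # P(a): first flip is heads
p_b = 0.5         # P(b): second flip is heads
p_a_and_b = 0.25  # P(a ∧ b): both flips are heads

# Independence holds exactly when P(a ∧ b) = P(a)P(b).
print(p_a_and_b == p_a * p_b)  # True
```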
P(a | b)
Conditional probability
a rule commonly used in probability theory to compute conditional probabilities.
Bayes’ rule
P(b | a) = P(a | b)P(b) / P(a). In words, Bayes’ rule says that the probability of b given a is equal to the probability of a given b, times the probability of b, divided by the probability of a.
Bayes’ rule
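A small sketch applying Bayes’ rule; the event names (clouds, rain) and numbers are assumed purely for illustration:

```python
def bayes(p_a_given_b, p_b, p_a):
    """P(b | a) = P(a | b) * P(b) / P(a)."""
    return p_a_given_b * p_b / p_a

# Illustrative (assumed) numbers: a = clouds in the morning, b = rain in the afternoon.
p_clouds_given_rain = 0.8  # P(clouds | rain)
p_rain = 0.1               # P(rain)
p_clouds = 0.4             # P(clouds)

# P(rain | clouds) = P(clouds | rain) * P(rain) / P(clouds)
print(bayes(p_clouds_given_rain, p_rain, p_clouds))  # ≈ 0.2
```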
the likelihood of multiple events all occurring
Joint probability
used to deduce conditional probability: P(a | b) = P(a, b) / P(b).
Joint probability
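A minimal sketch of deducing a conditional probability from a joint distribution; the joint table values are assumed for illustration:

```python
# Assumed joint distribution over (clouds, rain); values are illustrative.
joint = {
    (True, True): 0.08,    # P(clouds, rain)
    (True, False): 0.32,   # P(clouds, ¬rain)
    (False, True): 0.02,   # P(¬clouds, rain)
    (False, False): 0.58,  # P(¬clouds, ¬rain)
}

# P(rain | clouds) = P(clouds, rain) / P(clouds)
p_clouds = joint[(True, True)] + joint[(True, False)]
print(joint[(True, True)] / p_clouds)  # ≈ 0.2
```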
P(¬a) = 1 - P(a)
Negation
P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
Inclusion-Exclusion
P(a) = P(a, b) + P(a, ¬b)
Marginalization
P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b)
Conditioning
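A quick numeric check of the four rules above (negation, inclusion-exclusion, marginalization, conditioning); the values for events a (clouds) and b (rain) are assumed for illustration:

```python
# Assumed values for two events: a = clouds, b = rain.
p_a, p_b = 0.4, 0.1
p_a_and_b = 0.08      # P(a ∧ b)
p_a_and_not_b = 0.32  # P(a ∧ ¬b)

# Negation: P(¬a) = 1 - P(a)
print(1 - p_a)                                           # 0.6

# Inclusion-Exclusion: P(a ∨ b) = P(a) + P(b) - P(a ∧ b)
print(p_a + p_b - p_a_and_b)                             # ≈ 0.42

# Marginalization: P(a) = P(a, b) + P(a, ¬b)
print(p_a_and_b + p_a_and_not_b)                         # ≈ 0.4

# Conditioning: P(a) = P(a | b)P(b) + P(a | ¬b)P(¬b)
p_a_given_b = p_a_and_b / p_b
p_a_given_not_b = p_a_and_not_b / (1 - p_b)
print(p_a_given_b * p_b + p_a_given_not_b * (1 - p_b))   # ≈ 0.4
```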
a data structure that represents the dependencies among random variables.
Bayesian network
Bayesian networks have the following properties:
- They are directed graphs.
- Each node on the graph represents a random variable.
- An arrow from X to Y represents that X is a parent of Y. That is, the probability distribution of Y depends on the value of X.
- Each node X has probability distribution P(X | Parents(X)).
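A minimal sketch of a Bayesian network represented as a plain Python data structure; the node names (Rain, Traffic), values, and probabilities are assumed for illustration:

```python
# Each node stores its parents and a conditional probability table (CPT)
# keyed by the values of its parents: P(X | Parents(X)).
network = {
    "Rain": {
        "parents": [],
        "cpt": {(): {"yes": 0.2, "no": 0.8}},        # P(Rain)
    },
    "Traffic": {
        "parents": ["Rain"],
        "cpt": {
            ("yes",): {"heavy": 0.8, "light": 0.2},  # P(Traffic | Rain = yes)
            ("no",): {"heavy": 0.1, "light": 0.9},   # P(Traffic | Rain = no)
        },
    },
}

def probability(node, value, parent_values=()):
    """Look up P(node = value | Parents(node) = parent_values)."""
    return network[node]["cpt"][tuple(parent_values)][value]

print(probability("Rain", "yes"))                 # 0.2
print(probability("Traffic", "heavy", ("yes",)))  # 0.8
```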
Inference has multiple properties:
- Query X: the variable for which we want to compute the probability distribution.
- Evidence variables E: one or more variables that have been observed (the event e).
- Hidden variables Y: variables that are neither the query nor observed evidence.
- The goal: calculate P(X | e)
a process of finding the probability distribution of variable X given observed evidence e, by summing over all possible values of the hidden variables Y.
Inference by enumeration
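A minimal sketch of inference by enumeration over the same assumed two-variable network (Rain → Traffic); all numbers are illustrative:

```python
from itertools import product

# Domains for each variable in the assumed network.
values = {"Rain": ["yes", "no"], "Traffic": ["heavy", "light"]}

def joint(assignment):
    """P(Rain, Traffic) via the chain rule: P(Rain) * P(Traffic | Rain)."""
    p_rain = {"yes": 0.2, "no": 0.8}[assignment["Rain"]]
    p_traffic = {
        ("yes", "heavy"): 0.8, ("yes", "light"): 0.2,
        ("no", "heavy"): 0.1, ("no", "light"): 0.9,
    }[(assignment["Rain"], assignment["Traffic"])]
    return p_rain * p_traffic

def enumerate_ask(query, evidence):
    """Return P(query | evidence) by summing the joint over the hidden variables."""
    dist = {}
    hidden = [v for v in values if v != query and v not in evidence]
    for value in values[query]:
        total = 0.0
        for combo in product(*(values[h] for h in hidden)):
            assignment = {query: value, **evidence, **dict(zip(hidden, combo))}
            total += joint(assignment)
        dist[value] = total
    normalizer = sum(dist.values())
    return {v: p / normalizer for v, p in dist.items()}

# P(Rain | Traffic = heavy)
print(enumerate_ask("Rain", {"Traffic": "heavy"}))  # ≈ {'yes': 0.67, 'no': 0.33}
```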
one technique for approximate inference.
Sampling
each variable is sampled for a value according to its probability distribution, conditioned on the values already sampled for its parents.
Sampling
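A minimal sketch of sampling on the same assumed Rain → Traffic network, here discarding samples that contradict the evidence (rejection sampling):

```python
import random

def sample_from(distribution):
    """Draw one value from a {value: probability} distribution."""
    r = random.random()
    cumulative = 0.0
    for value, p in distribution.items():
        cumulative += p
        if r <= cumulative:
            return value
    return value  # guard against floating-point rounding

def sample():
    """Sample each variable in topological order, conditioned on its parents."""
    rain = sample_from({"yes": 0.2, "no": 0.8})
    traffic = sample_from(
        {"heavy": 0.8, "light": 0.2} if rain == "yes"
        else {"heavy": 0.1, "light": 0.9}
    )
    return {"Rain": rain, "Traffic": traffic}

# Approximate P(Rain = yes | Traffic = heavy) by keeping only the samples
# that agree with the evidence.
samples = [sample() for _ in range(10_000)]
accepted = [s for s in samples if s["Traffic"] == "heavy"]
print(sum(s["Rain"] == "yes" for s in accepted) / len(accepted))  # ≈ 0.67
```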
likelihood weighting steps:
- Start by fixing the values for evidence variables.
- Sample the non-evidence variables using conditional probabilities in the Bayesian network.
- Weight each sample by its likelihood: the probability of all the evidence occurring.
the probability of all the evidence occurring.
Likelihood
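A minimal sketch of likelihood weighting on the same assumed Rain → Traffic network, fixing Traffic = heavy as the evidence:

```python
import random

def sample_from(distribution):
    """Draw one value from a {value: probability} distribution."""
    r = random.random()
    cumulative = 0.0
    for value, p in distribution.items():
        cumulative += p
        if r <= cumulative:
            return value
    return value  # guard against floating-point rounding

def weighted_sample(evidence):
    """Sample non-evidence variables; weight the sample by the likelihood of the evidence."""
    weight = 1.0
    # Rain is not an evidence variable, so sample it normally.
    rain = sample_from({"yes": 0.2, "no": 0.8})
    # Traffic is an evidence variable: fix its value and multiply the weight
    # by P(Traffic = evidence value | Rain = sampled value).
    traffic_dist = {"heavy": 0.8, "light": 0.2} if rain == "yes" else {"heavy": 0.1, "light": 0.9}
    traffic = evidence["Traffic"]
    weight *= traffic_dist[traffic]
    return {"Rain": rain, "Traffic": traffic}, weight

# Estimate P(Rain = yes | Traffic = heavy) from 10,000 weighted samples.
totals = {"yes": 0.0, "no": 0.0}
for _ in range(10_000):
    s, w = weighted_sample({"Traffic": "heavy"})
    totals[s["Rain"]] += w
print(totals["yes"] / sum(totals.values()))  # ≈ 0.67
```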