Uncertainty Flashcards
Uncertainty can be represented as a set of possible events and the likelihood, or probability, of each of them happening.
Probability
Axioms in Probability
0 ≤ P(ω) ≤ 1 for every possible world ω, and the probabilities of all possible worlds sum to 1: Σ P(ω) = 1.
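The axioms can be checked directly on a small distribution. A minimal sketch, assuming a fair six-sided die (the die and its probabilities are illustrative, not from the cards):

```python
# Distribution over the six possible worlds of a fair die roll.
dist = {face: 1 / 6 for face in range(1, 7)}

# Axiom 1: every probability lies between 0 and 1.
assert all(0 <= p <= 1 for p in dist.values())

# Axiom 2: the probabilities of all possible worlds sum to 1.
assert abs(sum(dist.values()) - 1) < 1e-9
```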
the degree of belief in a proposition in the absence of any other evidence.
Unconditional Probability
the degree of belief in a proposition given some evidence that has already been revealed.
Conditional Probability
a variable in probability theory with a domain of possible values that it can take on
Random Variable
the knowledge that the occurrence of one event does not affect the probability of the other event; formally, P(A ∩ B) = P(A) P(B)
Independence
a rule commonly used in probability theory to compute conditional probability: P(A | B) = P(B | A) P(A) / P(B).
Bayes’ Rule
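Bayes' rule is a one-line computation. A minimal sketch, with hypothetical numbers for a rain/clouds example (the probabilities below are made up for illustration):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical: P(clouds | rain) = 0.8, P(rain) = 0.1, P(clouds) = 0.4
p_rain_given_clouds = bayes(0.8, 0.1, 0.4)  # → 0.2
```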
the likelihood of multiple events all occurring.
Joint Probability
Probability Rules
- Negation
- Inclusion-Exclusion
- Marginalization
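The three rules above can be demonstrated on one small joint distribution. A sketch over two binary variables with hypothetical probabilities:

```python
# Joint distribution over two binary variables A and B (hypothetical numbers).
joint = {(True, True): 0.3, (True, False): 0.2,
         (False, True): 0.1, (False, False): 0.4}

# Marginalization: P(A) = Σ_b P(A, b)
p_a = sum(p for (a, _), p in joint.items() if a)
p_b = sum(p for (_, b), p in joint.items() if b)

# Negation: P(¬A) = 1 − P(A)
p_not_a = 1 - p_a

# Inclusion-exclusion: P(A ∨ B) = P(A) + P(B) − P(A ∧ B)
p_a_or_b = p_a + p_b - joint[(True, True)]
```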
data structure that represents the dependencies among random variables.
Bayesian Networks
a process of finding the probability distribution of variable X given observed evidence e by summing the joint probability over all values of the hidden variables Y.
Inference by Enumeration
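A sketch of enumeration on a deliberately tiny network. The two-node structure (Rain → Umbrella) and all probabilities are hypothetical; the point is summing the joint probability and normalizing:

```python
# Prior: P(Rain)
p_rain = {True: 0.2, False: 0.8}
# Conditional: P(Umbrella | Rain)
p_umbrella_given_rain = {True: {True: 0.9, False: 0.1},
                         False: {True: 0.2, False: 0.8}}

def enumerate_query(umbrella):
    """P(Rain | Umbrella = umbrella) by enumerating the joint distribution."""
    # Unnormalized score for each value of the query variable.
    scores = {r: p_rain[r] * p_umbrella_given_rain[r][umbrella]
              for r in (True, False)}
    # Normalize so the distribution sums to 1.
    total = sum(scores.values())
    return {r: s / total for r, s in scores.items()}

posterior = enumerate_query(True)
```

With these numbers, observing an umbrella raises the probability of rain well above its 0.2 prior.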
Properties of Inference
Query X: the variable for which we want to compute the probability distribution.
Evidence variables E: one or more variables that have been observed for event e.
Hidden variables Y: variables that aren’t the query and also haven’t been observed.
The goal: calculate P(X | e).
a scalable method of calculating probabilities, but with a loss in precision.
approximate inference
a technique of approximate inference where each variable is sampled for a value according to its probability distribution
Sampling
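Sampling can be sketched on the same kind of tiny hypothetical network (Rain → Umbrella, made-up probabilities): draw each variable from its distribution, then keep only the samples consistent with the evidence (rejection sampling):

```python
import random

def sample_once():
    """Sample each variable according to its (conditional) distribution."""
    rain = random.random() < 0.2
    umbrella = random.random() < (0.9 if rain else 0.2)
    return rain, umbrella

def estimate_p_rain_given_umbrella(n=100_000):
    """Estimate P(Rain | Umbrella=True), discarding inconsistent samples."""
    kept = [rain for rain, umbrella in (sample_once() for _ in range(n))
            if umbrella]
    return sum(kept) / len(kept)
```

Note how every sample where the umbrella was not observed is thrown away; this waste is what likelihood weighting avoids.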
Likelihood Weighting vs Sampling
Sampling is inefficient because it discards samples. Likelihood weighting addresses this by incorporating the evidence into the sampling process.
Likelihood Weighting Steps
Start by fixing the values for evidence variables.
Sample the non-evidence variables using conditional probabilities in the Bayesian network.
Weight each sample by its likelihood: the probability of all the evidence occurring.
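The steps above can be sketched as follows, reusing the hypothetical Rain → Umbrella network with evidence Umbrella = True fixed rather than sampled:

```python
import random

def weighted_sample():
    """One likelihood-weighted sample with evidence Umbrella = True fixed."""
    # Step 2: sample the non-evidence variable from its distribution.
    rain = random.random() < 0.2
    # Step 3: weight by the likelihood of the evidence, P(umbrella=True | rain).
    weight = 0.9 if rain else 0.2
    return rain, weight

def estimate_p_rain(n=100_000):
    """Estimate P(Rain | Umbrella=True) as a weight-normalized average."""
    samples = [weighted_sample() for _ in range(n)]
    total = sum(w for _, w in samples)
    return sum(w for rain, w in samples if rain) / total
```

No sample is discarded; each simply counts in proportion to how likely it makes the evidence.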
an assumption that the current state depends on only a finite fixed number of previous states.
Markov Assumption
a sequence of random variables where the distribution of each variable follows the Markov assumption.
Markov Chain
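A Markov chain can be simulated with nothing but a transition table. A sketch with a hypothetical two-state weather model (transition probabilities are made up):

```python
import random

# P(tomorrow | today): the next state depends only on the current one.
transitions = {"sun":  {"sun": 0.8, "rain": 0.2},
               "rain": {"sun": 0.3, "rain": 0.7}}

def simulate(start, steps):
    """Walk the chain for `steps` transitions, returning the visited states."""
    state, chain = start, [start]
    for _ in range(steps):
        state = random.choices(list(transitions[state]),
                               weights=list(transitions[state].values()))[0]
        chain.append(state)
    return chain
```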
a type of Markov model for a system with hidden states that generate some observed event
Hidden Markov Model
tasks that can be achieved using hidden Markov models
- Filtering
- Prediction
- Smoothing
- Most likely explanation
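The first of these tasks, filtering, can be sketched on a hypothetical hidden Markov model where the hidden state is the weather and the observation is whether someone carries an umbrella (all numbers are made up):

```python
# P(next state | current state) and P(observation | state).
transition = {"sun":  {"sun": 0.8, "rain": 0.2},
              "rain": {"sun": 0.3, "rain": 0.7}}
emission = {"sun":  {"umbrella": 0.1, "no umbrella": 0.9},
            "rain": {"umbrella": 0.8, "no umbrella": 0.2}}

def filter_step(belief, observation):
    """One filtering update: predict with the transition model, then
    weight by the observation likelihood and normalize."""
    predicted = {s: sum(belief[p] * transition[p][s] for p in belief)
                 for s in transition}
    updated = {s: predicted[s] * emission[s][observation] for s in predicted}
    total = sum(updated.values())
    return {s: v / total for s, v in updated.items()}

# Start from a uniform belief and fold in a sequence of observations.
belief = {"sun": 0.5, "rain": 0.5}
for obs in ["umbrella", "umbrella", "no umbrella"]:
    belief = filter_step(belief, obs)
```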