Week 8 (Chapter 13) - Uncertainty Flashcards
What are some methods for handling uncertainty?
- Nonmonotonic logic
- Rules with confidence factors
- Probability
Ultimately, just probability because it is able to summarize the effects of laziness and ignorance.
- laziness: failure to enumerate exceptions, qualifications, etc.
- ignorance: lack of relevant facts, initial conditions, etc.
What does Fuzzy logic handle?
Degree of truth
e.g. WetGrass is true to degree 0.2
What are the issues with Rules with confidence factors?
Problems with combining rules
e.g. given
Sprinkler -> WetGrass (confidence 0.99)
WetGrass -> Rain (confidence 0.7)
chaining the rules would conclude Sprinkler -> Rain with confidence 0.693 -- but a sprinkler does not cause rain.
What are the issues with Nonmonotonic logic?
Hard to determine which assumptions are reasonable
What is “Utility Theory”? [Follow-up]
Used to represent and infer preferences
What is “Decision Theory”? [Follow-up]
Decision theory = Utility theory + Probability theory
WTF is a “random variable”?
A “random variable” is a function from sample points to some range
e.g. for a fair 6-sided die, let Odd(w) = true iff roll w is odd; then
P(Odd = true) = P(1) + P(3) + P(5) = 1/6 + 1/6 + 1/6 = 1/2
What is an “event” in probability?
A set of sample points
e.g. P(A) = sum_{w in A} P(w)
e.g. P(die roll < 4) = 1/6 + 1/6 + 1/6 = 1/2
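Both die examples above can be checked by enumerating sample points; a minimal Python sketch (the names are illustrative):

```python
from fractions import Fraction

# Sample space of a fair six-sided die: each sample point has probability 1/6.
P = {w: Fraction(1, 6) for w in range(1, 7)}

# A random variable is a function from sample points to some range.
def odd(w):
    return w % 2 == 1

# P(Odd = true): sum P(w) over the sample points where Odd(w) holds.
p_odd = sum(p for w, p in P.items() if odd(w))

# An event is a set of sample points, e.g. "die roll < 4" = {1, 2, 3}.
p_lt4 = sum(p for w, p in P.items() if w < 4)

print(p_odd, p_lt4)  # 1/2 1/2
```

Using Fraction instead of floats keeps the probabilities exact.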
WTF is a “proposition”? [Follow-up]
Page 9 Lecture 8
Think of a proposition as the event where the proposition is true
P(A U B) = ?
P(A) + P(B) - P(A n B)
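Inclusion-exclusion can be sanity-checked the same way; the events A and B below are chosen only for illustration:

```python
from fractions import Fraction

# Sample points of a fair die, each with probability 1/6.
P = {w: Fraction(1, 6) for w in range(1, 7)}

A = {1, 3, 5}  # die roll is odd
B = {1, 2, 3}  # die roll < 4

def prob(event):
    # P(event) = sum of P(w) over the sample points w in the event.
    return sum(P[w] for w in event)

# P(A U B) = P(A) + P(B) - P(A n B)
lhs = prob(A | B)
rhs = prob(A) + prob(B) - prob(A & B)
print(lhs, rhs)  # 2/3 2/3
```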
Syntax for Propositions?
- Propositional or Boolean random variables
  e.g. Cavity (do I have a cavity?)
- Discrete random variables
  e.g. Weather is one of <sunny, rain, cloudy, snow>
  Weather = rain is a proposition
- Continuous random variables
  e.g. Temp = 21.6 or Temp < 22.0
How to find the probability for continuous variables?
Gaussian probability density function
P(x) = (1 / (std dev * sqrt(2*pi))) * e^( -(x - mean)^2 / (2 * std dev^2) )
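A direct transcription of the density, where mu and sigma stand for the mean and standard deviation:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian density: (1 / (sigma*sqrt(2*pi))) * exp(-(x - mu)^2 / (2*sigma^2))."""
    coeff = 1.0 / (sigma * math.sqrt(2 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

print(gaussian_pdf(0.0))  # ~0.3989, the peak of the standard normal
```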
Is new evidence always useful for conditional probability?
Not necessarily -- it may be irrelevant
e.g. P(cavity | toothache, carltonWins) = P(cavity | toothache) = 0.8
The new evidence is valid, but irrelevant to cavity, so conditioning on it leaves the probability unchanged.
P(A | B) = ?
P(A n B) / P(B)
Note: P(A n B) = P(A | B)P(B) = P(B | A)P(A)
Proof: P(A | B)P(B) = P(A n B) = P(B | A)P(A); divide both sides by P(B)
therefore:
Bayes Rule: P(A | B) = ( P(B | A)P(A) ) / P(B)
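A quick numeric check of Bayes' rule; all three input probabilities below are made up for illustration:

```python
# Hypothetical numbers, for illustration only.
p_b_given_a = 0.9  # P(B | A)
p_a = 0.01         # P(A)
p_b = 0.1          # P(B)

# Bayes' rule: P(A | B) = P(B | A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # ~0.09
```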
Inference by enumeration?
Answer queries by summing the entries of the full joint distribution that are consistent with the query.
All entries of the joint distribution should add up to 1.
Page 17 - 20 Lecture 8
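A sketch of inference by enumeration over a full joint distribution; the numbers follow the chapter's dentist (Toothache/Cavity/Catch) example and should be treated as illustrative:

```python
# Full joint distribution over (Toothache, Cavity, Catch).
joint = {
    # (toothache, cavity, catch): probability
    (True,  True,  True):  0.108,
    (True,  True,  False): 0.012,
    (True,  False, True):  0.016,
    (True,  False, False): 0.064,
    (False, True,  True):  0.072,
    (False, True,  False): 0.008,
    (False, False, True):  0.144,
    (False, False, False): 0.576,
}

# All entries of the joint must sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Inference by enumeration: sum the entries consistent with the query.
p_toothache = sum(p for (t, c, k), p in joint.items() if t)
p_cavity_and_toothache = sum(p for (t, c, k), p in joint.items() if t and c)

# Conditional probability: P(cavity | toothache) = P(cavity n toothache) / P(toothache)
p_cavity_given_toothache = p_cavity_and_toothache / p_toothache
print(p_cavity_given_toothache)  # ~0.6
```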