Week 8 (Chapter 13) - Uncertainty Flashcards
What are some methods for handling uncertainty?
- Nonmonotonic logic
- Rules with confidence factors
- Probability
Ultimately, just probability because it is able to summarize the effects of laziness and ignorance.
- laziness: failure to enumerate exceptions, qualifications, etc.
- ignorance: lack of relevant facts, initial conditions, etc.
What does Fuzzy logic handle?
Degree of truth
e.g. WetGrass is true to degree 0.2
What are the issues with Rules with confidence factors?
Problems with combining/chaining rules
e.g.
Sprinkler -> 0.99 WetGrass
WetGrass -> 0.7 Rain
Chaining these rules would suggest Sprinkler -> Rain, i.e. that turning on the sprinkler makes rain more likely. Does Sprinkler cause Rain? The confidence factors can't capture how the evidence actually combines.
What are the issues with Nonmonotonic logic?
Hard to determine which assumptions are reasonable
What is “Utility Theory”? [Follow-up]
Used to represent and infer preferences
What is “Decision Theory”? [Follow-up]
Decision theory = Utility theory + Probability theory
What is a “random variable”?
A “random variable” is a function from sample points to some range
Consider a six-sided die:
e.g. P(Odd = true) = 1/6 + 1/6 + 1/6 = 1/2 (see the code sketch after the next card)
What is an “event” in probability?
A set of sample points
e.g. P(A) = Σ_{ω ∈ A} P(ω)
e.g. P(die roll < 4) = 1/6 + 1/6 + 1/6 = 1/2
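A minimal Python sketch of these two cards, treating the die rolls as the sample space; the code itself is not from the lecture, just an illustration:

```python
# Sketch of "random variable = function from sample points" and
# "event = set of sample points", using one fair six-sided die.
outcomes = range(1, 7)        # the sample points
p = 1 / 6                     # P(w) for each sample point w

def odd(w):
    # the random variable Odd: maps a sample point to true/false
    return w % 2 == 1

# P(Odd = true) = sum of P(w) over the sample points where Odd(w) is true
p_odd = sum(p for w in outcomes if odd(w))        # 1/6 + 1/6 + 1/6 = 1/2

# P(die roll < 4): the event is the set of sample points {1, 2, 3}
p_lt_4 = sum(p for w in outcomes if w < 4)        # 1/2

print(p_odd, p_lt_4)
```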
What is a “proposition”? [Follow-up]
Page 9 Lecture 8
Think of a proposition as the event where the proposition is true
P(A U B) = ?
P(A) + P(B) - P(A n B)
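A quick numeric check of this identity on an assumed example (one fair die roll, with A = "roll is odd" and B = "roll < 4"):

```python
# Check P(A ∪ B) = P(A) + P(B) - P(A ∩ B) for one fair die roll,
# with A = "roll is odd" and B = "roll < 4" (example assumed for illustration).
outcomes = range(1, 7)
p = 1 / 6

p_a       = sum(p for w in outcomes if w % 2 == 1)              # 1/2
p_b       = sum(p for w in outcomes if w < 4)                   # 1/2
p_a_and_b = sum(p for w in outcomes if w % 2 == 1 and w < 4)    # {1, 3} -> 1/3
p_a_or_b  = sum(p for w in outcomes if w % 2 == 1 or w < 4)     # {1, 2, 3, 5} -> 2/3

print(abs(p_a_or_b - (p_a + p_b - p_a_and_b)) < 1e-12)          # True
```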
Syntax for Propositions?
- Propositional or Boolean random variables
e.g. Cavity (do I have a cavity?)
- Discrete random variables
e.g. Weather is one of ⟨sunny, rain, cloudy, snow⟩
Weather = rain is a proposition
- Continuous random variables
e.g. Temp = 21.6 or Temp < 22.0
How to find the probability for continuous variables?
Gaussian probability density function
P(x) = (1 / (std dev * sqrt(2*pi))) * e^( -(x - mean)^2 / (2 * std dev^2) )
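A small Python version of this density as a sanity check; the function name and the example numbers are just illustrative:

```python
import math

def gaussian_pdf(x, mean, std_dev):
    # P(x) = (1 / (std_dev * sqrt(2*pi))) * e^(-(x - mean)^2 / (2 * std_dev^2))
    coeff = 1 / (std_dev * math.sqrt(2 * math.pi))
    exponent = -((x - mean) ** 2) / (2 * std_dev ** 2)
    return coeff * math.exp(exponent)

# density at Temp = 21.6 under an assumed mean of 22.0 and std dev of 1.0
print(gaussian_pdf(21.6, mean=22.0, std_dev=1.0))
```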
Does new evidence always change a conditional probability?
No, the new evidence may be irrelevant
e.g. P(cavity | toothache, carltonWins) = P(cavity | toothache) = 0.8
The extra evidence is valid but not useful here: conditioning on it doesn't change the answer, so it can simply be ignored.
P(A | B) = ?
P(A n B) / P(B)
Note (product rule): P(A n B) = P(A | B)P(B) = P(B | A)P(A)
Derivation: P(A | B) = P(A n B) / P(B) = P(B | A)P(A) / P(B)
therefore:
Bayes Rule: P(A | B) = P(B | A)P(A) / P(B)
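A tiny numeric check of the product rule and Bayes rule, using an assumed joint distribution over two propositions A and B (the numbers are illustrative only):

```python
# Joint distribution P(A, B) -- illustrative numbers, not from the lecture.
p_joint = {(True, True): 0.12, (True, False): 0.08,
           (False, True): 0.28, (False, False): 0.52}

p_a = sum(p for (a, b), p in p_joint.items() if a)     # P(A)     = 0.20
p_b = sum(p for (a, b), p in p_joint.items() if b)     # P(B)     = 0.40
p_a_and_b = p_joint[(True, True)]                      # P(A n B) = 0.12

p_a_given_b = p_a_and_b / p_b                          # P(A|B)   = 0.30
p_b_given_a = p_a_and_b / p_a                          # P(B|A)   = 0.60

# Bayes rule recovers P(A|B) from P(B|A), P(A) and P(B)
print(abs(p_a_given_b - p_b_given_a * p_a / p_b) < 1e-12)   # True
```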
Inference by enumeration?
Write out the full joint distribution and sum the entries that match the query; all entries of the joint add up to 1.
Page 17 - 20 Lecture 8
Follow-up on Page 21 - 22 Lecture 8: what is the normalization constant?
P(Cavity | toothache) = α * P(Cavity, toothache)
α = 1/P(toothache). It simply rescales the summed joint entries so that the resulting distribution over Cavity sums to 1, so P(toothache) never has to be computed separately.
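A minimal sketch of inference by enumeration with normalization, using an assumed joint distribution over (Toothache, Catch, Cavity); the numbers are illustrative, not necessarily the lecture's table:

```python
joint = {
    # (toothache, catch, cavity): probability -- illustrative values that sum to 1
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def p_cavity_given_toothache(joint):
    # Sum the joint entries consistent with the evidence (toothache = true),
    # split by the value of the query variable Cavity.
    unnormalized = {
        cavity: sum(p for (t, c, cav), p in joint.items() if t and cav == cavity)
        for cavity in (True, False)
    }
    alpha = 1 / sum(unnormalized.values())   # normalization constant = 1 / P(toothache)
    return {cavity: alpha * p for cavity, p in unnormalized.items()}

print(p_cavity_given_toothache(joint))       # roughly {True: 0.6, False: 0.4}
```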
Conditions for A and B to be independent?
A and B are independent iff
P(A|B) = P(A) or
P(B|A) = P(B) or
P(A,B) = P(A)P(B)
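A quick check of the third condition on an assumed example: for one fair die roll, A = "roll is even" and B = "roll <= 4" turn out to be independent:

```python
# Independence check: is P(A, B) = P(A) * P(B)?
outcomes = range(1, 7)
p = 1 / 6

p_a  = sum(p for w in outcomes if w % 2 == 0)              # 1/2
p_b  = sum(p for w in outcomes if w <= 4)                  # 2/3
p_ab = sum(p for w in outcomes if w % 2 == 0 and w <= 4)   # {2, 4} -> 1/3

print(abs(p_ab - p_a * p_b) < 1e-12)   # True, so A and B are independent
```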
What is conditional independence useful for?
Elaborate on conditional independence using the toothache/cavity example.
Conditional independence is useful for simplifying probability calculations.
If I have a cavity, the probability that the probe catches in it doesn’t depend on whether I have a toothache:
(1) P(catch|toothache,cavity) = P(catch|cavity)
The same independence holds if I haven’t got a cavity:
(2) P(catch|toothache,¬cavity) = P(catch|¬cavity)
Catch is conditionally independent of Toothache given Cavity:
P(Catch|Toothache,Cavity) = P(Catch|Cavity)
How do you write out the joint distribution P(Toothache, Catch, Cavity) using the chain rule? [Follow-up]
Page 25 Lecture 8
P(Toothache,Catch,Cavity)
= P(Toothache|Catch,Cavity)P(Catch,Cavity)
= P(Toothache|Catch,Cavity)P(Catch|Cavity)P(Cavity)
= P(Toothache|Cavity)P(Catch|Cavity)P(Cavity) (the last step uses conditional independence of Toothache and Catch given Cavity)
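A small check that this factorization reproduces every entry of the joint, using an assumed joint table in which the conditional independence actually holds (the numbers are illustrative, not necessarily the lecture's):

```python
joint = {
    # (toothache, catch, cavity): probability -- illustrative values
    (True,  True,  True):  0.108, (True,  False, True):  0.012,
    (False, True,  True):  0.072, (False, False, True):  0.008,
    (True,  True,  False): 0.016, (True,  False, False): 0.064,
    (False, True,  False): 0.144, (False, False, False): 0.576,
}

def prob(pred):
    # probability of the event defined by pred(toothache, catch, cavity)
    return sum(p for outcome, p in joint.items() if pred(*outcome))

for (t, c, cav), p in joint.items():
    p_cav = prob(lambda t_, c_, cav_: cav_ == cav)                              # P(Cavity)
    p_t_given_cav = prob(lambda t_, c_, cav_: t_ == t and cav_ == cav) / p_cav  # P(Toothache|Cavity)
    p_c_given_cav = prob(lambda t_, c_, cav_: c_ == c and cav_ == cav) / p_cav  # P(Catch|Cavity)
    # P(Toothache|Cavity) * P(Catch|Cavity) * P(Cavity) should equal the joint entry
    assert abs(p - p_t_given_cav * p_c_given_cav * p_cav) < 1e-9

print("factorization matches every joint entry")
```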
Bayes Rule?
P(A|B) = P(B|A)P(A) / P(B)
or in distribution form
P(Y|X) = α * P(X|Y)P(Y)
α is the normalization constant, equal to 1/P(X); it rescales P(X|Y)P(Y) so the entries of P(Y|X) sum to 1. [Follow-up]
Can Bayes rule be combined with conditional independence?
Yes, this is basically the Naïve Bayes model:
P(Cause, Effect_1, ..., Effect_n) = P(Cause) * Π_i P(Effect_i | Cause)
Remember, Naïve Bayes assumes every effect is conditionally independent of the others given the cause.
If that assumption is badly violated (some effects are strongly correlated with each other), the estimates degrade.
P(Cavity|toothache∧catch)
= αP(toothache∧catch|Cavity)P(Cavity)
= αP(toothache|Cavity)P(catch|Cavity)P(Cavity)
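A minimal naive Bayes sketch of this last calculation; the conditional probability values below are assumed for illustration, not taken from the slides:

```python
# Compute P(Cavity | toothache ∧ catch) with naive Bayes:
# posterior ∝ P(Cavity) * P(toothache | Cavity) * P(catch | Cavity)
p_cavity             = {True: 0.2, False: 0.8}   # illustrative prior
p_toothache_g_cavity = {True: 0.6, False: 0.1}   # illustrative P(toothache | Cavity)
p_catch_g_cavity     = {True: 0.9, False: 0.2}   # illustrative P(catch | Cavity)

unnormalized = {
    cav: p_cavity[cav] * p_toothache_g_cavity[cav] * p_catch_g_cavity[cav]
    for cav in (True, False)
}
alpha = 1 / sum(unnormalized.values())           # normalization constant
posterior = {cav: alpha * p for cav, p in unnormalized.items()}

print(posterior)   # roughly {True: 0.87, False: 0.13}
```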