Probability Theory Flashcards
To learn probability theory
What is the first axiom of probability measures?
P(W) = 1 and P(Ø) = 0
What is the second axiom of probability measures?
If A ∩ B = Ø then P(A ∪ B) = P(A) + P(B)
Define the property of General Additivity
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
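General additivity can be checked on a small finite example; the die-roll worlds and uniform measure below are illustrative assumptions, not part of the deck.

```python
from fractions import Fraction

# Possible worlds: a fair die roll (illustrative assumption).
W = {1, 2, 3, 4, 5, 6}

def P(E):
    """Uniform probability measure on subsets of W."""
    return Fraction(len(E), len(W))

A = {1, 2, 3}
B = {3, 4}

# General additivity: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)

# Special case of the second axiom: A ∩ C = Ø gives plain additivity.
C = {5, 6}
assert P(A | C) == P(A) + P(C)
```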
Denote the uncertainty measure
µ = P
Define the complement
P(A^∁) = 1 - P(A)
How many values does an agent need to specify to define an uncertainty measure?
2^n - 2
How many values does an agent need to specify to define a probability distribution?
n - 1
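The two counts can be sanity-checked directly for a small number of worlds (n = 3 is an arbitrary choice):

```python
n = 3                        # number of possible worlds (arbitrary)

# An uncertainty measure assigns a value to every event A ⊆ W,
# but P(Ø) = 0 and P(W) = 1 are fixed by the axioms.
measure_values = 2 ** n - 2

# A probability distribution assigns one mass per world; the
# masses sum to 1, so the last is determined by the others.
distribution_values = n - 1

print(measure_values, distribution_values)  # → 6 2
```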
For A, B ⊆ W, how is the conditional probability of A given B defined?
P(A|B) = P(A ∩ B) / P(B)
Define Bayes Theorem
P(A|B) = P(B|A)P(A) / P(B)
Define the theorem of total probability
P(A) = P(A|B)P(B) + P(A|B^c)P(B^c)
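Bayes' theorem and total probability combine naturally: expand the evidence term over the partition {A, A^c}, then invert the conditional. The numbers below are made up purely for illustration.

```python
p_A = 0.3              # prior P(A)    (illustrative value)
p_B_given_A = 0.9      # P(B|A)        (illustrative value)
p_B_given_Ac = 0.2     # P(B|A^c)      (illustrative value)

# Theorem of total probability: P(B) = P(B|A)P(A) + P(B|A^c)P(A^c)
p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B
```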
What is a prior in the context of probability theory?
A probability measure specified before any evidence is taken into account
What is Laplace’s principle of insufficient reason?
In the absence of any other information all possible worlds should be assumed to be equally probable
What two justifications are there for choosing probability to measure uncertainty?
Cox’s justification and willingness to bet (de Finetti, Ramsey, Kemeny)
Define Cox’s first axiom
The agent defines a conditional measure µ(A|B) ∈ [0,1] for all A, B ⊆ W with B ≠ Ø
Define Cox’s second axiom
If A ≠ Ø then µ(A|A) = 1
Define Cox’s third axiom
If B ≠ Ø then µ(A^∁|B) = i(µ(A|B)) for some continuous, strictly decreasing function i : [0,1] → [0,1]
Define Cox’s fourth axiom
If B ∩ C ≠ Ø then µ(A ∩ B|C) = F(µ(A|B ∩ C), µ(B|C)) for some continuous function F
Define Cox’s theorem
If Cox1,…,Cox4 hold then there is a continuous, strictly increasing, surjective function g : [0,1] → [0,1] such that g(µ(A|B)) = P(A|B) for some conditional probability measure P, i.e. g ∘ µ satisfies the axioms of probability
Define the Betting Justification membership function XA of set A ⊆ W
XA : W → {0,1} where XA(w) = 1 if w ∈ A and XA(w) = 0 otherwise
Define Bet1 of the Betting Justification
Gain S(1-p) if A is true and lose Sp if A is false
Define Bet2 of the Betting Justification
Lose S(1-p) if A is true and gain Sp if A is false
Under the Betting Justification what is the agent's gain if they pick Bet 1?
S(1-p)XA(w*) - Sp(1-XA(w*)) = S(XA(w*)-p), where w* is the actual world
Under the Betting Justification what is the agent's gain if they pick Bet 2?
Sp(1-XA(w*)) - S(1-p)XA(w*) = -S(XA(w*)-p), where w* is the actual world
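Both gain identities can be verified over the only two cases, XA(w*) ∈ {0, 1}; the stake S and price p below are arbitrary values chosen for the sketch.

```python
S, p = 10.0, 0.4           # stake and price (arbitrary values)

for x in (0, 1):           # x = XA(w*): is A true in the actual world?
    gain_bet1 = S * (1 - p) * x - S * p * (1 - x)
    gain_bet2 = S * p * (1 - x) - S * (1 - p) * x
    assert gain_bet1 == S * (x - p)    # Bet 1 gain: S(XA(w*) - p)
    assert gain_bet2 == -S * (x - p)   # Bet 2 gain: the exact opposite
```

Whatever the true world, the two bets are zero-sum, which is why the price p at which the agent is indifferent between them reveals its degree of belief in A.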
How many values must be defined to specify a joint probability distribution for n binary variables?
2^n - 1
Define the probability distribution for P(X1 = x1 | X2 = x2) assuming X1 and X2 are dependent
P(X1=x1, X2=x2) / P(X2=x2)
Define the probability distribution for P(X1 = x1 | X2= x2) assuming X1 is independent of X2
P(X1=x1)
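A conditional distribution can be read straight off a joint table by dividing by the relevant marginal; the joint probabilities below are assumed for illustration.

```python
# Joint distribution P(X1 = x1, X2 = x2) for two binary variables
# (illustrative values summing to 1).
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal P(X2 = 1) by summing out X1.
p_x2 = joint[(0, 1)] + joint[(1, 1)]

# Conditional: P(X1 = 1 | X2 = 1) = P(X1 = 1, X2 = 1) / P(X2 = 1)
p_x1_given_x2 = joint[(1, 1)] / p_x2
```

Were X1 independent of X2, this ratio would equal the marginal P(X1 = 1) and the table would not be needed.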
How many values must be defined to specify a joint probability distribution for n mutually independent binary variables?
n
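The two parameter counts, 2^n − 1 in general versus n under full independence, can be computed directly (n = 4 is an arbitrary choice):

```python
n = 4                          # number of binary variables (arbitrary)

# General joint: 2^n outcomes whose masses must sum to 1.
dependent_count = 2 ** n - 1

# Fully independent: one value P(X_i = 1) per variable suffices,
# since the joint factorises into a product of marginals.
independent_count = n

print(dependent_count, independent_count)  # → 15 4
```

The gap between the two counts is the practical payoff of independence assumptions: they shrink an exponential specification to a linear one.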
What is the implicit assumption when Conditional Independence is used?
That some information is irrelevant: given the conditioning variables, the remaining variables carry no additional information about the event of interest