Chapter 4 - Introduction to Probability p.173 Flashcards
Addition law p.191
A probability law used to compute the probability of the union of two events. It is
P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
For mutually exclusive events, P(A ∩ B) = 0; in this case the addition law reduces to P(A ∪ B) = P(A) + P(B)
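A minimal Python sketch of the addition law, using example probabilities assumed only for illustration:

# Addition law: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
p_a, p_b, p_a_and_b = 0.40, 0.30, 0.10   # assumed example values
p_a_or_b = p_a + p_b - p_a_and_b         # = 0.60
# For mutually exclusive events P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B)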
Basic requirements for assigning probabilities p.180
Two requirements that restrict the manner in which the probability assignments can be made:
1. For each experimental outcome Ei we must have 0 <= P(Ei) <= 1.
2. Considering all experimental outcomes, we must have P(E1) + P(E2) + . . . + P(En) = 1.0
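A quick Python sketch checking both requirements for an assumed assignment over three outcomes:

probabilities = [0.2, 0.5, 0.3]                 # assumed probabilities for E1, E2, E3
assert all(0 <= p <= 1 for p in probabilities)  # requirement 1: each P(Ei) is in [0, 1]
assert abs(sum(probabilities) - 1.0) < 1e-9     # requirement 2: the probabilities sum to 1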
Bayes’ theorem p.204
A method used to compute posterior probabilities.
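In its standard two-event form, Bayes' theorem is P(A1 | B) = P(A1)P(B | A1) / [P(A1)P(B | A1) + P(A2)P(B | A2)]. A minimal Python sketch with assumed prior and conditional probabilities:

p_a1, p_a2 = 0.65, 0.35                   # assumed prior probabilities for A1, A2
p_b_given_a1, p_b_given_a2 = 0.02, 0.05   # assumed conditional probabilities of B
posterior_a1 = (p_a1 * p_b_given_a1) / (p_a1 * p_b_given_a1 + p_a2 * p_b_given_a2)
print(round(posterior_a1, 4))             # 0.4262, the revised (posterior) probability of A1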
Classical method p.180
A method of assigning probabilities that is appropriate when all the experimental outcomes are equally likely.
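A brief sketch under the classical method, assuming a fair six-sided die so each of the n equally likely outcomes is assigned probability 1/n:

n = 6                      # assumed experiment: rolling a fair die
p_each_outcome = 1 / n     # classical method assigns 1/6 ≈ 0.167 to every outcome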
Combination p.179
In an experiment we may be interested in determining the number of ways n objects may be selected from among N objects without regard to the order in which the n objects are selected. Each selection of n objects is called a combination and the total number of combinations of N objects taken n at a time is
C(N, n) = N!/(n!(N - n)!) for n = 0, 1, 2, …, N.
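Python's standard library provides this count directly; a quick check of math.comb against the formula, using an assumed example of choosing 2 objects from 5:

import math

N, n = 5, 2
print(math.comb(N, n))                                                   # 10
print(math.factorial(N) // (math.factorial(n) * math.factorial(N - n)))  # 10, same result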
Complement of A p.189
The event consisting of all sample points that are not in A
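Since A and its complement together contain every sample point, P(A^c) = 1 - P(A); a one-line sketch with an assumed value:

p_a = 0.30                    # assumed P(A)
p_a_complement = 1 - p_a      # P(A^c) = 0.70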
Conditional probability p.196
The probability of an event given that another event has already occurred. The conditional probability of A given B is
P(A | B) = P(A ∩ B)/P(B)
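A minimal sketch of the conditional probability formula, with assumed joint and marginal values:

p_a_and_b = 0.10               # assumed P(A ∩ B)
p_b = 0.40                     # assumed P(B)
p_a_given_b = p_a_and_b / p_b  # P(A | B) = 0.10/0.40 = 0.25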
Event
A collection of sample points.
Experiment
A process that generates well-defined outcomes.
Independent events p.199
Two events A and B where P(A|B) = P(A) or P(B|A) = P(B); that is, the events have no influence on each other.
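A sketch of an independence check under assumed probabilities, using the P(A | B) = P(A) condition:

p_a = 0.5                                      # assumed marginal probability P(A)
p_a_given_b = 0.5                              # assumed conditional probability P(A | B)
independent = abs(p_a_given_b - p_a) < 1e-9    # True: knowing B does not change the probability of A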
Intersection of A and B p.191
The event containing the sample points belonging to both A and B.
The intersection is denoted A ∩ B.
Joint probability p.197
The probability of two events both occurring; that is, the probability of the intersection of two events.
Marginal probability p.197
The values in the margins of a joint probability table that provide the probabilities of each event separately.
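A sketch of a small joint probability table with assumed values, showing marginal probabilities as row and column sums:

# Rows are events A1, A2; columns are events B1, B2 (assumed joint probabilities)
joint = [[0.20, 0.30],
         [0.25, 0.25]]
marginal_a = [sum(row) for row in joint]          # P(A1), P(A2)
marginal_b = [sum(col) for col in zip(*joint)]    # P(B1), P(B2)
print([round(p, 2) for p in marginal_a])          # [0.5, 0.5]
print([round(p, 2) for p in marginal_b])          # [0.45, 0.55]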
Multiple-step experiment p.176
An experiment that can be described as a sequence of steps. If a multiple-step experiment has k steps, with n1 possible outcomes on the first step, n2 possible outcomes on the second step, and so on, the total number of experimental outcomes is given by (n1)(n2) … (nk)
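A sketch of this counting rule with assumed step counts:

import math

step_outcomes = [2, 3, 4]                  # assumed: n1 = 2, n2 = 3, n3 = 4
total_outcomes = math.prod(step_outcomes)  # (n1)(n2)(n3) = 24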
Multiplication law p.199
A probability law used to compute the probability of the intersection of two events. It is
P(A ∩ B) = P(B)P(A | B), or
P(A ∩ B) = P(A)P(B | A).
For independent events it reduces to
P(A ∩ B) = P(A)P(B).
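A minimal sketch of the multiplication law with assumed values, including the independent-events special case:

p_b = 0.40                       # assumed P(B)
p_a_given_b = 0.25               # assumed P(A | B)
p_a_and_b = p_b * p_a_given_b    # P(A ∩ B) = P(B)P(A | B) = 0.10
# If A and B are independent, P(A | B) = P(A), so the law becomes P(A ∩ B) = P(A)P(B)
p_a = 0.25                       # assumed P(A) for the independent case
p_a_and_b_indep = p_a * p_b      # = 0.10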
Mutually exclusive events p.193
Events that have no sample points in common; that is, A ∩ B is empty and P(A ∩ B) = 0.
Permutation p.179
In an experiment we may be interested in determining the number of ways n objects may be selected from among N objects when the order in which the n objects are selected is important. Each ordering of n objects is called a permutation and the total number of permutations of N objects taken n at a time is
Permutations(N, n) = n!C(N, n) = N!/(N - n)! for n = 0, 1, 2, …, N.
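As with combinations, the permutation count is in Python's standard library; a quick check of math.perm against the formula for an assumed example of ordering 2 objects chosen from 5:

import math

N, n = 5, 2
print(math.perm(N, n))                             # 20
print(math.factorial(N) // math.factorial(N - n))  # 20, same result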
Posterior probabilities p.205
Revised probabilities of events based on additional information.