Exam 2 Flashcards
Assigning Probabilities: Classical Approach
Assigning probabilities based on the assumption of equally likely outcomes
Assigning Probabilities: Relative Frequency Approach
Assigning probabilities based on experimentation or historical data
Assigning Probabilities: Subjective Approach
Assigning probabilities based on judgment
Classical Approach example
If an experiment has n possible outcomes, this method assigns a probability of 1/n to each outcome, so it is necessary to determine the number of possible outcomes.
Experiment: rolling a die
Outcomes: {1, 2, 3, 4, 5, 6}
Probabilities: each sample point has a 1/6 chance of occurring
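The classical assignment above can be sketched in a few lines of Python (not part of the original notes; the die example is reused from the card):

```python
from fractions import Fraction

# Classical approach: with n equally likely outcomes,
# each outcome is assigned probability 1/n.
outcomes = [1, 2, 3, 4, 5, 6]  # rolling a fair die
n = len(outcomes)
probabilities = {o: Fraction(1, n) for o in outcomes}

print(probabilities[1])             # -> 1/6
print(sum(probabilities.values()))  # -> 1 (probabilities sum to 1)
```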
Relative Frequency Method
Each probability assignment is given by dividing the frequency (number of days) by the total frequency (total number of days).

Subjective Approach
“In the subjective approach we define probability as the degree of belief that we hold in the occurrence of an event”
Example: Weather forecasting
It’s a subjective probability based on past observations combined with current weather conditions
– based on current conditions, there is a 60% chance of rain (say)
Complement of an Event
The complement of event A is defined to be the event consisting of all sample points that are “not in A”.
Complement of A is denoted by Ac
A Venn diagram can be used to illustrate the concept of a complement.
P(A) + P(Ac ) = 1
Intersection of events
The intersection of events A and B is the set of all sample points that are in both A and B.

The intersection is denoted: A and B, or A ∩ B
The joint probability of A and B is the probability of the intersection of A and B: P(A and B)
Union of two events
The union of two events A and B, is the event containing all sample points that are in A or B or both:

Union of A and B is denoted: A or B (A ∪ B)
Mutually Exclusive Events
When two events are mutually exclusive (that is, the two events cannot occur together), their joint probability is 0, hence:
P(A and B) = 0
Mutually exclusive: no points in common
For example, in a toss of two dice, A = tosses totaling 7 and B = tosses totaling 11
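The two-dice example can be checked by enumerating all 36 equally likely outcomes (a sketch, not from the original notes):

```python
from fractions import Fraction
from itertools import product

# Enumerate the 36 equally likely outcomes of tossing two dice.
rolls = list(product(range(1, 7), repeat=2))

A = {r for r in rolls if sum(r) == 7}   # tosses totaling 7
B = {r for r in rolls if sum(r) == 11}  # tosses totaling 11

p = lambda event: Fraction(len(event), len(rolls))

print(p(A))      # -> 1/6  (6 of 36 outcomes)
print(p(B))      # -> 1/18 (2 of 36 outcomes)
print(p(A & B))  # -> 0    (mutually exclusive: no common points)
```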

Conditional Probability
Conditional probability is used to determine how two events are related; that is, we can determine the probability of one event given the occurrence of another related event.

Conditional probabilities are written as P(A | B), read as “the probability of A given B”, and calculated as:
P(A | B) = P(A and B) / P(B)
Independence
Two events are dependent if the occurrence of one affects the probability of the other.
One of the objectives of calculating conditional probability is to determine whether two events are related.
In particular, we would like to know whether they are independent, that is, if the probability of one event is not affected by the occurrence of the other event.
Two events A and B are said to be independent if
P(A|B) = P(A)
or
P(B|A) = P(B)
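The independence condition P(A | B) = P(A) can be verified by enumeration. A minimal sketch (the two-dice events here are hypothetical, chosen only to illustrate the check):

```python
from fractions import Fraction
from itertools import product

# Check P(A | B) = P(A) for two events on a pair of dice.
rolls = set(product(range(1, 7), repeat=2))

A = {r for r in rolls if r[0] == 1}      # first die shows 1
B = {r for r in rolls if r[1] % 2 == 0}  # second die is even

p = lambda e: Fraction(len(e), len(rolls))
p_A_given_B = p(A & B) / p(B)            # conditional probability

print(p_A_given_B == p(A))  # -> True: A and B are independent
```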
Three probability rules and trees
3 rules that enable us to calculate the probability of more complex events from the probability of simpler events:
The Complement Rule
The Multiplication Rule
The Addition Rule
Probability Rule: Complement Rule
The complement rule gives us the probability of an event NOT occurring
P(Ac) = 1 – P(A)
Example:
In the simple roll of a die, the probability of the number “1” being rolled is 1/6
The probability that some number other than “1” will be rolled is 1 – 1/6 = 5/6.
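A one-line check of the die example, using exact fractions (a sketch, not part of the original card):

```python
from fractions import Fraction

# Complement rule: P(not A) = 1 - P(A).
p_roll_1 = Fraction(1, 6)   # P(rolling a "1")
p_not_1 = 1 - p_roll_1      # P(rolling anything else)

print(p_not_1)  # -> 5/6
```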
Probability Rule: Multiplication Rule
The multiplication rule is used to calculate the joint probability of two events. It is based on the formula for conditional probability defined earlier:
P(A | B) = P(A and B) / P(B)
If we multiply both sides of the equation by P(B) we have:
P(A and B) = P(A | B) • P(B)
Likewise, P(A and B) = P(B | A) • P(A)
If A and B are independent events, then P(A and B) = P(A)•P(B)
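A short sketch of the multiplication rule P(A and B) = P(A | B) • P(B). The card-drawing scenario is a hypothetical example added here, not taken from the notes:

```python
from fractions import Fraction

# Drawing two cards without replacement:
# B = first card is an ace, A = second card is an ace.
p_B = Fraction(4, 52)          # P(first card is an ace)
p_A_given_B = Fraction(3, 51)  # P(second ace | first was an ace)

# Multiplication rule: P(A and B) = P(A | B) * P(B)
p_A_and_B = p_A_given_B * p_B

print(p_A_and_B)  # -> 1/221
```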

Probability Rule: Addition Rule
Recall: the addition rule is used to compute the probability of event A or B or both A and B occurring; i.e., the union of A and B:
P(A or B) = P(A) + P(B) – P(A and B)
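The addition rule can be verified by enumerating two-dice outcomes (a sketch with hypothetical events, not from the original notes):

```python
from fractions import Fraction
from itertools import product

# Verify P(A or B) = P(A) + P(B) - P(A and B) by enumeration.
rolls = set(product(range(1, 7), repeat=2))
A = {r for r in rolls if sum(r) == 7}  # total is 7
B = {r for r in rolls if r[0] == 1}    # first die shows 1

p = lambda e: Fraction(len(e), len(rolls))

print(p(A | B) == p(A) + p(B) - p(A & B))  # -> True
```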
Random Variable
A random variable is a function or rule that assigns a number to each outcome of an experiment.
Discrete Random Variable
Discrete Random Variable
– one that takes on a countable number of values
– E.g. values on the roll of dice: 2, 3, 4, …, 12
Continuous Random Variable
Continuous Random Variable
– one whose values are not discrete, not countable
– E.g. time (30.1 minutes? 30.10000001 minutes?)
Binomial Distribution
The binomial distribution is the probability distribution that
results from doing a “binomial experiment”.
Binomial experiments have the following properties:
Fixed number of trials, represented as n.
Each trial has two possible outcomes, a “success” and a “failure”.
P(success) = p (and thus P(failure) = 1 – p), for all trials.
The trials are independent, which means that the outcome of one trial does not affect the outcomes of any other trials.
Conditions of a Binomial Experiment:
There is a fixed finite number of trials (n=10).
An answer can be either correct or incorrect.
The probability of a correct answer (P(success) = .20) does not change from question to question.
Each answer is independent of the others.
Poisson Random Variable
The Poisson random variable is the number of successes that occur in a period of time or an interval of space in a Poisson experiment.
E.g. On average, 96 (successes) trucks arrive at a border crossing every hour (time period).
E.g. The number of typographic errors in a new textbook edition averages 1.5 per 100 pages.
Binomial Distribution Example
The quiz consists of 10 multiple-choice questions. Each question has five possible answers, only one of which is correct. Pat plans to guess the answer to each question.
Algebraically then: n=10, and P(success) = 1/5 = .20
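The binomial probabilities for the quiz example can be computed directly from the pmf formula P(X = k) = C(n, k) p^k (1 − p)^(n − k), using only the standard library (a sketch added here, not part of the original card):

```python
from math import comb

# Quiz example from the notes: n = 10 questions,
# p = 0.20 chance of guessing each one correctly.
n, p = 10, 0.20

def binom_pmf(k, n, p):
    """P(exactly k successes in n independent trials)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(round(binom_pmf(0, n, p), 4))  # P(no correct answers) ~= 0.1074
print(round(sum(binom_pmf(k, n, p) for k in range(n + 1)), 4))  # -> 1.0
```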
Poisson Distribution
A statistics instructor has observed that the number of typographical errors in new editions of textbooks varies considerably from book to book. After some analysis he concludes that the number of errors is Poisson distributed with a mean of 1.5 per 100 pages. The instructor randomly selects 100 pages of a new book. What is the probability that there are no typos?
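The typo question can be answered from the Poisson pmf P(X = k) = μ^k e^(−μ) / k! with μ = 1.5; for k = 0 this reduces to e^(−1.5). A sketch (not part of the original card):

```python
from math import exp, factorial

# Poisson example from the notes: mean of 1.5 typos per 100 pages.
mu = 1.5

def poisson_pmf(k, mu):
    """P(exactly k occurrences when the mean is mu)."""
    return mu**k * exp(-mu) / factorial(k)

print(round(poisson_pmf(0, mu), 4))  # P(no typos) = e^(-1.5) ~= 0.2231
```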