BA Chapter 5a (pages 126-142) Flashcards
Probability, Experiment, and Outcome
Probability is the likelihood that an outcome occurs.
An experiment is the process that results in an outcome.
The outcome of an experiment is a result that we observe.
Empirical Probability Distribution (also known as relative frequency)
Use sample data to approximate a probability distribution.
Bernoulli Distribution
The Bernoulli distribution is a discrete distribution having two possible outcomes labelled by n = 0 and n = 1 in which n = 1 (“success”) occurs with probability p and n = 0 (“failure”) occurs with probability q = 1 - p.
Binomial Distribution
n independent replications of a Bernoulli trial with
- Only two possible outcomes per trial: "success" with probability p; "failure" with probability q = 1 - p
- Constant probability of success on each trial
- Fixed number of trials (n)
- Independent trials
and x = number of successes
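The binomial probability of x successes in n trials can be sketched with the standard formula P(X = x) = C(n, x) · pˣ · (1 - p)ⁿ⁻ˣ; the n, x, and p values below are assumed for illustration:

```python
from math import comb

def binomial_pmf(x: int, n: int, p: float) -> float:
    """P(X = x) for a Binomial(n, p) random variable."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example: probability of exactly 3 successes in 10 trials with p = 0.5
print(round(binomial_pmf(3, 10, 0.5), 4))  # 0.1172
```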
Complement of an Event
Complement = Opposite
Aᶜ is the complement (opposite) of A. It consists of all outcomes in the sample space that are not in A. Aᶜ is also written as A′.
Aᶜ = {not A}
P(Aᶜ) = 1 − P(A)
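A quick Python sketch of the complement rule (P(A) = 0.3 is an assumed value):

```python
# Complement rule: P(not A) = 1 - P(A)
p_a = 0.3               # assumed probability of event A
p_a_complement = 1 - p_a
print(p_a_complement)   # P(not A) when P(A) = 0.3
```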
Marginal, Joint, & Conditional Probability
- Marginal probability – probability associated with a single outcome of a random variable (a simple, single event).
- P(A)
- Joint probability – probability of outcomes of 2 or more random variables.
- P(A and B)
- Conditional probability – the probability of the occurrence of A, given that B has already occurred.
- P(A | B) = P(A and B) / P(B); Bayes' Theorem is derived from this definition.
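The conditional probability formula can be sketched in Python; the joint and marginal probabilities below are assumed values for illustration:

```python
# Assumed probabilities for two events A and B
p_a_and_b = 0.12   # joint probability P(A and B)
p_b = 0.40         # marginal probability P(B)

# Conditional probability: P(A | B) = P(A and B) / P(B)
p_a_given_b = p_a_and_b / p_b
print(p_a_given_b)
```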
- A Random Variable is a numerical description of the outcome of an experiment.
- 2 Types of Random Variables: Discrete and Continuous Random Variable
A discrete random variable is one for which the number of possible outcomes can be counted.
- outcomes of dice rolls
- whether a customer likes or dislikes a product
- number of hits on a Web site link today
A continuous random variable has outcomes over one or more continuous intervals of real numbers.
- weekly change in DJIA
- daily temperature
- time between machine failures
Cumulative Distribution Function
F(x) = P(X ≤ x)
- ≤ sums the probabilities of all outcomes up to and including x.
- < sums the probabilities of all outcomes below x, excluding the probability for x.
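For a discrete random variable, F(x) = P(X ≤ x) is just the running sum of the pmf; the distribution below is hypothetical:

```python
# A small discrete distribution (hypothetical pmf: outcome -> probability)
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

def cdf(x: float) -> float:
    """F(x) = P(X <= x): sum the pmf over all outcomes k <= x."""
    return sum(p for k, p in pmf.items() if k <= x)

print(cdf(2))  # P(X <= 2) = 0.2 + 0.5
```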
- Sample Space
- Event
- Independent Events
- Mutually Exclusive Events
- Sample space is the collection of all possible outcomes of an experiment. For example, rolling a single die has outcomes 1, 2, 3, 4, 5, 6.
- An event is a collection of one or more outcomes from a sample space.
- Independent Events example: successive rolls of a pair of dice are independent events, since each roll does not depend on the previous rolls.
- Two events are mutually exclusive if only one of them can occur at a time, never both at the same time.
Expected Value
The mean of a discrete distribution is referred to as the Expected Value or E(X). It is a weighted average where the weights are the probabilities. The Expected Value of a discrete random variable is the probability-weighted average of all possible values.
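The probability-weighted average can be sketched in one line of Python; the distribution below is a hypothetical example:

```python
# Expected value of a discrete random variable: E(X) = sum of x * P(x)
pmf = {0: 0.25, 1: 0.50, 2: 0.25}  # hypothetical pmf: value -> probability
expected_value = sum(x * p for x, p in pmf.items())
print(expected_value)  # 0*0.25 + 1*0.50 + 2*0.25
```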
Exponential Distribution
Exponential Distribution is the probability distribution that describes the time between randomly occurring events, i.e. a process in which events occur continuously and independently at a constant average rate.
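The exponential CDF, P(T ≤ t) = 1 − e^(−λt), gives the probability the next event occurs within time t; the rate below is an assumed value:

```python
import math

def exponential_cdf(t: float, rate: float) -> float:
    """P(T <= t) for the time between events occurring at a constant average rate."""
    return 1 - math.exp(-rate * t)

# Probability the next event occurs within 2 time units, assuming rate = 0.5 events/unit
print(round(exponential_cdf(2, 0.5), 4))  # 0.6321
```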
Goodness of Fit
- The basis for fitting data to a probability distribution.
- Attempts to draw conclusions about the nature of the distribution.
- 3 statistics measure goodness of fit:
- Chi-square
- Kolmogorov-Smirnov
- Anderson-Darling
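The chi-square statistic above can be sketched as the sum of (observed − expected)² / expected over all categories; the counts below are hypothetical:

```python
# Chi-square goodness-of-fit statistic for observed vs. expected counts
# (hypothetical counts for a six-sided die rolled 60 times)
observed = [8, 12, 9, 11, 10, 10]
expected = [10] * 6  # fair die: 60 rolls / 6 faces

chi_square = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(chi_square)  # smaller values indicate a better fit
```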
Addition and Multiplication Rule of Probability
- Addition Rule 1
- When two events, A and B, are mutually exclusive, the probability that A or B will occur is the sum of the probability of each event:
P(A or B) = P(A) + P(B)
- Addition Rule 2
- When two events, A and B, are non-mutually exclusive, the probability that A or B will occur is:
P(A or B) = P(A) + P(B) - P(A and B)
- Multiplication Rule for Independent Events
- A method for finding the probability that both of two events occur.
- If 2 events are independent, then:
P(A and B) = P(A) * P(B)
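Both rules can be sketched together for two independent, non-mutually-exclusive events; the probabilities below are assumed values:

```python
# Assumed probabilities for two independent, non-mutually-exclusive events
p_a, p_b = 0.5, 0.4

# Multiplication rule (independent events): P(A and B) = P(A) * P(B)
p_a_and_b = p_a * p_b

# Addition rule 2 (non-mutually-exclusive): P(A or B) = P(A) + P(B) - P(A and B)
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_and_b, p_a_or_b)
```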
Normal Distribution
- f(x) is a bell-shaped curve
- Characterized by 2 parameters:
- μ (mean, location parameter)
- σ² (variance, scale parameter)
- The line down the middle is the mean μ
- The X-axis is measured in terms of the number of standard deviations σ from the mean μ.
- Other Properties:
- Symmetric about the mean (Mean = Median = Mode)
- Unbounded (goes to infinity on both ends)
- Empirical Rule applies: 68% of data fall within μ ± 1σ, 95% fall within μ ± 2σ, and 99.7% fall within μ ± 3σ.
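The 68% figure in the Empirical Rule can be checked with the normal CDF, which the standard library expresses via the error function; μ and σ below are assumed values:

```python
import math

def normal_cdf(x: float, mu: float, sigma: float) -> float:
    """P(X <= x) for a normal distribution, computed via the error function."""
    return 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# Empirical Rule check: P(mu - 1σ <= X <= mu + 1σ), with assumed mu = 100, sigma = 15
mu, sigma = 100, 15
p = normal_cdf(mu + sigma, mu, sigma) - normal_cdf(mu - sigma, mu, sigma)
print(round(p, 4))  # 0.6827
```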
Standard Normal Distribution
- Z is the standard normal random variable with:
- μ = 0
- σ = 1
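Any normal random variable can be converted to the standard normal scale with z = (x − μ) / σ; the values below are assumed for illustration:

```python
def z_score(x: float, mu: float, sigma: float) -> float:
    """Standardize x: the number of standard deviations x lies from the mean."""
    return (x - mu) / sigma

# Example with assumed values: x = 130, mean 100, standard deviation 15
print(z_score(130, 100, 15))  # 2.0
```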