BA Chapter 5a (pages 126-142) Flashcards

1
Q

Probability, Experiment, and Outcome

A

Probability is the likelihood that an outcome occurs.

An experiment is the process that results in an outcome.

The outcome of an experiment is a result that we observe.

2
Q

Empirical Probability Distribution (also known as relative frequency)

A

Uses the relative frequencies observed in sample data to approximate the underlying probability distribution.
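
A minimal sketch of building an empirical distribution from sample data (plain Python; the sample values are invented for illustration):

    from collections import Counter

    # Hypothetical sample of observed daily order counts
    sample = [2, 3, 3, 1, 2, 4, 3, 2, 2, 3]

    # Relative frequency of each observed value approximates P(X = x)
    counts = Counter(sample)
    empirical = {x: n / len(sample) for x, n in sorted(counts.items())}
    print(empirical)  # {1: 0.1, 2: 0.4, 3: 0.4, 4: 0.1}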

3
Q

Bernoulli Distribution

A

The Bernoulli distribution is a discrete distribution with two possible outcomes, labelled x = 0 and x = 1, in which x = 1 (“success”) occurs with probability p and x = 0 (“failure”) occurs with probability q = 1 − p.

4
Q

Binomial Distribution

A

n independent replications of a Bernoulli trial with

  • Only two possible outcomes per trial: “success” (probability p) or “failure” (probability q = 1 − p)
  • Constant probabilities on each trial
  • Fixed number of trials (n)
  • Independent trials

x = the number of successes in the n trials
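
A rough illustration of the binomial PMF and CDF (assumes Python with scipy available; n = 10 and p = 0.3 are arbitrary example values):

    from scipy.stats import binom

    n, p = 10, 0.3              # fixed number of trials, constant success probability
    print(binom.pmf(3, n, p))   # P(exactly 3 successes)
    print(binom.cdf(3, n, p))   # P(3 or fewer successes)
    print(binom.mean(n, p))     # expected number of successes = n * p = 3.0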

5
Q

Complement of an Event

A

Complement = Opposite

Aᶜ is the complement (opposite) of A. It consists of all outcomes in the sample space that are not in A. Aᶜ is also written as A′.

Aᶜ = {not A}

P(Aᶜ) = 1 − P(A)

6
Q

Marginal, Joint, & Conditional Probability

A
  • Marginal probability – probability associated with a single outcome of a random variable (a simple, single event).
    • P(A)
  • Joint probability – probability of outcomes of 2 or more random variables occurring together.
    • P(A and B)
  • Conditional probability – the probability of the occurrence of A, given that B has already occurred.
    • P(A | B) = P(A and B) / P(B), the definition from which Bayes’ Theorem is derived.
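
A small worked sketch of all three probabilities from a joint table (plain Python; the joint probabilities are invented for illustration):

    # Hypothetical joint probabilities for events A and B
    p_A_and_B = 0.20
    p_A_and_notB = 0.30
    p_notA_and_B = 0.10
    p_notA_and_notB = 0.40

    p_A = p_A_and_B + p_A_and_notB    # marginal P(A) = 0.5
    p_B = p_A_and_B + p_notA_and_B    # marginal P(B) = 0.3
    p_A_given_B = p_A_and_B / p_B     # conditional P(A | B) ~ 0.667
    print(p_A, p_B, p_A_given_B)
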
7
Q
  • A Random Variable is a numerical description of the outcome of an experiment.
  • 2 Types of Random Variables: Discrete and Continuous Random Variable
A

A discrete random variable is one for which the number of possible outcomes can be counted.

  • outcomes of dice rolls
  • whether a customer likes or dislikes a product
  • number of hits on a Web site link today

A continuous random variable has outcomes over one or more continuous intervals of real numbers.

  • weekly change in DJIA
  • daily temperature
  • time between machine failures
8
Q

Cumulative Distribution Function

A

F(x) = P(X ≤ x)

  • ≤ sums the probabilities for all values up to and including x (the probability for x is included).
  • < sums the probabilities for all values strictly below x (the probability for x is excluded).
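
To see the ≤ vs < distinction for a discrete variable, a quick sketch with scipy's binomial distribution (scipy assumed installed; parameters arbitrary):

    from scipy.stats import binom

    n, p, x = 10, 0.5, 4
    print(binom.cdf(x, n, p))       # F(4) = P(X <= 4), includes P(X = 4)
    print(binom.cdf(x - 1, n, p))   # P(X < 4) = P(X <= 3), excludes P(X = 4)
    print(binom.pmf(x, n, p))       # the difference between the two lines above
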
9
Q
  • Sample Space
  • Event
  • Independent Events
  • Mutually Exclusive Events
A
  • Sample space is the collection of all possible outcomes of an experiment. For example, rolling a single die has the outcomes 1, 2, 3, 4, 5, 6.
  • An event is a collection of one or more outcomes from the sample space.
  • Two events are independent if the occurrence of one does not affect the probability of the other. Example: successive dice rolls are independent events, since each roll does not depend on the previous rolls.
  • Two events are mutually exclusive if only one of them can occur at a time, never both at the same time.
10
Q

Expected Value

A

The mean of a discrete distribution is referred to as the Expected Value, E(X). It is the probability-weighted average of all possible values: a weighted average in which the weights are the probabilities, E(X) = Σ xᵢ P(xᵢ).
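
A minimal sketch of the probability-weighted average (plain Python; the distribution is made up for illustration):

    # Hypothetical discrete distribution: value -> probability
    dist = {0: 0.1, 1: 0.3, 2: 0.4, 3: 0.2}

    expected_value = sum(x * p for x, p in dist.items())
    print(expected_value)   # 0*0.1 + 1*0.3 + 2*0.4 + 3*0.2 = 1.7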

11
Q

Exponential Distribution

A

Exponential Distribution is the probability distribution that describes the time between randomly occurring events, i.e. a process in which events occur continuously and independently at a constant average rate.
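
A rough sketch using scipy (assumed available); note that scipy parameterizes the exponential by scale = 1/λ, and the rate λ = 4 events per hour is an invented example:

    from scipy.stats import expon

    lam = 4                      # hypothetical rate: 4 events per hour on average
    dist = expon(scale=1 / lam)  # scipy uses scale = 1 / lambda
    print(dist.mean())           # mean time between events = 0.25 hours
    print(dist.cdf(0.5))         # P(next event arrives within 0.5 hours)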

12
Q

Goodness of Fit

A
  • The basis for judging how well a fitted probability distribution matches sample data.
  • Attempts to draw conclusions about the nature of the underlying distribution.
  • 3 statistics measure goodness of fit:
    • Chi-square
    • Kolmogorov-Smirnov
    • Anderson-Darling
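
An illustrative sketch of one of these tests, Kolmogorov-Smirnov, against a fitted normal distribution (assumes Python with numpy/scipy; the sample is simulated, so the test should not reject):

    import numpy as np
    from scipy.stats import kstest

    rng = np.random.default_rng(42)
    sample = rng.normal(loc=10, scale=2, size=200)   # simulated data

    # Fit a normal distribution and test how well it matches the sample
    mu, sigma = sample.mean(), sample.std(ddof=1)
    stat, p_value = kstest(sample, 'norm', args=(mu, sigma))
    print(stat, p_value)   # large p-value -> no evidence of poor fit
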
13
Q

Addition and Multiplication Rule of Probability

A
  • Addition Rule 1
    • When two events, A and B, are mutually exclusive, the probability that A or B will occur is the sum of the probability of each event:

P(A or B) = P(A) + P(B)

  • Addition Rule 2
    • When two events, A and B, are non-mutually exclusive, the probability that A or B will occur is:

P(A or B) = P(A) + P(B) - P(A and B)

  • Multiplication Rule for Independent Events
    • A method for finding the probability that two events both occur.
    • If 2 events are independent, then:

P(A and B) = P(A) × P(B)
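
A quick numeric check of both rules using a fair die (plain Python; the events are chosen for illustration):

    # One roll of a fair die
    p_even = 3 / 6                 # A = {2, 4, 6}
    p_at_most_2 = 2 / 6            # B = {1, 2}
    p_even_and_at_most_2 = 1 / 6   # A and B = {2}

    # Addition rule 2 (A and B are not mutually exclusive)
    p_even_or_at_most_2 = p_even + p_at_most_2 - p_even_and_at_most_2
    print(p_even_or_at_most_2)     # 4/6: outcomes {1, 2, 4, 6}

    # Multiplication rule for independent events: two separate rolls
    print(p_even * p_even)         # P(even on roll 1 and even on roll 2) = 0.25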

14
Q

Normal Distribution

A
  • f(x) is a bell-shaped curve
  • Characterized by 2 parameters:
    • μ (mean, location parameter)
    • σ² (variance, scale parameter)
  • The line down the middle is the mean μ
  • The X-axis is measured in terms of the number of standard deviations σ from the mean μ.
  • Other properties:
    • Symmetric about the mean (Mean = Median = Mode)
    • Unbounded (extends to infinity in both directions)
    • Empirical rule applies: 68% of the data fall within μ ± 1σ, 95% within μ ± 2σ, and 99.7% within μ ± 3σ.
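
A sketch that checks the empirical rule with scipy's normal CDF (scipy assumed; μ and σ are arbitrary, since the rule holds for any normal distribution):

    from scipy.stats import norm

    mu, sigma = 100, 15   # arbitrary mean and standard deviation
    for k in (1, 2, 3):
        prob = norm.cdf(mu + k * sigma, mu, sigma) - norm.cdf(mu - k * sigma, mu, sigma)
        print(f"within {k} sd: {prob:.4f}")   # ~0.6827, 0.9545, 0.9973
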
15
Q

Standard Normal Distribution

A
  • Z is the standard normal random variable with:
    • μ = 0
    • σ = 1
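
A brief sketch of standardizing a value to a z-score and using the standard normal CDF (scipy assumed; x, μ, σ are illustrative):

    from scipy.stats import norm

    x, mu, sigma = 130, 100, 15
    z = (x - mu) / sigma            # standardize: z = (x - mu) / sigma
    print(z)                        # 2.0 standard deviations above the mean
    print(norm.cdf(z))              # P(Z <= 2) under the standard normal
    print(norm.cdf(x, mu, sigma))   # same probability without standardizing
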
16
Q

Poisson Distribution

A
  • Models the number of occurrences in some unit of measure (often time or distance).
  • There is no limit on the number of occurrences and they are independent of each other.
  • This distribution can take on any non-negative integer value.
  • Poisson distribution approximates the binomial distribution when n is large and p is small.
  • E[X] = λ (lambda), the average number of occurrences per unit of measure.
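
A minimal sketch with scipy (assumed available), including the binomial approximation for large n and small p with λ = n·p:

    from scipy.stats import poisson, binom

    lam = 3                      # hypothetical average of 3 occurrences per hour
    print(poisson.pmf(5, lam))   # P(exactly 5 occurrences)
    print(poisson.mean(lam))     # E[X] = lambda = 3.0

    # Poisson approximates the binomial when n is large and p is small (lambda = n*p)
    n, p = 1000, 0.003
    print(binom.pmf(5, n, p))    # close to the Poisson probability above
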
17
Q

Probability Density Function

A
  • A curve described by a mathematical function that characterizes a continuous random variable. A probability density function (PDF) is a function that describes the relative likelihood for this continuous variable to take on a given value.
  • Properties of a probability density function:
    • f(x) ≥ 0 for all values of x
    • Total area under the density function equals 1
    • P(X = x) = 0
    • Probabilities are only defined over an interval
    • P(a ≤ X ≤ b) is the area under the density function between a and b.
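
A sketch verifying these properties numerically for a normal density (scipy assumed; parameters arbitrary): the total area under the curve is 1, and P(a ≤ X ≤ b) is the area between a and b.

    from scipy.stats import norm
    from scipy.integrate import quad

    mu, sigma = 0, 1
    pdf = lambda x: norm.pdf(x, mu, sigma)

    total_area, _ = quad(pdf, -10, 10)   # effectively the whole real line
    print(total_area)                    # ~1.0

    a, b = -1, 1
    area_ab, _ = quad(pdf, a, b)         # P(a <= X <= b)
    print(area_ab)                       # ~0.6827, matches norm.cdf(b) - norm.cdf(a)
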
18
Q
  • Random Numbers
  • Random number seed
  • Random variable
A
  • Random numbers are the basis for generating random samples from probability distributions. Random numbers are uniformly distributed on the interval from 0 to 1.
  • Random number seed is a value from which a stream of random numbers is generated. By specifying the same seed, you can produce the same random numbers at a later time.
  • A random variable is a numerical description of the outcome of an experiment; it assigns a real number to each outcome.
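
A small sketch with numpy (assumed installed) showing uniform [0, 1) random numbers and seed reproducibility:

    import numpy as np

    rng1 = np.random.default_rng(seed=123)   # same seed ...
    rng2 = np.random.default_rng(seed=123)

    u1 = rng1.random(5)   # random numbers uniformly distributed on [0, 1)
    u2 = rng2.random(5)
    print(u1)
    print(np.array_equal(u1, u2))   # True: same seed -> same stream
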
19
Q

Uniform Distribution

A
  • All outcomes between a minimum (a) and a maximum (b) are equally likely.
  • Useful for generating random numbers.
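
A brief sketch of sampling from a uniform distribution with numpy (assumed installed; a and b are illustrative):

    import numpy as np

    rng = np.random.default_rng(7)
    a, b = 10, 20                         # minimum and maximum
    samples = rng.uniform(a, b, size=5)   # every value in [a, b) is equally likely
    print(samples)
    print((a + b) / 2)                    # theoretical mean of the uniform = 15.0
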
20
Q

Union

A

The union of 2 sets is the set of elements that belong to one or both sets. Symbolically, the union of A and B is denoted by A ∪ B.

For example, if A = {1, 2, 3} and B = {1, 3, 5}, then the union of A and B is {1, 2, 3, 5}.

NOTE:

P(A or B) is the probability of the UNION of events A and B.

If A and B are mutually exclusive, P(A or B) = P(A)+P(B)

If A and B are not mutually exclusive, P(A or B) = P(A) + P(B) − P(A and B)
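
The set example from above, sketched in plain Python:

    A = {1, 2, 3}
    B = {1, 3, 5}
    print(A | B)   # union: {1, 2, 3, 5}
    print(A & B)   # intersection {1, 3}, needed for the non-mutually-exclusive rule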