Gallery of Continuous Variables Flashcards
Parametrized distributions
When we studied discrete random variables we learned, for example, about the Bernoulli(p) distribution. The probability p used to define the distribution is called a parameter, and Bernoulli(p) is called a parametrized distribution.
what distribution do tosses of a fair coin follow?
Tosses of a fair coin follow a Bernoulli distribution with parameter p = 0.5.
what is a key question of statistics?
estimate the parameters of a distribution
give an example of estimating the parameters of a distribution
If I have a coin that may or may not be fair, then I know it follows a Bernoulli(p) distribution, but I don't know the value of the parameter p. I might run experiments and use the data to estimate the value of p.
As another example, the binomial distribution Binomial(n, p) depends on two parameters n and p.
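To make the estimation idea concrete, here is a minimal Python sketch (not part of the original notes). It simulates tosses of a coin whose "unknown" parameter is a hypothetical p_true = 0.3, then estimates p by the observed fraction of heads; the true value and the number of tosses are assumptions for illustration only.

```python
import random

random.seed(42)
p_true = 0.3   # hypothetical unknown parameter of the coin
n = 10_000     # number of experimental tosses

# Simulate n Bernoulli(p_true) tosses: 1 = heads, 0 = tails.
tosses = [1 if random.random() < p_true else 0 for _ in range(n)]

# The natural estimate of p is the observed fraction of heads.
p_hat = sum(tosses) / n
print(p_hat)  # should be close to p_true
```

With 10,000 tosses the estimate typically lands within about 0.01 of the true value; more tosses give a tighter estimate.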
Uniform Distribution
Parameters, Range, Notation, PDF, CDF, Models
- Parameters: a, b.
- Range: [a, b].
- Notation: uniform(a, b) or U(a, b).
- PDF: f(x) = 1/(b - a) for a ≤ x ≤ b.
- CDF: F(x) = (x - a)/(b - a) for a ≤ x ≤ b.
uniform distribution graphs: the PDF is a horizontal line at height 1/(b - a) over [a, b] and 0 elsewhere; the CDF is 0 for x < a, rises linearly from 0 to 1 across [a, b], and equals 1 for x > b.
Example 1. Suppose we have a tape measure with markings at each millimeter. If we measure (to the nearest marking) the length of items that are roughly a meter long, the rounding error will be uniformly distributed between -0.5 and 0.5 millimeters.
Many board games use spinning arrows (spinners) to introduce randomness. When spun, the arrow stops at an angle that is uniformly distributed between 0 and 2π radians.
- In most pseudo-random number generators, the basic generator simulates a uniform distribution and all other distributions are constructed by transforming the basic generator.
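As a sketch of how such a transformation works (an illustration, not taken from the notes): one standard method is inverse-transform sampling, where a uniform(0, 1) draw U is pushed through the inverse CDF of the target distribution. For the exponential(λ) distribution this gives X = -ln(1 - U)/λ; the rate λ = 1.5 below is an arbitrary assumed value.

```python
import math
import random

random.seed(1)
lam = 1.5  # assumed rate for the target exponential distribution

def exponential_from_uniform(lam):
    # Inverse-transform sampling: if U ~ uniform(0, 1), then
    # X = -ln(1 - U)/lam satisfies P(X <= x) = 1 - e^(-lam*x),
    # i.e. X ~ exponential(lam).
    u = random.random()  # the "basic generator": uniform on [0, 1)
    return -math.log(1.0 - u) / lam

samples = [exponential_from_uniform(lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # should be close to 1/lam
```

The sample mean should approach 1/λ, the mean of the exponential(λ) distribution, which is one quick sanity check that the transformed draws follow the intended distribution.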
exponential distribution
- Parameter: λ.
- Range: [0, ∞).
- Notation: exponential(λ) or exp(λ).
- PDF: f(x) = λe^(-λx) for x ≥ 0.
- CDF: F(x) = 1 - e^(-λx) for x ≥ 0.
taxi waiting time
If I step out to 77 Mass Ave after class and wait for the next taxi, my waiting time in minutes is exponentially distributed. We will see that in this case λ is the average number of taxis that pass per minute; equivalently, 1/λ is the average waiting time.
waiting time for unstable isotope to undergo nuclear decay
The exponential distribution also models the waiting time until an unstable isotope undergoes nuclear decay. In this case, the value of λ is determined by the half-life of the isotope: λ = ln(2)/(half-life).
true or false: exponential distribution is the only one that models waiting times
False.
There are other distributions that also model waiting times, but the exponential distribution is special in that it is memoryless; in fact, it is the only continuous distribution with this property.
Example of memoryless wait time
Suppose that the probability that a taxi arrives within the first five minutes is p. If I wait five minutes and no taxi arrives, then the probability that a taxi arrives within the next five minutes is still p. That is, my previous wait of 5 minutes has no impact on the length of my future wait!
Example of wait time with memory
Suppose I were to instead go to Kendall Square subway station and wait for the next inbound train. Since the trains are coordinated to follow a schedule (e.g., roughly 12 minutes between trains), if I wait five minutes without seeing a train then there is a far greater probability that a train will arrive in the next five minutes.
In particular, waiting time for the subway is not memoryless, and a better model would be the uniform distribution on the range [0, 12].
what is the memorylessness of the exponential distribution analogous to?
The memorylessness of the exponential distribution is analogous to the memorylessness
of the (discrete) geometric distribution, where having flipped 5 tails in a row gives no information about the next 5 flips. Indeed, the exponential distribution is precisely the continuous counterpart of the geometric distribution, which models the waiting time for a discrete process to change state. More formally, memoryless means that the probability of waiting t more minutes is independent of the amount of time already waited.
In symbols,
P(X > s + t | X > s) = P(X > t).
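Using the survival function P(X > x) = e^(-λx) of the exponential distribution, this identity can also be verified numerically. The sketch below (with arbitrary assumed values of λ, s, and t) computes the conditional probability directly from the definition and compares it to P(X > t).

```python
import math

lam = 0.5       # assumed rate parameter; any lam > 0 works
s, t = 5.0, 5.0  # assumed elapsed and additional waiting times

def survival(x, lam):
    # P(X > x) for X ~ exponential(lam)
    return math.exp(-lam * x)

# P(X > s + t | X > s) = P(X > s + t) / P(X > s),
# since the event {X > s + t} is contained in {X > s}.
cond = survival(s + t, lam) / survival(s, lam)

print(cond, survival(t, lam))  # the two values are equal
```

Algebraically this is just e^(-λ(s+t)) / e^(-λs) = e^(-λt), which is the computation the proof below carries out.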
proof of memorylessness
We know that
(X > s + t) ∩ (X > s) = (X > s + t),
since the event 'waited at least s minutes' contains the event 'waited at least s + t minutes'. Therefore the formula for conditional probability gives