Part 4. Common Probability Distributions Flashcards
Probability distribution
Specifies the probabilities associated with the possible outcomes of a random variable.
Random variable
A quantity whose future outcomes are uncertain.
Types of random variables:
- Discrete random variable - takes on at most a countable (possibly infinite) number of possible values.
e.g. a discrete random variable X can take on a limited number of outcomes x1, x2, …, xn (n possible outcomes), or a discrete random variable Y can take on an unlimited number of outcomes y1, y2, … (without end).
Countable & finite - the number of yes votes at a corporate board meeting.
Countable & infinite - the number of trades by market participants.
X - random variable
x - outcome of X
Continuous random variable (Z)
We cannot count the outcomes, so we cannot describe the possible outcomes of Z with a list z1, z2, …, because an outcome not on the list, such as (z1 + z2)/2, would always be possible.
e.g. the volume of water in a glass, or the rate of return on an asset.
Probability Function
Specifies the probability that the random variable takes on a specific value: P(X=x) is the probability that the random variable X takes on value x.
For a discrete random variable, the probability mass function is p(x) = P(X=x).
For a continuous random variable, the probability density function is denoted f(x).
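A minimal Python sketch of the contrast, using an assumed two-coin example (not from the card) and the standard library only:

from statistics import NormalDist

# Discrete: probability mass function for the number of heads in two fair coin
# tosses (assumed example): p(0) = 0.25, p(1) = 0.50, p(2) = 0.25.
heads_pmf = {0: 0.25, 1: 0.50, 2: 0.25}
print(heads_pmf[1])               # P(X = 1) = 0.5, a genuine probability

# Continuous: probability density function of a standard normal variable.
# f(x) is a density, not a probability; P(Z = z) = 0 for any single value z.
print(NormalDist(0, 1).pdf(0.0))  # roughly 0.3989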
Cumulative distribution function
Gives the probability that a random variable X is less than or equal to a particular value x: P(X ≤ x).
For both discrete and continuous random variables: F(x) = P(X ≤ x).
Parallel to cumulative relative frequency.
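A rough illustration, continuing the assumed two-coin example: the cdf of a discrete random variable is the running sum of its probability function.

# F(x) = P(X <= x): accumulate the probability function from the smallest
# outcome up to x (assumed two-coin example, not from the card).
heads_pmf = {0: 0.25, 1: 0.50, 2: 0.25}

def heads_cdf(x):
    return sum(p for outcome, p in heads_pmf.items() if outcome <= x)

print(heads_cdf(1))   # P(X <= 1) = 0.75, parallel to a cumulative relative frequency
print(heads_cdf(2))   # 1.0 at the largest outcome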
Discrete uniform distribution
The distribution has a finite number of specified outcomes, and each outcome is equally likely.
CDF characteristics:
- the cdf lies between 0 and 1 for any x: 0 ≤ F(x) ≤ 1
- as x increases, the cdf either increases or remains constant.
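A minimal sketch of both points, assuming a fair six-sided die (an example not taken from the card):

# Discrete uniform distribution on {1, ..., 6}: each outcome has probability 1/6
# and the cdf rises in equal steps of 1/6, never decreasing and never exceeding 1.
pmf = {x: 1 / 6 for x in range(1, 7)}

def cdf(x):
    return sum(p for outcome, p in pmf.items() if outcome <= x)

print(round(pmf[3], 4))   # 0.1667 for every outcome
print(round(cdf(4), 4))   # 0.6667
print(round(cdf(10), 4))  # 1.0 -- constant once x is above every outcome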
Continuous uniform distribution
An appropriate probability model to represent a particular kind of uncertainty in beliefs in which all outcomes appear equally likely.
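A minimal sketch, assuming illustrative bounds a = 0 and b = 10 (not from the card): the density is flat at 1 / (b - a) on [a, b], and F(x) = (x - a) / (b - a) there.

a, b = 0.0, 10.0   # assumed bounds

def uniform_cdf(x):
    # Continuous uniform cdf: 0 below a, 1 above b, linear in between.
    if x <= a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

# P(2 <= X <= 5) = F(5) - F(2) = 0.3
print(uniform_cdf(5) - uniform_cdf(2))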
Binomial distribution
Used when we make probability statements about a record of successes and failures, or anything with binary outcomes.
Built from the Bernoulli random variable.
- Symmetric when the probability of success on a trial is 0.5, but asymmetric/skewed otherwise.
Bernoulli trial
If we let Y = 1 when the outcome is a success, and Y = 0 when the outcome is a failure, then the probability function of the Bernoulli random variable Y is:
p(1) = P(Y=1) = p
p(0) = P(Y=0) = 1 - p
where p is the probability that the trial is a success.
In n Bernoulli trials, we can have 0 to n successes, and if the outcome of an individual trial is random, the total number of successes in n trials is also random.
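A quick simulation of this idea, with assumed values n = 10 and p = 0.5:

import random

random.seed(42)   # for a reproducible illustration
n, p = 10, 0.5    # assumed values, not from the card

# Each trial: Y = 1 with probability p, Y = 0 with probability 1 - p.
trials = [1 if random.random() < p else 0 for _ in range(n)]
print(trials)       # e.g. [1, 0, 0, 1, ...]
print(sum(trials))  # total number of successes in n trials: somewhere from 0 to n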
Binomial distribution assumptions:
- The probability p of success is constant for all trials
- The trials are independent.
i.e. X ~ B(n, p)
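Under these assumptions, P(X = x) = C(n, x) p^x (1 - p)^(n - x); a sketch with assumed values of n and p:

from math import comb

def binomial_pmf(x, n, p):
    # P(X = x) = C(n, x) * p**x * (1 - p)**(n - x)
    return comb(n, x) * p**x * (1 - p) ** (n - x)

n, p = 4, 0.6   # assumed values, not from the card
for x in range(n + 1):
    print(x, round(binomial_pmf(x, n, p), 4))
# The probabilities sum to 1; because p != 0.5 here, the distribution is skewed.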
Up transition probability
The stock moves up with constant probability p; if it moves up, u is 1 plus the rate of return for an up move.
Down transition probability
The stock moves down with constant probability 1 - p; if it moves down, d is 1 plus the rate of return for a down move.
Binomial tree
Associates each of the n stock price moves (e.g. n = 4) with time indexed by t.
Node
Each boxed value from which successive moves or outcomes branch out in the tree.
The initial node, at t = 0, shows the beginning stock price S; subsequent nodes represent a potential value for the stock price at a specified future time.
Final stock price distribution
A function of the initial stock price, the number of up moves, and the size of up and down moves.
The final stock price is a function of a binomial random variable (the number of up moves), as well as of u, d, and the initial price S.
The standard formula describes a process in which stock volatility is constant over time, but binomial trees can be used to model changing volatility over time.
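A minimal sketch of the resulting distribution, assuming illustrative values for S, u, d, p, and n (none of these numbers come from the card):

from math import comb

S, u, d, p, n = 100.0, 1.05, 0.95, 0.6, 4   # assumed illustrative values

# After k up moves and n - k down moves the terminal price is S * u**k * d**(n - k),
# and the number of up moves k follows a binomial distribution B(n, p).
for k in range(n + 1):
    price = S * u**k * d ** (n - k)
    prob = comb(n, k) * p**k * (1 - p) ** (n - k)
    print(f"{k} up moves: price {price:.2f}, probability {prob:.4f}")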
Central Limit Theorem
States that the sum (and mean) of a large number of independent random variables (with finite variance) is approximately normally distributed.
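A rough simulation of the theorem, with an assumed sample size and number of repetitions: means of independent uniform(0, 1) draws cluster around 0.5 with an approximately normal shape.

import random
from statistics import mean, stdev

random.seed(0)
n, reps = 50, 2000   # assumed sample size and number of repetitions

# Each sample mean averages n independent uniform(0, 1) draws.
sample_means = [mean(random.random() for _ in range(n)) for _ in range(reps)]

# The means center on 0.5 with standard error about sqrt(1/12) / sqrt(n), roughly 0.041.
print(round(mean(sample_means), 3))
print(round(stdev(sample_means), 3))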
Univariate distribution
Describes the distribution of a single random variable.