2. Random Variables Flashcards
Discrete vs Continuous Random Variables
- discrete random variables can only take on a finite or at most countably infinite number of values
- continuous random variables can take on a continuum of values
Probability of an Event
-for a discrete random variable X and a real value a, the event X=a is the set of outcomes in Ω for which the random variable assumes the value a
-i.e. X=a ≡ {ω∈Ω | X(ω)=a}
-the probability of this event is:
Pr(X=a) = Σ Pr(ω)
-where the sum is taken over ω∈Ω such that X(ω)=a
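-a minimal sketch of this definition in Python, using a hypothetical two-coin-toss sample space with X = number of heads (the outcomes and probabilities are illustrative assumptions, not from these notes):
```python
# Hypothetical example: toss two fair coins, X = number of heads.
omega_probs = {"HH": 0.25, "HT": 0.25, "TH": 0.25, "TT": 0.25}  # Pr(omega)
X = {"HH": 2, "HT": 1, "TH": 1, "TT": 0}                        # X(omega)

def pr_X_equals(a):
    # Pr(X = a) = sum of Pr(omega) over all omega with X(omega) = a
    return sum(p for omega, p in omega_probs.items() if X[omega] == a)

print(pr_X_equals(1))  # 0.5
```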
Probability Mass Function
-frequency or probability mass function (PMF) of a discrete random variable X gives all the probabilities for the different possible values of X:
p_X(x) = Pr(X=x)
-it must satisfy the property:
Σ p_X(x) = Σ Pr(X=x) = 1
-where the sum is taken over all possible values x of X
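-an illustrative check of this property (a fair six-sided die is an assumption here, not part of the notes):
```python
# Hypothetical example: PMF of a fair six-sided die.
pmf = {k: 1/6 for k in range(1, 7)}   # p_X(k) = 1/6 for k = 1..6

print(pmf[3])              # Pr(X = 3) = 0.1666...
print(sum(pmf.values()))   # equals 1 (up to floating-point error)
```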
Cumulative Distribution Function
-a cumulative distribution function (CDF) is a non-decreasing function F defined as:
F(x) = P(X≤x)
-and satisfies:
lim x->-∞ F(x) = 0 and lim x->+∞ F(x) = 1
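-a short sketch, assuming the same fair-die PMF as above, showing how F accumulates the PMF and tends to 0 and 1 at the extremes:
```python
# Hypothetical example: CDF of a fair six-sided die, built from its PMF.
pmf = {k: 1/6 for k in range(1, 7)}

def cdf(x):
    # F(x) = P(X <= x) = sum of p_X(k) over k <= x
    return sum(p for k, p in pmf.items() if k <= x)

print(cdf(0))    # 0.0  (below all possible values)
print(cdf(3))    # 0.5
print(cdf(10))   # 1.0  (above all possible values)
```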
Independence of Random Variables
Definition
-two random variables X and Y are independent if every event expressible in terms of X alone is independent of every event expressible in terms of Y alone, in particular:
P(X≤x & Y≤y) = P(X≤x)P(Y≤y)
Independence of Discrete Random Variables
Definition
-for two independent discrete random variables X and Y taking on values xi and yj respectively:
P(X=xi & Y=yj) = P(X=xi)P(Y=yj)
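-a small sketch assuming two independent fair dice (an illustrative choice): the joint probability of any pair of values factors into the product of the marginals:
```python
# Hypothetical example: two independent fair dice X and Y.
p_x = {i: 1/6 for i in range(1, 7)}
p_y = {j: 1/6 for j in range(1, 7)}

# Under independence the joint PMF is the product of the marginals.
joint = {(i, j): p_x[i] * p_y[j] for i in p_x for j in p_y}

print(joint[(2, 5)])                          # 1/36 ≈ 0.0278
print(abs(sum(joint.values()) - 1) < 1e-12)   # joint PMF still sums to 1
```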
Bernoulli Random Variables
-take on only two values, 1 or 0
p(x) = p if x=1, 1-p if x=0, and 0 otherwise
-a useful representation is:
p(x) = p^x (1-p)^(1-x) for x∈{0,1}, and 0 otherwise
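-a minimal sketch of this closed-form representation (the function name is just illustrative):
```python
# Bernoulli PMF via the closed form p^x * (1-p)^(1-x) for x in {0, 1}.
def bernoulli_pmf(x, p):
    if x not in (0, 1):
        return 0.0
    return p ** x * (1 - p) ** (1 - x)

print(bernoulli_pmf(1, 0.3))  # 0.3
print(bernoulli_pmf(0, 0.3))  # 0.7
print(bernoulli_pmf(2, 0.3))  # 0.0
```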
What are Bernoulli random variables used for?
-let A⊂Ω be an event in the sample space Ω, let:
X(ω) = 1 if ω∈A, 0 otherwise
-then X is an indicator random variable that takes on the value 1 if A occurs and 0 otherwise
-Bernoulli random variables often represent success vs. failure of an experiment
Binomial Distribution
-experiment performed n times
-each trial is independent of the others
-assume each experiment results in success with probability p (i.e. each is described by a Bernoulli random variable)
-the random variable X = ΣYj, where Yj is the Bernoulli indicator of success on trial j, denotes the number of successes in the n independent Bernoulli trials and has a binomial distribution
p(k) = P(X=k) = nCk p^k (1-p)^(n-k), for k = 0, 1, ..., n
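-a short sketch assuming illustrative values n=10, p=0.3: the closed-form PMF is computed with math.comb, and a simulation of X as a sum of Bernoulli trials gives roughly matching frequencies:
```python
import math
import random
from collections import Counter

n, p = 10, 0.3  # illustrative parameters

def binomial_pmf(k):
    # P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Simulate X = sum of n independent Bernoulli(p) trials.
trials = 100_000
counts = Counter(sum(random.random() < p for _ in range(n)) for _ in range(trials))

print(binomial_pmf(3))       # ≈ 0.2668
print(counts[3] / trials)    # close to the value above
```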
Binomial Theorem
(x + y)^m = Σ mCk x^k y^(m-k), where the sum runs over k = 0, ..., m
-substituting x=p and y=1-p shows that the binomial probabilities sum to 1:
1 = 1^m = (p + (1-p))^m
= Σ mCk p^k (1-p)^(m-k)
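-a quick numerical check of this identity, with illustrative values of m and p:
```python
import math

m, p = 12, 0.4  # illustrative values
total = sum(math.comb(m, k) * p**k * (1 - p)**(m - k) for k in range(m + 1))
print(total)    # 1.0 up to floating-point error
```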
Geometric Distribution
-sequence of independent Bernoulli trials performed, no upper bound
-random variable X denoting the number of trials up to and including the first success has a geometric distribution:
p(k) = P(X=k) = p(1-p)^(k-1)
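-a sketch comparing this PMF to a simulation (the success probability is an illustrative assumption): X counts the trial on which the first success occurs:
```python
import random
from collections import Counter

p = 0.25  # illustrative success probability

def geometric_pmf(k):
    # P(X = k) = p * (1-p)^(k-1), k = 1, 2, ...
    return p * (1 - p) ** (k - 1)

def first_success_trial():
    # Run Bernoulli(p) trials until the first success; return the trial index.
    k = 1
    while random.random() >= p:
        k += 1
    return k

trials = 100_000
counts = Counter(first_success_trial() for _ in range(trials))
print(geometric_pmf(3))    # ≈ 0.1406
print(counts[3] / trials)  # close to the value above
```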
Negative Binomial Distribution
-same assumptions as the geometric distribution: a sequence of Bernoulli trials with no upper bound
-the random variable X denoting the number of trials required until the rth success has a negative binomial distribution
-the event X=k happens when in the first k-1 trials there were exactly r-1 successes and the kth trial was also a success:
p(k) = P(X=k)
= (k-1)C(r-1) p^r (1-p)^(k-r)
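-a short sketch of this PMF (the values of r and p are illustrative assumptions), giving the probability that the rth success arrives on trial k:
```python
import math

def negative_binomial_pmf(k, r, p):
    # P(X = k) = C(k-1, r-1) * p^r * (1-p)^(k-r), for k >= r
    if k < r:
        return 0.0
    return math.comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

# Illustrative check: with r = 1 this reduces to the geometric PMF.
print(negative_binomial_pmf(3, 1, 0.25))   # 0.140625
print(negative_binomial_pmf(7, 3, 0.25))   # P(3rd success occurs on trial 7)
```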
Poisson Distribution
-can be derived as the limit of the binomial distribution
-consider a binomial distribution with large n and small p
-let λ=np
-let n->∞ and p->0 such that λ remains constant
P(X=k) = λ^k e^(-λ) / k!
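-a numerical sketch of this limit (λ, k, and the n values are illustrative): as n grows with p = λ/n, the binomial PMF approaches the Poisson PMF:
```python
import math

lam, k = 3.0, 2  # illustrative rate and target value

def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print(poisson_pmf(k, lam))                 # ≈ 0.2240
for n in (10, 100, 10_000):
    print(n, binomial_pmf(k, n, lam / n))  # approaches the Poisson value
```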
Expectation of a Discrete Random Variable
E(X) = Σ k p_X(k) = Σ k P(X=k)
-where the sum is taken over all possible values k of X
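-a minimal sketch, again assuming a fair six-sided die, computing E(X) directly from the PMF:
```python
# Hypothetical example: expectation of a fair six-sided die.
pmf = {k: 1/6 for k in range(1, 7)}

expectation = sum(k * p for k, p in pmf.items())
print(expectation)  # 3.5
```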
Linearity of Expectation
-the expectation of the sum of random variables is equal to the sum of their expectations (this holds whether or not the Xi are independent):
E(Σ Xi) = Σ (E(Xi))
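-a short sketch with two illustrative fair dice: E(X1 + X2) computed from the joint distribution equals E(X1) + E(X2):
```python
# Hypothetical example: two fair dice X1 and X2 (independent here, though
# linearity of expectation does not require independence).
pmf = {k: 1/6 for k in range(1, 7)}

e_single = sum(k * p for k, p in pmf.items())                      # E(X1) = E(X2) = 3.5
e_sum = sum((i + j) * pmf[i] * pmf[j] for i in pmf for j in pmf)   # E(X1 + X2)

print(e_sum)          # 7.0
print(e_single * 2)   # 7.0
```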