Random Variables Flashcards
Discrete Random Variable
X takes values x_1, x_2, …
EX = sum_i x_i P(X = x_i)
Var(X) = E(X^2) - (EX)^2
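As a quick sketch of these two formulas, here is the mean and variance of a generic discrete random variable; a fair six-sided die is used as an illustrative distribution (not from the cards themselves).

```python
# Sketch: EX = sum_i x_i P(X = x_i) and Var(X) = E(X^2) - (EX)^2,
# using a fair six-sided die as an example distribution.
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6  # P(X = x_i)

EX = sum(x * q for x, q in zip(values, probs))
EX2 = sum(x**2 * q for x, q in zip(values, probs))
var = EX2 - EX**2
print(EX, var)  # 3.5 and 35/12
```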
Discrete Uniform Random Variable
takes each of the values x_1, …, x_n with probability 1/n
EX = (x_1 + … + x_n) / n
Var(X) = (x_1^2 + … + x_n^2) / n - (EX)^2
Bernoulli Random Variable
aka indicator random variable: I_A indicates an event A that occurs with probability p
I_A = 1 if A occurs, 0 otherwise
E(I_A) = p
Var(I_A) = p - p^2 = p(1 - p)
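The mean and variance of an indicator can be checked directly from its two-point distribution; p = 0.3 below is just an example value.

```python
# Sketch: E(I_A) = p and Var(I_A) = p(1 - p), computed from the
# two-point distribution of an indicator. p = 0.3 is an example value.
p = 0.3
values = [0, 1]
probs = [1 - p, p]  # P(I_A = 0) = 1 - p, P(I_A = 1) = p

EX = sum(v * q for v, q in zip(values, probs))
var = sum(v**2 * q for v, q in zip(values, probs)) - EX**2
print(EX, var)  # p and p(1 - p)
```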
Binomial Random Variable
Binomial(n,p)
– number of successes in n independent trials, each with success probability p
pmf: P(X = i) = (n choose i) p^i (1-p)^(n-i), i = 0, 1, …, n
EX = np
Var(X) = np - np^2 = np(1-p)
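A small sanity check of the binomial pmf, mean, and variance; n = 10 and p = 0.4 are example values.

```python
import math

# Sketch: Binomial(n, p) pmf via math.comb, checking that the pmf sums to 1,
# EX = np, and Var(X) = np(1-p). n, p are example values.
n, p = 10, 0.4

def binom_pmf(i):
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

total = sum(binom_pmf(i) for i in range(n + 1))
EX = sum(i * binom_pmf(i) for i in range(n + 1))
var = sum(i**2 * binom_pmf(i) for i in range(n + 1)) - EX**2
print(total, EX, var)  # ~1.0, 4.0, 2.4
```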
Poisson Random Variable
Poisson(lambda)
pmf: P(X = i) = (lambda^i / i!) e^(-lambda), i = 0, 1, 2, …
EX = lambda
Var(X) = lambda
Poisson Approximation of binomial
when n is large and p is small and
lambda = np is of moderate size
P(X = i) –> (lambda^i / i!) e^(-lambda)
as n–> infinity
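The approximation can be seen numerically by comparing the two pmfs; n = 1000 and p = 0.002 (so lambda = 2) are illustrative choices.

```python
import math

# Sketch: Poisson approximation of Binomial(n, p) when n is large, p is small,
# and lambda = n*p is moderate. n = 1000, p = 0.002 are illustrative values.
n, p = 1000, 0.002
lam = n * p  # lambda = 2

def binom_pmf(i):
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

def poisson_pmf(i):
    return lam**i / math.factorial(i) * math.exp(-lam)

# largest pointwise gap between the two pmfs over the first few values
max_gap = max(abs(binom_pmf(i) - poisson_pmf(i)) for i in range(15))
print(max_gap)
```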
Geometric Random Variable
Geometric(p) counts the number of independent trials up to and including the first success
pmf: P(X = n) = p(1-p)^(n-1), n = 1, 2, …
EX = 1/p
Var(X) = (1-p) / p^2
P(X > n) = (1-p)^n
Memoryless property:
P(X > n + k given X > k) = P(X > n)
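Both the tail formula and the memoryless property can be verified from the pmf alone; p, n, k below are example values.

```python
# Sketch: Geometric(p) tail probability and memoryless property, checked
# numerically from the pmf P(X = n) = p(1-p)^(n-1). p, n, k are example values.
p, n, k = 0.25, 4, 3

def geom_pmf(m):
    return p * (1 - p) ** (m - 1)

def tail(m, terms=2000):
    # P(X > m), approximated by summing the pmf far into the tail
    return sum(geom_pmf(j) for j in range(m + 1, m + 1 + terms))

closed_form = (1 - p) ** n   # P(X > n) = (1-p)^n
lhs = tail(n + k) / tail(k)  # P(X > n + k given X > k)
print(tail(n), closed_form, lhs)
```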
Continuous Random Variables…
instead of a pmf as in the discrete case,
we now use distribution functions, given by
F(x) = F_X(x) = P(X <= x) = \int_(-inf, x] f(s) ds
so, F'(x) = f(x)
where f(s) is the density function of X and F is the distribution function
\int_(-inf, inf) f(x) dx = 1
EX = \int_(-inf, inf) x f(x) dx
Var(X) = E(X^2) - (EX) ^2
Continuous Uniform Random Variable
density:
f(x) = 1 / (b-a) if x \in [a,b]
0 otherwise
EX = (a+b) / 2
Var(X) = (b-a)^2 / 12
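These two facts can be checked by integrating the density numerically; a = 2 and b = 5 are example endpoints, and the midpoint rule is one simple choice of integrator.

```python
# Sketch: checking EX = (a+b)/2 and Var(X) = (b-a)^2/12 for Uniform(a, b)
# by midpoint-rule integration of the density f(x) = 1/(b-a). a, b are examples.
a, b = 2.0, 5.0

def integrate(g, lo, hi, steps=10_000):
    h = (hi - lo) / steps
    return sum(g(lo + (j + 0.5) * h) for j in range(steps)) * h

density = lambda x: 1.0 / (b - a)
EX = integrate(lambda x: x * density(x), a, b)
var = integrate(lambda x: x**2 * density(x), a, b) - EX**2
print(EX, var)  # (a+b)/2 = 3.5, (b-a)^2/12 = 0.75
```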
Continuous Exponential Random Variable
Exponential(lambda)
used for waiting time for an event to occur
density: f(x) = lambda e^(- lambda x) for x >= 0
0 otherwise
EX = 1 / lambda
Var(X) = 1 / lambda^2
P(X >= x) = e^(- lambda x)
Memoryless property:
P(X >= x + y given X >= y) = e^(- lambda x)
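Since P(X >= x) = e^(-lambda x), the memoryless property is a one-line computation; lam, x, y below are example values.

```python
import math

# Sketch: memoryless property of Exponential(lambda), using the tail formula
# P(X >= x) = e^(-lambda x). lam, x, y are example values.
lam, x, y = 1.5, 0.7, 2.0

def tail(t):
    return math.exp(-lam * t)

lhs = tail(x + y) / tail(y)  # P(X >= x + y given X >= y)
rhs = tail(x)                # P(X >= x)
print(lhs, rhs)
```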
Normal Random Variable
Normal(mu, sig^2)
density: f(x) = ( 1 / (sig sqrt(2 pi)) ) e^( -(x-mu)^2 / (2 sig^2) )
EX = mu
Var(X) = sig^2
normal approximation for binomial
binomial is approximately normal when p is fixed and n is large
then, with S_n ~ Binomial(n, p),
P( (S_n - np) / sqrt(np(1-p)) <= x ) –> PHI(x)
as n–> infinity
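The quality of the approximation can be seen by comparing the exact standardized binomial probability with PHI(x); n = 400, p = 0.5, x = 1 are example values, and PHI is written via the error function.

```python
import math

# Sketch: normal approximation of the binomial. Compares the exact value of
# P((S_n - np)/sqrt(np(1-p)) <= x) with PHI(x). n, p, x are example values.
n, p, x = 400, 0.5, 1.0

def binom_pmf(i):
    return math.comb(n, i) * p**i * (1 - p)**(n - i)

def PHI(t):
    # standard normal CDF, expressed via the error function
    return 0.5 * (1 + math.erf(t / math.sqrt(2)))

cutoff = n * p + x * math.sqrt(n * p * (1 - p))  # i <= cutoff <=> standardized value <= x
exact = sum(binom_pmf(i) for i in range(n + 1) if i <= cutoff)
print(exact, PHI(x))
```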
Joint pmf for discrete random variables
X,Y are discrete random vars
joint pmf = P(X=x,Y=y)
marginal probability:
P(X = x) = sum_(all y) P(X=x,Y=y)
P(Y = y) = sum_(all x) P(X=x,Y=y)
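Summing out the other variable is easy to do over a table; the joint pmf below is a small made-up example.

```python
# Sketch: marginal pmfs computed from a small made-up joint pmf table.
joint = {
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}
xs = sorted({x for x, _ in joint})
ys = sorted({y for _, y in joint})

p_X = {x: sum(joint[(x, y)] for y in ys) for x in xs}  # P(X = x) = sum over y
p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}  # P(Y = y) = sum over x
print(p_X, p_Y)
```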
Independence of random variables
discrete case:
independent if the joint pmf is the product of the marginal pmfs
P(X = x, Y = y) = P(X = x) P(Y = y)
continuous case:
independent if the joint density is the product of the marginal densities
f(x,y) = f_X(x) f_Y(y)
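In the discrete case this criterion can be tested mechanically over a table; the two joint pmfs below are made-up examples, one independent and one not.

```python
# Sketch: testing independence by comparing the joint pmf against the product
# of the marginals, for two made-up tables.
def marginals(joint):
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_X = {x: sum(joint[(x, y)] for y in ys) for x in xs}
    p_Y = {y: sum(joint[(x, y)] for x in xs) for y in ys}
    return p_X, p_Y

def is_independent(joint, tol=1e-12):
    p_X, p_Y = marginals(joint)
    return all(abs(q - p_X[x] * p_Y[y]) < tol for (x, y), q in joint.items())

indep = {(0, 0): 0.15, (0, 1): 0.15, (1, 0): 0.35, (1, 1): 0.35}
dep = {(0, 0): 0.10, (0, 1): 0.20, (1, 0): 0.30, (1, 1): 0.40}
print(is_independent(indep), is_independent(dep))
```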
Joint density functions for continuous random variables
two continuous vars X, Y
let f(x,y) be the joint density function
the marginal density functions are given below:
f_X(x) = \int_(-inf, inf) f(x,y) dy
f_Y(y) = \int_(-inf, inf) f(x,y) dx
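A marginal density can be recovered by numerically integrating out the other variable; the joint density f(x,y) = 4xy on [0,1]^2 below is a made-up example (its factors are independent, so f_X(x) = 2x).

```python
# Sketch: recovering a marginal density by numerically integrating out the
# other variable. Made-up joint density: f(x,y) = 4xy on [0,1]^2, so f_X(x) = 2x.
def f(x, y):
    if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0:
        return 4.0 * x * y
    return 0.0

def f_X(x, steps=1000):
    # midpoint rule for the integral of f(x, y) over y
    h = 1.0 / steps
    return sum(f(x, (j + 0.5) * h) for j in range(steps)) * h

print(f_X(0.5))  # should be close to 2 * 0.5 = 1.0
```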