3. Joint Distributions Flashcards
What are joint probability distributions for?
- in many situations there is more than just one quantity associated with the experiment
- it is interesting to study the joint behaviour of these random quantities
Joint Probability Distribution
Cumulative Distribution Function
-the joint behaviour of two random variables X and Y is determined by the cumulative distribution function:
F(x,y) = P(X≤x, Y≤y)
-regardless of whether X and Y are continuous or discrete
Joint Probability Distribution for Discrete Random Variables
-the joint probability mass function of two discrete random variables is given by:
pX,Y(x,y) = P(X=x, Y=y)
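-a minimal sketch in Python (the pmf values below are made up purely for illustration, not taken from these cards) showing a joint pmf stored as a table and one joint probability read off from it:

import numpy as np

# made-up joint pmf of X in {0,1,2} (rows) and Y in {0,1} (columns)
p_xy = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.20, 0.10]])

print(p_xy.sum())      # a valid joint pmf sums to 1
print(p_xy[1, 0])      # P(X=1, Y=0) = 0.25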
Conditional Probability Mass Function for Discrete Random Variables
-let X and Y be two discrete random variables, then the conditional probability mass function of X given that Y=yj is defined as:
pX|Y(xi|yj) = P(X=xi | Y=yj) = P(X=xi, Y=yj) / P(Y=yj)
= pX,Y(xi,yj) / pY(yj)
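-a minimal Python sketch (same made-up joint pmf table as before) showing that conditioning on Y=y0 just renormalises the corresponding column of the table:

import numpy as np

p_xy = np.array([[0.10, 0.20],
                 [0.25, 0.15],
                 [0.20, 0.10]])

p_y = p_xy.sum(axis=0)               # marginal pmf of Y: column sums
p_x_given_y0 = p_xy[:, 0] / p_y[0]   # pX|Y(xi|y0) = pX,Y(xi,y0) / pY(y0)
print(p_x_given_y0, p_x_given_y0.sum())   # renormalised column sums to 1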
Joint Probability Density Function for Continuous Random Variables
- suppose we have two continuous random variables X and Y
- their joint density function is a piecewise continuous function of two variables f(x,y) which has the properties:
i) fX,Y(x,y) ≥ 0
ii) ∫∫ f(x,y)dxdy = 1 - where both integrals run from -∞ to +∞
iii) ∀A⊂ℝ², P((X,Y)∈A) = ∫∫f(x,y)dxdy - where the integral is over the region A
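-a minimal numerical check of property (ii) in Python, for an assumed density f(x,y) = exp(-x-y) on x≥0, y≥0 and zero elsewhere (scipy assumed available):

import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: np.exp(-x - y)          # dblquad expects the inner variable first

total, _ = dblquad(f, 0, np.inf, lambda x: 0, lambda x: np.inf)
print(total)                             # ≈ 1, so f is a valid joint pdf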
Joint Cumulative Distribution Function for Continuous Random Variables
-it follows from the definition of the joint probability density function for continuous random variables that the joint cumulative distribution function is given by:
F(x,y) = P(X≤x,Y≤y) = ∫∫f(x,y)dxdy
-where the integrals run from -∞ to x and from -∞ to y
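-an illustrative Python evaluation of F(1,2) for the same assumed density f(x,y) = exp(-x-y) on x,y≥0 (the density vanishes for negative arguments, so the integrals start at 0), compared with the closed form:

import numpy as np
from scipy.integrate import dblquad

f = lambda y, x: np.exp(-x - y)

F_12, _ = dblquad(f, 0, 1.0, lambda x: 0, lambda x: 2.0)
print(F_12)                                      # numerical F(1,2)
print((1 - np.exp(-1)) * (1 - np.exp(-2)))       # exact value for comparison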
Marginal PDF
-if X and Y are continuous random variables, with joint pdf fX,Y, then the individual or marginal pdf's fX and fY are given by:
fX(x) = ∫f(x,y) dy - integral from -∞ to +∞
fY(y) = ∫f(x,y) dx - integral from -∞ to +∞
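-a minimal Python sketch computing the marginal of X by integrating out y, again for the assumed density f(x,y) = exp(-x-y) on x,y≥0; the exact marginal is exp(-x):

import numpy as np
from scipy.integrate import quad

f = lambda x, y: np.exp(-x - y)

def f_X(x):
    return quad(lambda y: f(x, y), 0, np.inf)[0]

print(f_X(1.0), np.exp(-1.0))    # numerical marginal vs exact Exp(1) pdf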
Joint Probability Distribution
Conditional Probability Density Function
-let X and Y be continuous random variables with joint pdf fX,Y and marginal pdf’s fX and fY
-then for any x such that fX(x)>0, we define the conditional pdf of Y given X=x as:
fY|X=x(y) = fX,Y(x,y) / fX(x)
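-a minimal Python sketch for an assumed non-independent density f(x,y) = x + y on the unit square, where fX(x) = x + 1/2, confirming that the conditional pdf integrates to 1 in y:

from scipy.integrate import quad

f = lambda x, y: x + y

def f_X(x):
    return quad(lambda y: f(x, y), 0, 1)[0]          # marginal: x + 1/2

def f_Y_given_X(y, x):
    return f(x, y) / f_X(x)                          # (x + y) / (x + 1/2)

print(quad(lambda y: f_Y_given_X(y, 0.3), 0, 1)[0])  # integrates to 1 over y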
Joint Probability Distribution
Independence for Continuous Random Variables
-let X and Y be two continuous random variables
-we say that X and Y are independent if:
fX,Y(x,y) = fX(x) fY(y)
-∀x,y∈ℝ
-this is equivalent to saying:
fY|X=x(y) = fY(y)
-∀x such that fX(x)>0 and ∀y∈ℝ
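-a minimal numerical sketch of the factorisation for an assumed density f(x,y) = 4xy on the unit square (marginals 2x and 2y), evaluated at one point:

from scipy.integrate import quad

f = lambda x, y: 4 * x * y

f_X = lambda x: quad(lambda y: f(x, y), 0, 1)[0]
f_Y = lambda y: quad(lambda x: f(x, y), 0, 1)[0]

x, y = 0.3, 0.7
print(f(x, y), f_X(x) * f_Y(y))   # equal, consistent with independence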
Functions of Jointly Distributed Random Variables
Convolution Discrete Case
-assume X and Y are discrete random variables taking integer values with joint pmf p(x,y)
-let Z=X+Y
-note that Z=z whenever X=x and Y=z-x
-it follows that:
P(Z=z) = pZ(z) = Σ p(x,z-x)
-summed over x from -∞ to +∞
-if X and Y are independent:
P(Z=z) = pZ(z) = ΣpX(x)pY(z-x)
-summed over x from -∞ to +∞
-this sum is called the convolution of the sequences pX and pY
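-an illustrative Python sketch: the pmf of Z = X + Y for two independent fair dice (an assumed example) via the discrete convolution; np.convolve computes exactly Σ pX(x) pY(z-x):

import numpy as np

p_die = np.full(6, 1 / 6)        # pmf on the values 1..6

p_Z = np.convolve(p_die, p_die)  # pmf of the sum on the values 2..12
print(p_Z)                       # peaks at 6/36 for z = 7
print(p_Z.sum())                 # still sums to 1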
Functions of Jointly Distributed Random Variables
Continuous Case
-assume that X and Y are continuous random variables
-let Z = X+Y
-compute the cdf FZ(z):
FZ(z) = ∫∫ f(x,y)dxdy
-where the integral is over (x,y) : x+y≤z
-which is equivalent to integral limits -∞≤x≤+∞ and -∞≤y≤z-x
-make the change of variable v=x+y in the inner integral and reverse the order of integration:
FZ(z) = ∫ [ ∫ f(x,v-x)dx ] dv
-where the inner integral is over x from -∞ to +∞ and the outer integral is over v from -∞ to z
-differentiate with respect to z to find the density:
fZ(z) = ∫ f(x,z-x)dx
-integral from -∞ to +∞
-IF X and Y are independent the result is a convolution:
fZ(z) = ∫ fX(x)fY(z-x)dx
-integral from -∞ to +∞
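-a minimal Python sketch of the continuous convolution for an assumed pair of independent X, Y ~ Exp(1); the exact density of Z = X + Y is the Gamma(2,1) density z·exp(-z):

import numpy as np
from scipy.integrate import quad

f_X = lambda x: np.exp(-x)
f_Y = lambda y: np.exp(-y)

def f_Z(z):
    # both densities vanish for negative arguments, so integrate x over [0, z]
    return quad(lambda x: f_X(x) * f_Y(z - x), 0, z)[0]

z = 1.5
print(f_Z(z), z * np.exp(-z))    # numerical convolution vs exact density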
Bivariate Normal Distribution
Simple Case
-X1 and X2 independent with distribution N(0,σ²) have joint pdf:
f(x1,x2) = 1/[2πσ²] exp(-[x1²+x2²]/[2σ²])
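-a minimal Python check of this product form against scipy's bivariate normal with covariance σ²·I (the numerical values are made up for illustration):

import numpy as np
from scipy.stats import multivariate_normal

sigma = 1.5
x1, x2 = 0.4, -0.8

f = np.exp(-(x1**2 + x2**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
print(f)
print(multivariate_normal(mean=[0, 0], cov=sigma**2 * np.eye(2)).pdf([x1, x2]))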
Bivariate Normal Distribution
More General
-suppose (X1,X2) ~ N(0,Σ) where Σ is the 2x2 covariance matrix with rows (σ², ρσ²) and (ρσ², σ²)
-the joint pdf is:
f(x1,x2) = 1/[2π|Σ|^(1/2)] exp(-1/2 x^T Σ^(-1) x)
= 1/[2πσ²√(1-ρ²)] exp{-[x1²+x2²-2ρx1x2]/[2σ²(1-ρ²)]}
-where x = (x1,x2)^T
-where ρ is the correlation between X1 and X2
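-a minimal Python check of the correlated zero-mean form against scipy's pdf for Σ = [[σ², ρσ²], [ρσ², σ²]] (the numerical values are made up for illustration):

import numpy as np
from scipy.stats import multivariate_normal

sigma, rho = 1.0, 0.6
x1, x2 = 0.5, -0.2

closed = (np.exp(-(x1**2 + x2**2 - 2 * rho * x1 * x2) / (2 * sigma**2 * (1 - rho**2)))
          / (2 * np.pi * sigma**2 * np.sqrt(1 - rho**2)))
Sigma = sigma**2 * np.array([[1, rho], [rho, 1]])
print(closed, multivariate_normal(mean=[0, 0], cov=Sigma).pdf([x1, x2]))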
Bivariate Normal Distribution
Most General
-assume X_~N(μ_,Σ)
-μ_ is a 2x1 vector with entries μ1 and μ2
-Σ is the 2x2 covariance matrix with rows (σ1², ρσ1σ2) and (ρσ1σ2, σ2²)
f(x1,x2) = 1/[2πσ1σ2√(1-ρ²)] exp{-1/[2(1-ρ²)] [(x1-μ1)²/σ1² - 2ρ(x1-μ1)(x2-μ2)/(σ1σ2) + (x2-μ2)²/σ2²]}
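-a minimal Python check of the general form against scipy's bivariate normal with means (μ1,μ2), standard deviations (σ1,σ2) and correlation ρ (the numerical values are made up for illustration):

import numpy as np
from scipy.stats import multivariate_normal

mu1, mu2 = 1.0, -2.0
s1, s2, rho = 1.2, 0.8, -0.4
x1, x2 = 0.5, -1.5

q = ((x1 - mu1)**2 / s1**2
     - 2 * rho * (x1 - mu1) * (x2 - mu2) / (s1 * s2)
     + (x2 - mu2)**2 / s2**2)
closed = np.exp(-q / (2 * (1 - rho**2))) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

Sigma = np.array([[s1**2, rho * s1 * s2], [rho * s1 * s2, s2**2]])
print(closed, multivariate_normal(mean=[mu1, mu2], cov=Sigma).pdf([x1, x2]))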