3. Joint Distributions Flashcards

1
Q

What are joint probability distributions for?

A
  • in many situations more than one quantity is associated with an experiment
  • it is then interesting to study the joint behaviour of these random quantities
2
Q

Joint Probability Distribution

Cumulative Distribution Function

A

-the joint behaviour of two random variables X and Y is determined by their joint cumulative distribution function:
FX,Y(x,y) = P(X≤x, Y≤y)
-this holds regardless of whether X and Y are continuous or discrete

3
Q

Joint Probability Distribution for Discrete Random Variables

A

-the joint probability mass function of two discrete random variables is given by:
pX,Y(x,y) = P(X=x, Y=y)
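As an illustrative sketch (not part of the cards), a joint pmf can be tabulated by enumeration; here X = first die and M = max of two fair dice, both names chosen for this example:

```python
from fractions import Fraction
from itertools import product

# Tabulate the joint pmf pX,M(x,m) = P(X=x, M=m) for X = first die
# and M = max of two fair dice, by enumerating all 36 outcomes.
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1, max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

assert sum(pmf.values()) == 1   # a joint pmf sums to 1
print(pmf[(3, 3)])              # P(X=3, M=3): d2 ∈ {1,2,3} → 1/12
```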

4
Q

Conditional Probability Mass Function for Discrete Random Variables

A

-let X and Y be two discrete random variables; the conditional probability mass function of X=xi given Y=yj is defined as:
P(X=xi | Y=yj) = P(X=xi, Y=yj)/P(Y=yj)
= pX,Y(xi,yj) / pY(yj)
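Continuing the dice sketch (X = first die, M = max of two fair dice, purely illustrative), the conditional pmf follows directly from the definition:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of X = first die and M = max of two fair dice (enumeration).
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1, max(d1, d2))
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

# Conditional pmf p_{X|M=4}(x) = pX,M(x,4) / pM(4) from the definition.
pM4 = sum(p for (x, m), p in pmf.items() if m == 4)
cond = {x: pmf[(x, 4)] / pM4 for (x, m) in pmf if m == 4}

assert pM4 == Fraction(7, 36)
assert sum(cond.values()) == 1      # a conditional pmf is itself a pmf
assert cond[4] == Fraction(4, 7)    # P(X=4 | M=4)
```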

5
Q

Joint Probability Density Function for Continuous Random Variables

A
  • suppose we have two continuous random variables X and Y
  • their joint density function is a piecewise continuous function of two variables fX,Y(x,y) which has the properties:
    i) fX,Y(x,y) ≥ 0
    ii) ∫∫ fX,Y(x,y)dxdy = 1, where both integrals run from -∞ to +∞
    iii) ∀A⊂ℝ², P((X,Y)∈A) = ∫∫ fX,Y(x,y)dxdy, where the integral is over the region A
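Properties i) and ii) can be checked numerically for a concrete density; f(x,y) = x + y on the unit square is an illustrative choice, not one of the cards:

```python
# Midpoint-rule check of properties i) and ii) for the illustrative
# joint density f(x,y) = x + y on the unit square.
n = 400
h = 1.0 / n
total = 0.0
for i in range(n):
    for j in range(n):
        x, y = (i + 0.5) * h, (j + 0.5) * h
        fxy = x + y
        assert fxy >= 0            # i) nonnegativity on the support
        total += fxy * h * h       # ii) accumulate the double integral

assert abs(total - 1.0) < 1e-6     # the density integrates to 1
```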
6
Q

Joint Cumulative Distribution Function for Continuous Random Variables

A

-it follows from the definition of the joint probability density function that the joint cumulative distribution function is given by:
FX,Y(x,y) = P(X≤x, Y≤y) = ∫∫ f(u,v)dudv
-where u is integrated from -∞ to x and v from -∞ to y

7
Q

Marginal PDF

A
-if X and Y are continuous random variables with joint pdf fX,Y, then the individual or marginal pdfs fX and fY are given by:
fX(x) = ∫f(x,y) dy
-integral between -∞ and +∞
fY(y) = ∫f(x,y) dx
-integral between -∞ and +∞
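Sticking with the illustrative density f(x,y) = x + y on [0,1]² (an assumption, not from the cards), integrating out y gives the marginal fX(x) = x + 1/2, which a numerical integral reproduces:

```python
# Marginal fX(x) = ∫ f(x,y) dy for the illustrative pdf f(x,y) = x + y
# on [0,1]^2; the exact answer is x + 1/2.
n = 2000
h = 1.0 / n
x = 0.3
fx = sum((x + (j + 0.5) * h) * h for j in range(n))   # midpoint rule in y
assert abs(fx - (x + 0.5)) < 1e-9
```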
8
Q

Joint Probability Distribution

Conditional Probability Density Function

A

-let X and Y be continuous random variables with joint pdf fX,Y and marginal pdfs fX and fY
-then for any x such that fX(x)>0, we define the conditional pdf of Y given X=x as:
fY|X=x(y) = fX,Y(x,y) / fX(x)
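For the same illustrative density f(x,y) = x + y on [0,1]² (with marginal fX(x) = x + 1/2), the definition gives fY|X=x(y) = (x + y)/(x + 1/2); a quick numerical check confirms it integrates to 1 over y:

```python
# The conditional pdf fY|X=x(y) = (x + y)/(x + 1/2) for the illustrative
# joint pdf f(x,y) = x + y on [0,1]^2 must integrate to 1 over y.
n = 2000
h = 1.0 / n
x = 0.3
total = sum(((x + (j + 0.5) * h) / (x + 0.5)) * h for j in range(n))
assert abs(total - 1.0) < 1e-9
```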

9
Q

Joint Probability Distribution

Independence for Continuous Random Variables

A

-let X and Y be two continuous random variables
-we say that X and Y are independent if:
fX,Y(x,y) = fX(x) fY(y)
-∀x,y∈ℝ
-this is equivalent to saying:
fY|X=x(y) = fY(y)
-∀x such that fX(x)>0 and ∀y∈ℝ
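The factorisation criterion can be tested pointwise on two illustrative densities on [0,1]² (both chosen for this sketch): f(x,y) = 4xy factors into its marginals (2x)(2y), while f(x,y) = x + y does not:

```python
# Independence via factorisation fX,Y(x,y) = fX(x) fY(y):
# f(x,y) = 4xy has marginals fX(x)=2x, fY(y)=2y  -> factors, independent;
# f(x,y) = x + y has marginals x + 1/2, y + 1/2  -> does not factor.
x, y = 0.3, 0.7
assert abs(4 * x * y - (2 * x) * (2 * y)) < 1e-12    # factorises
assert abs((x + y) - (x + 0.5) * (y + 0.5)) > 1e-3   # fails at this point
```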

10
Q

Functions of Jointly Distributed Random Variables

Convolution Discrete Case

A

-assume X and Y are discrete random variables taking integer values with joint pmf p(x,y)
-let Z=X+Y
-note that Z=z whenever X=x and Y=z-x
-it follows that:
P(Z=z) = pZ(z) = Σx p(x,z-x)
-summed over x from -∞ to +∞
-if X and Y are independent:
P(Z=z) = pZ(z) = Σx pX(x)pY(z-x)
-summed over x from -∞ to +∞
-this sum is called the convolution of the sequences pX and pY
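A minimal sketch with two fair dice: the pmf of the sum Z = X + Y is the convolution of the two die pmfs, which np.convolve computes directly:

```python
import numpy as np

# The pmf of Z = X + Y for two independent fair dice is the convolution
# of the two die pmfs; np.convolve computes exactly the sum above.
die = np.full(6, 1 / 6)        # pmf of one die, support 1..6
pZ = np.convolve(die, die)     # pmf of Z, support 2..12 (index k -> z = k + 2)

assert abs(pZ.sum() - 1) < 1e-12
assert abs(pZ[5] - 6 / 36) < 1e-12   # P(Z = 7)
```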

11
Q

Functions of Jointly Distributed Random Variables

Continuous Case

A

-assume that X and Y are continuous random variables
-let Z = X+Y
-compute the cdf FZ(z):
FZ(z) = ∫∫ f(x,y)dxdy
-where the integral is over the region {(x,y) : x+y≤z}
-equivalently, x runs from -∞ to +∞ and y from -∞ to z-x
-make the change of variable v=x+y, reverse the order of integration, and differentiate to find the density:
fZ(z) = ∫ f(x,z-x)dx
-integral from -∞ to +∞
-IF X and Y are independent the result is a convolution:
fZ(z) = ∫ fX(x)fY(z-x)dx
-integral from -∞ to +∞
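As an illustrative choice (not from the cards), take X, Y ~ Uniform(0,1): the convolution integral yields the triangular density fZ(z) = z on [0,1] and 2 - z on [1,2], which a Riemann-sum evaluation reproduces:

```python
import numpy as np

# Numerically evaluate fZ(z) = ∫ fX(x) fY(z - x) dx for X, Y ~ Uniform(0,1);
# the exact result is the triangular density on [0,2].
def f_unif(t):
    return np.where((t >= 0.0) & (t <= 1.0), 1.0, 0.0)

xs = np.linspace(-1.0, 3.0, 400001)
dx = xs[1] - xs[0]

def fZ(z):
    return float(np.sum(f_unif(xs) * f_unif(z - xs)) * dx)

assert abs(fZ(0.5) - 0.5) < 1e-3   # fZ(z) = z     on [0,1]
assert abs(fZ(1.5) - 0.5) < 1e-3   # fZ(z) = 2 - z on [1,2]
```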

12
Q

Bivariate Normal Distribution

Simple Case

A

-X1 and X2 independent, each with distribution N(0,σ²), have joint pdf:
f(x1,x2) = 1/[2πσ²] exp(-[x1²+x2²]/[2σ²])

13
Q

Bivariate Normal Distribution

More General

A

-suppose (X1,X2) ~ N(0,Σ) where Σ is the 2×2 matrix with entries σ², ρσ², ρσ², σ²
-the joint pdf is:
f(x1,x2) = 1/[2π|Σ|^(1/2)] exp(-½ x^T Σ^(-1) x)
= 1/[2πσ²√(1-ρ²)] exp{-[x1²+x2²-2ρx1x2]/[2σ²(1-ρ²)]}
-where ρ is the correlation between X1 and X2
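The matrix form and the expanded form should agree at every point; a numerical spot check (σ and ρ values here are illustrative assumptions) confirms this:

```python
import numpy as np

# Compare the matrix form and the expanded form of the N(0, Σ) pdf
# at one sample point, with Σ = σ² [[1, ρ], [ρ, 1]].
sigma, rho = 1.3, 0.6
Sigma = sigma**2 * np.array([[1.0, rho], [rho, 1.0]])
x = np.array([0.4, -0.9])

matrix_form = np.exp(-0.5 * x @ np.linalg.inv(Sigma) @ x) \
    / (2 * np.pi * np.sqrt(np.linalg.det(Sigma)))
expanded = np.exp(-(x[0]**2 + x[1]**2 - 2 * rho * x[0] * x[1])
                  / (2 * sigma**2 * (1 - rho**2))) \
    / (2 * np.pi * sigma**2 * np.sqrt(1 - rho**2))

assert abs(matrix_form - expanded) < 1e-12
```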

14
Q

Bivariate Normal Distribution

Most General

A

-assume X_ ~ N(μ_, Σ)
-μ_ is the 2×1 vector with entries μ1 and μ2
-Σ is the 2×2 matrix with entries σ1², ρσ1σ2, ρσ1σ2, σ2²
f(x1,x2) = 1/[2πσ1σ2√(1-ρ²)] exp{-1/[2(1-ρ²)] ([x1-μ1]²/σ1² - 2ρ[x1-μ1][x2-μ2]/[σ1σ2] + [x2-μ2]²/σ2²)}
