Multivariate Distributions Flashcards
Joint distribution function
F_{X,Y}(x,y) = P(X ≤ x, Y ≤ y), for the vector (X,Y).
Double integral:
∫[-∞, y] ∫[-∞, x] f_{X,Y}(s,t) ds dt
• f_{X,Y}(x,y) ≥ 0 for all (x,y) in R^2
• ∫[-∞, ∞] ∫[-∞, ∞] f_{X,Y}(x,y) dx dy = 1
Probabilities: P((X,Y) ∈ D) = double integral over the region D of f_{X,Y}(x,y) dx dy.
D must lie within the original region (the support of f_{X,Y}).
Beware: in the discrete case strict vs non-strict inequalities (< vs ≤) matter when defining D.
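The probability-over-a-region formula can be checked numerically; a minimal sketch, assuming for illustration the joint PDF f(x,y) = e^{-x-y} for x, y > 0 (independent Exp(1) variables, not a distribution from these notes):

```python
# Numerical check of P((X,Y) in D) as a double integral over D.
# Illustrative joint PDF (an assumption): f(x,y) = e^{-x-y}, x, y > 0.
from math import exp
from scipy.integrate import dblquad

f = lambda y, x: exp(-x - y)   # dblquad integrates func(y, x)

# P(X <= 1, Y <= 1): double integral of f over D = [0,1] x [0,1]
p, _err = dblquad(f, 0, 1, 0, 1)
print(round(p, 5))             # analytically (1 - e^{-1})^2
```

Here the analytic answer factorises because the example variables are independent; for a general joint PDF only the double integral over D is available.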
Expectation of a multivariate function
E[g(X,Y)] = ∫[-∞, ∞] ∫[-∞, ∞] g(x,y) · f_{X,Y}(x,y) dx dy
Marginal PDF of a joint distribution
Marginal PDF of X: integrate out y.
f_X(x) = ∫[-∞, ∞] f_{X,Y}(x,y) dy
Limits are with respect to y.
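Integrating out y can be done symbolically; a small sketch, assuming for illustration the joint PDF f(x,y) = x + y on [0,1]^2 (a standard textbook example, not one from these notes):

```python
# Symbolic marginal PDF: integrate the joint PDF out over y.
# Illustrative joint PDF (an assumption): f(x,y) = x + y on [0,1]^2.
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f_xy = x + y

f_X = sp.integrate(f_xy, (y, 0, 1))   # integrate out y over its range
print(f_X)                            # x + 1/2
```

Note the limits are the range of y for which the joint PDF is positive, here 0 to 1.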
Conditional PDF of X given Y=y
f_{X|Y=y}(x) = f_{X,Y}(x,y) / f_Y(y), provided f_Y(y) > 0.
Independence of (X,Y)
A pair (X,Y) is independent IF AND ONLY IF
P(X ∈ A, Y ∈ B) = P(X ∈ A) · P(Y ∈ B)
for all A, B ⊆ R.
Cov(X,Y) = 0 does not imply independence, but if X and Y are independent then Cov(X,Y) = 0.
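The one-way implication is worth a concrete counterexample; the classic one (an illustration, not from these notes) is X ~ N(0,1) with Y = X², which is clearly dependent on X yet uncorrelated with it:

```python
# Cov(X,Y) = 0 does not imply independence: X ~ N(0,1), Y = X^2.
# Y is a deterministic function of X (dependent), yet Cov(X,Y) = 0.
import sympy as sp
from sympy.stats import Normal, E

X = Normal("X", 0, 1)
Y = X ** 2
cov = sp.simplify(E(X * Y) - E(X) * E(Y))  # = E[X^3] - E[X]E[X^2]
print(cov)  # 0
```

The covariance vanishes because all odd moments of a symmetric distribution are zero, not because of independence.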
Lemma 4.6: independence via the joint PDF
If f_X(x) and f_Y(y) are the marginal PDFs, then in the continuous case X and Y are
independent if and only if
f_{X,Y}(x,y) = f_X(x) · f_Y(y).
I.e. test for independence: check whether the joint PDF is a separable product of functions, g(x) · h(y).
Correlation coefficient
ρ(X,Y) = Cov(X,Y) / √(Var(X) · Var(Y))
Cov(X,Y)
= E[(X − μ_X)(Y − μ_Y)]
= E[XY] − E[X]E[Y]
E[XY] = double integral over the region of x·y · joint PDF.
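These formulas can be checked against simulated data; a sketch with an assumed illustrative model Y = 2X + noise (so the theoretical correlation is 2/√5):

```python
# Sample estimate of rho = Cov(X,Y) / sqrt(Var(X) Var(Y)).
# Illustrative data (an assumption): Y = 2X + independent N(0,1) noise,
# so Var(X)=1, Var(Y)=5, Cov(X,Y)=2, rho = 2/sqrt(5).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2 * x + rng.normal(size=100_000)

cov = np.mean(x * y) - np.mean(x) * np.mean(y)   # E[XY] - E[X]E[Y]
rho = cov / np.sqrt(np.var(x) * np.var(y))
print(rho)                                       # ≈ 2/sqrt(5) ≈ 0.894
```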
Conditional expectation of X given Y=y
E[X | Y=y] = ∫[-∞, ∞] x · f_{X|Y=y}(x) dx
Limits are with respect to x.
The result will be a function of y.
Find the conditional PDF from the joint and marginal PDFs.
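The joint → marginal → conditional → expectation chain can be run symbolically; a sketch, assuming for illustration the joint PDF f(x,y) = x + y on [0,1]^2:

```python
# Symbolic E[X | Y=y]: conditional PDF = joint / marginal, then
# integrate x * conditional PDF over x.
# Illustrative joint PDF (an assumption): f(x,y) = x + y on [0,1]^2.
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f_xy = x + y
f_Y = sp.integrate(f_xy, (x, 0, 1))                # marginal of Y: y + 1/2
f_cond = f_xy / f_Y                                # f_{X|Y=y}(x)
E_cond = sp.simplify(sp.integrate(x * f_cond, (x, 0, 1)))
print(E_cond)   # equals (3y + 2) / (3(2y + 1)): a function of y, as expected
```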
Conditional variance of X given Y
Var(X|Y) = E[(X − E[X|Y])^2 | Y]
Conditional covariance of X,Y given Z
Cov(X,Y|Z) = E[XY|Z] − E[X|Z] · E[Y|Z]
Lemma 4.10: used to find the mean and variance of X by conditioning (e.g. when we know the conditional distributions):
1) E[X] = E[E[X|Y]]
2) Var(X) = E[Var(X|Y)] + Var(E[X|Y])
3) Cov(X,Y) = E[Cov(X,Y|Z)] + Cov(E[X|Z], E[Y|Z])
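Identities 1) and 2) can be checked by simulation; a sketch with an assumed illustrative hierarchy Y ~ Exp(1), X | Y=y ~ Poisson(y) (not a model from these notes):

```python
# Simulation check of conditioning identities: with Y ~ Exp(1) and
# X | Y=y ~ Poisson(y) (illustrative assumptions):
#   E[X]  = E[E[X|Y]] = E[Y] = 1
#   Var(X) = E[Var(X|Y)] + Var(E[X|Y]) = E[Y] + Var(Y) = 1 + 1 = 2
import numpy as np

rng = np.random.default_rng(1)
yy = rng.exponential(1.0, size=500_000)
xx = rng.poisson(yy)        # X | Y=y ~ Poisson(y)

print(xx.mean())            # ≈ 1
print(xx.var())             # ≈ 2
```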
Transformations of bivariate (multivariate) distributions
Define transformations u = u(x,y) and v = v(x,y) that are continuously differentiable and one-to-one ((x,y) to (u,v)).
U = u(X,Y), V = v(X,Y).
If the inverse exists, x = x(u,v) and y = y(u,v), then the joint PDF of the transformed pair is
f_{U,V}(u,v) = { f_{X,Y}(x(u,v), y(u,v)) · |det(J)|   for (u,v) in the image of the transformation
              { 0   otherwise
where J = ( ∂x/∂u   ∂x/∂v )
          ( ∂y/∂u   ∂y/∂v )
Steps:
- find the joint PDF of X,Y
- write down the transformation and its inverse; calculate |det(J)|
- find the region of (u,v) corresponding, under the transformation, to the region where f_{X,Y} > 0
Sketch the region!
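The Jacobian recipe can be carried out symbolically; a sketch of a classic worked example (an illustration, not from these notes): X, Y iid Exp(1) with U = X + Y, V = X/(X+Y), whose inverse is x = uv, y = u(1−v):

```python
# Worked Jacobian example (illustrative assumptions): X, Y iid Exp(1),
# U = X + Y, V = X/(X+Y); inverse x = u*v, y = u*(1-v), for 0 < v < 1.
import sympy as sp

u, v = sp.symbols("u v", positive=True)
x_inv, y_inv = u * v, u * (1 - v)

# J = matrix of partials of (x, y) with respect to (u, v)
J = sp.Matrix([[sp.diff(x_inv, u), sp.diff(x_inv, v)],
               [sp.diff(y_inv, u), sp.diff(y_inv, v)]])
abs_det = sp.Abs(sp.simplify(J.det()))       # |det J| = u

f_xy = sp.exp(-x_inv) * sp.exp(-y_inv)       # joint PDF of (X,Y) at the inverse
f_uv = sp.simplify(f_xy * abs_det)           # analytically u * exp(-u)
print(f_uv)
```

The result factorises as (u e^{-u}) · 1, so U ~ Gamma(2,1) and V ~ Uniform(0,1) are independent, which also illustrates the separability test of Lemma 4.6.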
Transformations of bivariate distributions, tricks:
• If only one transformed variable is needed, set V = X (say); then integrate out v for the marginal PDF of U.
• May need to find the joint PDF of X,Y via independence.
Student's t distribution
Sample of normals with unknown parameters.
Sample mean X̄ and unbiased sample variance S^2 = (1/(n−1)) Σ (X_i − X̄)^2.
- (n−1)S^2 / σ^2 has a chi-squared distribution with parameter n−1
- (√n / σ) · (X̄ − μ) has a standard normal distribution
- If Z is standard normal and W is chi-squared with parameter n, independent, then
  Z / √(W/n) ~ t_n.
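The Z/√(W/n) construction can be checked by simulation against the known t moments (mean 0 for n > 1, variance n/(n−2) for n > 2); sample size and seed below are arbitrary illustrative choices:

```python
# Simulation sketch of the t construction: Z standard normal, W chi-squared
# with n degrees of freedom, independent; then T = Z / sqrt(W/n) ~ t_n.
import numpy as np

rng = np.random.default_rng(2)
n = 10
z = rng.standard_normal(200_000)
w = rng.chisquare(n, 200_000)
t = z / np.sqrt(w / n)

# t_n has mean 0 (n > 1) and variance n/(n-2) (n > 2)
print(t.mean())                 # ≈ 0
print(t.var())                  # ≈ 10/8 = 1.25
```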
Multivariate normal distribution
Expectation vector and covariance matrix:
E(X) = (E(X_1), …, E(X_k))^T = μ = (μ_1, …, μ_k)^T
Cov(X) is the k×k matrix whose (i,j) entry is Cov(X_i, X_j) (first row: Cov(X_1,X_1), Cov(X_1,X_2), …).
Diagonal entries are the variances: σ_ii = σ_i^2.
Off-diagonal entries: σ_ij = ρ_ij · σ_i · σ_j, where ρ_ij is the correlation coefficient.
Lemma 6.3: transformation of a multivariate normal
Affine transformation Y = AX + b (in the bivariate case A is a 2×2 matrix, b a 2×1 vector):
E[Y] = A E[X] + b
Cov(Y) = A Cov(X) A^T
Remember the transpose!
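Both formulas can be verified by simulation; a sketch in which μ, Σ, A and b are arbitrary illustrative choices:

```python
# Simulation check of the affine transformation rules for Y = AX + b:
#   E[Y] = A E[X] + b,   Cov(Y) = A Cov(X) A^T  (note the transpose).
# mu, Sigma, A and b are arbitrary illustrative choices.
import numpy as np

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
b = np.array([4.0, 5.0])

rng = np.random.default_rng(3)
X = rng.multivariate_normal(mu, Sigma, size=200_000)
Y = X @ A.T + b                  # row-wise affine transformation

print(Y.mean(axis=0))            # ≈ A @ mu + b
print(np.cov(Y.T))               # ≈ A @ Sigma @ A.T
```

Using A @ Sigma @ A (without the transpose) gives the wrong matrix whenever A is not symmetric, which is exactly the mistake the card warns about.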