Random vectors and Joint distribution Flashcards

1
Q

Definition Joint Density

A

If X and Y are discrete random variables, then (X, Y) is also discrete, and we denote its density by p(x,y)

More explicitly

p(x,y) = P(X=x, Y=y)

called the joint density of X and Y

▪ In this context the densities pₓ and py are called the marginal densities
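A joint density can be written concretely as a table with one probability per pair (x, y); the distribution below is a made-up illustration, not one from the cards:

```python
from fractions import Fraction

# Joint density p(x, y) = P(X = x, Y = y) for a toy pair:
# X is a fair bit (0/1) and Y always equals X.
p = {
    (0, 0): Fraction(1, 2),
    (1, 1): Fraction(1, 2),
}

# Any joint density must sum to 1 over all pairs (x, y).
total = sum(p.values())
print(total)  # 1
```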

2
Q

Marginal densities
▪ Theorems

A

1)
▪ pₓ (x) = Σy pₓ,y (x,y)
▪ py (y) = Σₓ pₓ,y (x,y)

2) Let X and Y be two discrete random variables, taking values in E and F respectively, and

▪ f: E × F → ℝ, then

▪ E(f(X,Y)) = Σₓ,y f(x,y) pₓ,y(x,y)
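Both formulas can be checked on a small joint table; the uniform table and the choice f(x, y) = x + y below are assumed examples:

```python
from collections import defaultdict
from fractions import Fraction

# Joint density of two independent fair bits (example values).
p = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

# 1) Marginals: p_X(x) = sum over y of p(x, y), and symmetrically for p_Y.
pX, pY = defaultdict(Fraction), defaultdict(Fraction)
for (x, y), prob in p.items():
    pX[x] += prob
    pY[y] += prob

# 2) E[f(X, Y)] = sum over (x, y) of f(x, y) * p(x, y), here f(x, y) = x + y.
Ef = sum((x + y) * prob for (x, y), prob in p.items())
print(dict(pX), dict(pY), Ef)
```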

3
Q

Independent random variables
▪ Definition

A

Let:
▪ X be a random variable with values in E
▪ Y be a random variable with values in F

X,Y are independent if for every
▪ A ⊆ E
▪ B ⊆ F

P(X∈A, Y∈B) = P(X∈A) * P(Y∈B)
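The definition can be tested on a finite example by picking events A and B and comparing both sides; the joint table and the sets A, B below are arbitrary choices:

```python
from fractions import Fraction

# Two independent fair bits: p(x, y) = 1/4 for every pair (example values).
p = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

A, B = {0}, {0, 1}  # arbitrary events A ⊆ E and B ⊆ F

P_joint = sum(prob for (x, y), prob in p.items() if x in A and y in B)
P_A = sum(prob for (x, y), prob in p.items() if x in A)
P_B = sum(prob for (x, y), prob in p.items() if y in B)
print(P_joint == P_A * P_B)  # True for this independent pair
```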

4
Q

Independent random variables
▪ Theorem

A

1)
X and Y are independent iff
▪ ∀x∈E, ∀y∈F
▪ pₓ,y(x,y) = pₓ(x) * py(y)

2)
Let X,Y
▪ independent
▪ discrete
▪ real valued random variables
Then

E(XY) = E(X) * E(Y)
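The product rule for expectations can be verified numerically; the joint table below is independent by construction (values are an assumed example):

```python
from fractions import Fraction

# Independent by construction: p(x, y) = pX(x) * pY(y), each marginal uniform.
p = {(x, y): Fraction(1, 4) for x in (1, 2) for y in (3, 4)}

E_XY = sum(x * y * prob for (x, y), prob in p.items())
E_X = sum(x * prob for (x, y), prob in p.items())
E_Y = sum(y * prob for (x, y), prob in p.items())
print(E_XY, E_X * E_Y)  # both equal 21/4
```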

5
Q

Covariance
▪ Definition

A

Let X,Y be two real valued random variables
The covariance between X and Y is defined by

Cov(X,Y) = E[(X−E(X))(Y−E(Y))]

or equivalently

Cov(X,Y) = E(XY) − E(X)E(Y)
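The two forms of the definition can be checked against each other on a small joint table; the dependent pair below is an assumed example:

```python
from fractions import Fraction

# A dependent pair for illustration: X is a fair bit and Y = X.
p = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}

E_X = sum(x * prob for (x, y), prob in p.items())  # 1/2
E_Y = sum(y * prob for (x, y), prob in p.items())  # 1/2

# First form: E[(X - E(X))(Y - E(Y))]
cov1 = sum((x - E_X) * (y - E_Y) * prob for (x, y), prob in p.items())
# Second form: E(XY) - E(X)E(Y)
cov2 = sum(x * y * prob for (x, y), prob in p.items()) - E_X * E_Y
print(cov1, cov2)  # both equal 1/4
```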

6
Q

Covariance
▪ Properties

A

1) Cov(X,Y) = Cov(Y, X)

2) Cov(X,aY + bZ) = aCov(X,Y)+bCov(X,Z)

3) Cov(X,X) = Var(X)

4) If X and Y are independent then Cov(X,Y) = 0

If Cov(X,Y) = 0 we say that X and Y are uncorrelated

We have seen that independent → uncorrelated
(but the converse is not necessarily true)
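The classic counterexample for "uncorrelated but not independent" is X uniform on {−1, 0, 1} with Y = X²; a quick check:

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2.
p = {(x, x * x): Fraction(1, 3) for x in (-1, 0, 1)}

E_X = sum(x * prob for (x, y), prob in p.items())       # 0
E_Y = sum(y * prob for (x, y), prob in p.items())       # 2/3
E_XY = sum(x * y * prob for (x, y), prob in p.items())  # 0
cov = E_XY - E_X * E_Y
print(cov)  # 0 -> X and Y are uncorrelated

# ...but they are not independent: p(1, 1) != pX(1) * pY(1)
pX1 = sum(prob for (x, y), prob in p.items() if x == 1)  # 1/3
pY1 = sum(prob for (x, y), prob in p.items() if y == 1)  # 2/3
print(p[(1, 1)], pX1 * pY1)  # 1/3 vs 2/9
```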

7
Q

Correlation coefficient
▪ Definition

A

The correlation coefficient detects exact or approximate linear relations between random variables

⍴(X,Y) = ( Cov(X,Y) ) / ( √Var(X) * √Var(Y) )
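The formula can be applied directly to a small distribution; below, X is uniform on {0, 1, 2} and Y = 2X + 1 (an assumed example of an exact linear relation):

```python
import math

# rho(X, Y) = Cov(X, Y) / (sqrt(Var X) * sqrt(Var Y)),
# computed for X uniform on {0, 1, 2} and Y = 2X + 1.
pts = [(x, 2 * x + 1) for x in (0, 1, 2)]
n = len(pts)

E_X = sum(x for x, _ in pts) / n
E_Y = sum(y for _, y in pts) / n
cov = sum((x - E_X) * (y - E_Y) for x, y in pts) / n
var_X = sum((x - E_X) ** 2 for x, _ in pts) / n
var_Y = sum((y - E_Y) ** 2 for _, y in pts) / n
rho = cov / (math.sqrt(var_X) * math.sqrt(var_Y))
print(rho)  # ≈ 1.0: an exact linear relation with positive slope
```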

8
Q

Correlation coefficient
▪ Properties

A

▪ ⍴(aX + b, cY + d) = ⍴(X,Y) for a, c > 0
▪ -1 ⩽ ⍴(X,Y) ⩽ 1
▪ ⍴(X,Y) = 1 iff Y=aX + b for some a>0
▪ ⍴(X,Y) = -1 iff Y=aX + b for some a<0
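The affine-invariance property can be checked numerically; the points and the coefficients a = 5, c = 3 below are arbitrary choices:

```python
import math

def rho(pts):
    # Sample correlation for equally weighted points (a uniform distribution).
    n = len(pts)
    ex = sum(x for x, _ in pts) / n
    ey = sum(y for _, y in pts) / n
    cov = sum((x - ex) * (y - ey) for x, y in pts) / n
    vx = sum((x - ex) ** 2 for x, _ in pts) / n
    vy = sum((y - ey) ** 2 for _, y in pts) / n
    return cov / math.sqrt(vx * vy)

pts = [(0, 1), (1, 3), (2, 2)]                      # arbitrary example points
shifted = [(5 * x + 2, 3 * y - 7) for x, y in pts]  # a = 5, c = 3, both > 0
print(rho(pts), rho(shifted))  # equal: rho is invariant under these maps
```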

9
Q

Correlation coefficient
▪ Theorem and Corollary

A

Theorem
Var(X+Y)= Var(X) + Var(Y) + 2Cov(X,Y)

Corollary
If X and Y are uncorrelated (in particular if they are independent), then

Var(X+Y) = Var(X) + Var(Y)
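The theorem can be verified on a small joint table; the dependent pair below is an assumed example:

```python
from fractions import Fraction

# Check Var(X+Y) = Var(X) + Var(Y) + 2 Cov(X,Y) on a small joint table
# (a dependent example: X is a fair bit and Y = X).
p = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}

def E(f):
    # Expectation of f(X, Y) under the joint density p.
    return sum(f(x, y) * prob for (x, y), prob in p.items())

E_X, E_Y = E(lambda x, y: x), E(lambda x, y: y)
var_X = E(lambda x, y: (x - E_X) ** 2)
var_Y = E(lambda x, y: (y - E_Y) ** 2)
cov = E(lambda x, y: (x - E_X) * (y - E_Y))
E_S = E(lambda x, y: x + y)
var_S = E(lambda x, y: (x + y - E_S) ** 2)
print(var_S, var_X + var_Y + 2 * cov)  # both equal 1
```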

10
Q

Random vectors
▪ Notions and results of any dimensions

A

▪ If X₁ ,X₂, …, Xₙ are discrete random variables, their joint density is defined by

px₁,x₂,…,xₙ(x₁, x₂, …, xₙ) = P(X₁ = x₁, X₂ = x₂, …, Xₙ = xₙ)

▪ pxᵢ(xᵢ) = Σxⱼ, j≠i px₁,x₂,…,xₙ(x₁, x₂, …, xₙ)  (summing over all variables except xᵢ)

▪ E[f(X₁, X₂, …, Xₙ)] =
Σx₁,x₂,…,xₙ f(x₁, x₂, …, xₙ) · px₁,x₂,…,xₙ(x₁, x₂, …, xₙ)

▪ X₁, X₂, …, Xₙ are said to be independent if, for all choices of A₁, A₂, …, Aₙ,

P(X₁∈A₁, X₂∈A₂, …, Xₙ∈Aₙ) = ∏ₖ P(Xₖ∈Aₖ)

or equivalently

px₁,x₂,…,xₙ(x₁, x₂, …, xₙ) = ∏ₖ pₓₖ(xₖ)
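The product factorization can be illustrated with three variables; each marginal below is uniform on {0, 1}, which is an assumed example:

```python
from fractions import Fraction
from itertools import product

# For independent X1, X2, X3 the joint density factors into marginals.
# Here each Xk is uniform on {0, 1}.
marginal = {0: Fraction(1, 2), 1: Fraction(1, 2)}
joint = {xs: marginal[xs[0]] * marginal[xs[1]] * marginal[xs[2]]
         for xs in product((0, 1), repeat=3)}

print(sum(joint.values()))  # 1
print(joint[(1, 0, 1)])     # 1/8, the product of the three marginals
```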
