Endterm Flashcards

1
Q

What can you say about the density function of Y = g(X) if g is differentiable and one-to-one?

A

fY(y) = fX(g⁻¹(y)) * 1 / |g'(g⁻¹(y))|
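A quick numeric check of the change-of-variables formula (a sketch; the Uniform(0,1) density and the map g(x) = 2x + 1 are illustrative choices, so Y = g(X) should be Uniform(1,3) with density 1/2):

```python
# Sketch: f_Y(y) = f_X(g^{-1}(y)) / |g'(g^{-1}(y))| for a one-to-one,
# differentiable g. Illustrative choices: X ~ Uniform(0,1), g(x) = 2x + 1.

def f_X(x):
    """Density of Uniform(0,1)."""
    return 1.0 if 0.0 < x < 1.0 else 0.0

def g_inv(y):
    """Inverse of g(x) = 2x + 1."""
    return (y - 1.0) / 2.0

def g_prime(x):
    """Derivative of g (constant here)."""
    return 2.0

def f_Y(y):
    """Density of Y = g(X) via the change-of-variables formula."""
    x = g_inv(y)
    return f_X(x) / abs(g_prime(x))
```

Since Y is Uniform(1,3), f_Y(y) evaluates to 0.5 inside (1,3) and 0 outside.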

2
Q

What can you say about the distribution function of Y = g(X) if g is differentiable and strictly increasing? Strictly decreasing?

A
g strictly increasing: FY(y) = FX(g⁻¹(y))
g strictly decreasing: FY(y) = 1 − FX(g⁻¹(y)) (for continuous X)
3
Q

What is a joint probability mass function (discrete random variable)?

A

p(k1, k2, …, kn) = P(X1 = k1, X2 = k2, …, Xn = kn)

4
Q

What is the marginal probability mass function of Xj?

A

PXj(k) = Σ (l1, … , lj-1, lj+1, …, ln) p(l1, … , lj-1, k, lj+1, …, ln)
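A minimal sketch of the marginalization sum: fix the j-th coordinate at k and sum the joint pmf over all the other coordinates. The joint pmf below (two independent fair coin flips encoded as 0/1) is an illustrative choice:

```python
# Sketch: p_{X_j}(k) = sum of the joint pmf over outcomes whose j-th
# coordinate equals k. Illustrative joint pmf: two fair 0/1 coin flips.
from itertools import product

joint = {(a, b): 0.25 for a, b in product([0, 1], repeat=2)}

def marginal(joint_pmf, j, k):
    """Marginal pmf of X_j evaluated at k."""
    return sum(p for outcome, p in joint_pmf.items() if outcome[j] == k)
```

Here marginal(joint, 0, 1) recovers P(X1 = 1) = 0.5, and the marginal sums to 1 over its support.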

5
Q

How can you find E[g(X1,…,Xn)]?

A

E[g(X1,…,Xn)]=Σ(k1,…,kn) g(k1,…,kn)*p(k1,…,kn)

6
Q

When are two continuous random variables X and Y jointly continuous?

A

X and Y are jointly continuous if there exists a joint density function fX,Y such that, for every (measurable) set B ⊆ ℝ²,

P((X,Y)∈B) = ∫∫(B) fX,Y(x,y) dx dy

7
Q

What properties does the density function f(x,y) have?

A

f(x,y)>=0

∫∫(both -oo to +oo) f(x,y) dx dy = 1

8
Q

What is the marginal density function of X?

A

fX(x) = ∫(-oo to +oo) fX,Y(x,y) dy
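The marginal can be checked numerically by integrating the joint density over y. A sketch, using the illustrative joint density f(x,y) = x + y on the unit square, whose marginal is fX(x) = x + 1/2:

```python
# Sketch: f_X(x) = integral of f_{X,Y}(x, y) dy, approximated by a midpoint
# Riemann sum. Illustrative joint density: f(x,y) = x + y on [0,1]^2.

def f_XY(x, y):
    return x + y if (0 <= x <= 1 and 0 <= y <= 1) else 0.0

def f_X(x, n=10_000):
    """Midpoint-rule approximation of the integral over y in [0, 1]."""
    h = 1.0 / n
    return sum(f_XY(x, (i + 0.5) * h) for i in range(n)) * h
```

Since the integrand is linear in y, the midpoint rule is essentially exact here: f_X(0.3) ≈ 0.3 + 0.5 = 0.8.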

9
Q

What is E[g(X,Y)] for continuous random variables?

A

E[g(X,Y)] = ∫∫(-oo to +oo) g(x,y) * fX,Y(x,y) dx dy

10
Q

What can you say about the joint distribution functions (for discrete and continuous) if X and Y are independent?

A
If X and Y are independent, then
P(X∈A, Y∈B) = P(X∈A) · P(Y∈B)
so
discrete: PX,Y(x,y) = PX(x) · PY(y)
continuous: fX,Y(x,y) = fX(x) · fY(y)
11
Q

What is a joint cumulative distribution function?

A

F(s1, …, sn) = P(X1 ≤ s1, …, Xn ≤ sn)

12
Q

What is the probability mass function of X+Y?

A

PX+Y(n) = (PX ∗ PY)(n) = Σ(k) PX(k) · PY(n−k) = Σ(m) PX(n−m) · PY(m)
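The convolution sum is easy to check on a small example. A sketch, using the illustrative case of two fair six-sided dice:

```python
# Sketch: p_{X+Y}(n) = sum over k of p_X(k) * p_Y(n - k).
# Illustrative pmfs: two independent fair six-sided dice.

pX = {k: 1 / 6 for k in range(1, 7)}
pY = {k: 1 / 6 for k in range(1, 7)}

def pmf_sum(pX, pY, n):
    """Convolution of two pmfs evaluated at n."""
    return sum(pX[k] * pY.get(n - k, 0.0) for k in pX)
```

For two dice this recovers the familiar values P(X+Y = 7) = 1/6 and P(X+Y = 2) = 1/36, and the convolved pmf sums to 1.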

13
Q

What is the density function of X+Y?

A

fX+Y(z) = (fX ∗ fY)(z) = ∫(−∞ to ∞) fX(x) · fY(z−x) dx = ∫(−∞ to ∞) fX(z−x) · fY(x) dx

14
Q

What does the linearity of expectation say?

A

E[g1(X1) + g2(X2) + … + gn(Xn)] = E[g1(X1)] + E[g2(X2)] + … + E[gn(Xn)]
E(X1 + … + Xn) = E(X1) + E(X2) + … + E(Xn)

15
Q

What can you say about expectation if X1, …, Xn are independent random variables?

A

If X1, …, Xn are independent, then E[g1(X1) · g2(X2) ⋯ gn(Xn)] = E[g1(X1)] · E[g2(X2)] ⋯ E[gn(Xn)]: the expectation of a product of functions of independent random variables is the product of the expectations.

16
Q

What is the covariance?

A

Cov(X,Y) = E(XY) - E(X)*E(Y)

17
Q

What can you say if Cov(X,Y)>0? <0? =0?

A

> 0 : X and Y tend to deviate together above or below their means
<0 : X and Y tend to deviate in opposite directions
=0 : X and Y are uncorrelated (independence implies Cov(X,Y) = 0, but zero covariance does not imply independence)
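The standard counterexample for the "= 0" case can be computed directly. A sketch, using the illustrative construction X uniform on {−1, 0, 1} with Y = X², where Cov(X,Y) = 0 even though Y is a deterministic function of X:

```python
# Sketch: zero covariance without independence.
# Illustrative construction: X uniform on {-1, 0, 1}, Y = X^2.

support = [-1, 0, 1]
p = 1 / 3  # P(X = x) for each x in the support

E_X  = sum(x * p for x in support)         # = 0
E_Y  = sum(x**2 * p for x in support)      # = 2/3
E_XY = sum(x * x**2 * p for x in support)  # E(X^3) = 0

cov = E_XY - E_X * E_Y                     # = 0

# Dependence: P(X=0, Y=0) = 1/3, but P(X=0) * P(Y=0) = (1/3) * (1/3) = 1/9.
p_joint = p        # only x = 0 gives (X, Y) = (0, 0)
p_prod  = p * p    # product of the marginals
```

The covariance vanishes while the joint pmf does not factor, so X and Y are uncorrelated but dependent.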

18
Q

What are the properties of the covariance?

A

Cov(X,Y)=Cov(Y,X)
Cov(X,X) = Var(X)
Cov(aX+b,Y)=aCov(X,Y)

19
Q

What is the correlation of X and Y?

A

Corr(X,Y) = Cov(X,Y) / (√Var(X) · √Var(Y))

-1<=Corr(X,Y)<=1

20
Q

What does the central limit theorem say?

A

X1, X2, … independent, identically distributed with E(Xi) = μ, Var(Xi) = σ² finite, and a ≤ b:
lim(n→∞) P(a ≤ (X1 + … + Xn − nμ) / (σ·√n) ≤ b)
= ∫(a to b) 1/√(2π) · e^(−x²/2) dx
= Φ(b) − Φ(a)
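A Monte Carlo sketch of the statement, using sums of Uniform(0,1) variables (μ = 1/2, σ² = 1/12); the sample sizes and interval are illustrative choices:

```python
# Sketch: standardized sums of iid Uniform(0,1) variables should land in
# [a, b] with probability close to Phi(b) - Phi(a). Sizes are illustrative.
import math
import random

random.seed(0)
n, trials = 30, 20_000
mu, sigma = 0.5, math.sqrt(1 / 12)
a, b = -1.0, 1.0

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

hits = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / (sigma * math.sqrt(n))  # standardized sum
    if a <= z <= b:
        hits += 1

empirical   = hits / trials
theoretical = Phi(b) - Phi(a)  # about 0.68 for [-1, 1]
```

The empirical frequency should approximate Φ(1) − Φ(−1) up to Monte Carlo error.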

21
Q

How can you find E(X) from the conditional expectation?

A

E(X) = Σ(i=1 to n) E(X | Bi) · P(Bi), where B1, …, Bn partition the sample space with P(Bi) > 0
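A minimal sketch of this law of total expectation, with the illustrative partition "even face" / "odd face" for a fair die roll X:

```python
# Sketch: E(X) = sum over i of E(X | Bi) * P(Bi).
# Illustrative setup: X is a fair six-sided die; B1 = "even", B2 = "odd".

faces = range(1, 7)
B1 = [k for k in faces if k % 2 == 0]  # even faces: 2, 4, 6
B2 = [k for k in faces if k % 2 == 1]  # odd faces:  1, 3, 5

P_B1, P_B2 = len(B1) / 6, len(B2) / 6  # each 1/2
E_given_B1 = sum(B1) / len(B1)         # E(X | even) = 4
E_given_B2 = sum(B2) / len(B2)         # E(X | odd)  = 3

E_X = E_given_B1 * P_B1 + E_given_B2 * P_B2  # = 3.5
```

The weighted average 4 · 1/2 + 3 · 1/2 recovers the unconditional mean 3.5.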

22
Q

What is the conditional probability mass function of X given Y = y? The conditional expectation E(X | Y = y)? The expectation E(X)? PX(x)? PX,Y(x,y)?

A

PX|Y(x|y) = P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y) = PX,Y(x,y) / PY(y)

E(X | Y=y) = Σ(x) x · PX|Y(x|y)

E(X) = Σ(y) E(X | Y=y) · PY(y)

PX(x) = Σ(y) PX|Y(x|y) · PY(y) = Σ(y) PX,Y(x,y)

PX,Y(x,y) = PX|Y(x|y) · PY(y)

23
Q

What is the memoryless property?

A

P(X > t+s | X > t) = P(X > s)
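A sketch verifying the property for the exponential distribution, whose survival function is P(X > t) = e^(−λt); the rate λ and the times below are illustrative:

```python
# Sketch: memorylessness of the exponential distribution.
# P(X > t+s | X > t) = P(X > t+s) / P(X > t) should equal P(X > s).
import math

lam = 1.5  # illustrative rate

def survival(t):
    """P(X > t) for X ~ Exponential(lam)."""
    return math.exp(-lam * t)

t, s = 2.0, 0.7  # illustrative times
lhs = survival(t + s) / survival(t)  # P(X > t+s | X > t)
rhs = survival(s)                    # P(X > s)
```

Algebraically e^(−λ(t+s)) / e^(−λt) = e^(−λs), so the two sides agree for any t, s ≥ 0.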