4. Expected Values Flashcards

1
Q

Expectation

Discrete Case

A

E(X) = Σ xi*p(xi)
-limitation: if it’s an infinite sum and the xi take both positive and negative values, the sum could fail to converge (or depend on the order of summation), so we restrict to cases where the sum converges absolutely:
Σ |xi| p(xi) < ∞
-otherwise the expectation is undefined
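-a quick numeric check (Python sketch; the pmf values here are made up for illustration):

# E(X) = Σ xi*p(xi) for a toy pmf
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]
ex = sum(x * p for x, p in zip(xs, ps))
print(ex)  # 2.1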

2
Q

Expectation

Continuous Case

A

E(X) = ∫ x*f(x) dx

  • integral between -∞ and +∞
  • if ∫ |x|*f(x) dx = ∞ we say that the expectation is undefined
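-a sketch of the same integral done numerically (Python with scipy; the exponential pdf and rate λ=2 are illustrative choices):

from scipy.integrate import quad
import numpy as np

lam = 2.0
f = lambda x: lam * np.exp(-lam * x)         # exponential pdf, zero below 0
ex, _ = quad(lambda x: x * f(x), 0, np.inf)  # ∫ x*f(x) dx
print(ex)  # ≈ 0.5 = 1/lam
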
3
Q

Expectation of Gamma Distribution

A

-for the gamma distribution with shape parameter α and rate parameter λ:
E(X) = α/λ
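-a numeric sanity check (Python sketch using scipy.stats; note scipy parameterizes gamma by shape a and scale = 1/λ):

from scipy.stats import gamma
alpha, lam = 3.0, 2.0  # illustrative values
print(gamma(a=alpha, scale=1/lam).mean())  # 1.5 = alpha/lam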

4
Q

Expectation of Exponential Distribution

A

-the exponential distribution is just the gamma distribution with parameter α=1
E(X) = 1/λ

5
Q

Expectation of Normal Distribution

A

E(X) = µ

6
Q

Expectations of Functions of Random Variables

Theorem A - Discrete Case

A

-let g be a fixed function
E(g(X)) = Σ g(xi) p(xi)
-with limitation Σ |g(xi)| p(xi) < ∞
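-a quick check of Theorem A (Python sketch; the pmf and g are illustrative):

# E(g(X)) = Σ g(xi) p(xi) with g(x) = x²
xs = [1, 2, 3]
ps = [0.2, 0.5, 0.3]
eg = sum(x**2 * p for x, p in zip(xs, ps))
print(eg)  # 4.9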

7
Q

Expectations of Functions of Random Variables

Theorem A - Continuous Case

A

-let g be a fixed function
E(g(X)) = ∫ g(x)*f(x) dx
-with limitation ∫ |g(x)| f(x) dx < ∞

8
Q

Expectations of Functions of Random Variables

Theorem B - Discrete Case

A

-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = Σ g(x1,…,xn) p(x1,…,xn)
-with:
Σ |g(x1,…,xn)| p(x1,…,xn) < ∞

9
Q

Expectations of Functions of Random Variables

Theorem B - Continuous Case

A

-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = ∫…∫ g(x1,…,xn) f(x1,…,xn) dx1…dxn
-with:
∫…∫ |g(x1,…,xn)| f(x1,…,xn) dx1…dxn < ∞

10
Q

Expectations of Functions of Random Variables

Theorem C

A

-suppose X1,…,Xn are jointly distributed random variables with expectation E(Xi) and Y = a + Σ biXi
-then:
E(Y) = a + Σ bi E(Xi)
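-a Monte Carlo sanity check (Python sketch; the coefficients and means are made up, and the Xi are drawn independently only for convenience, since the theorem needs no independence):

import numpy as np

rng = np.random.default_rng(0)
a, b = 5.0, np.array([2.0, -1.0, 0.5])
X = rng.normal(loc=[1.0, 2.0, 3.0], size=(100_000, 3))
y = a + X @ b
print(y.mean())  # ≈ a + Σ bi E(Xi) = 5 + 2 - 2 + 1.5 = 6.5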

11
Q

Variance

Definition

A

-the variance of a random variable X is defined as:

Var(X) = E[(X - E(X))²]

12
Q

Standard Deviation

Definition

A

σ = √Var(X)

13
Q

Variance

Y= a + bX Theorem

A

-if Y = a + bX, then:

Var(Y) = b²Var(X)

14
Q

Variance

Alternative Form Theorem

A

-the variance of X, if it exists, may also be computed as:

Var(X) = E(X²) - (E(X))²
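-a quick empirical check (Python sketch; the sample is illustrative):

import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)
lhs = x.var()                          # mean squared deviation
rhs = (x**2).mean() - x.mean()**2      # E(X²) - (E(X))²
print(lhs, rhs)  # identical (≈ 4 for this distribution)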

15
Q

Covariance

Definition

A

-for jointly distributed random variables X and Y, we define the covariance of X and Y as:
Cov(X,Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y)
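-a quick check of the moment form against numpy's sample covariance (Python sketch; the dependence y = 3x + noise is made up):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 3 * x + rng.normal(size=100_000)
cov = (x * y).mean() - x.mean() * y.mean()  # E(XY) - E(X)E(Y)
print(cov, np.cov(x, y, ddof=0)[0, 1])      # both ≈ 3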

16
Q

Relationship Between Variance and Covariance

A

-variance is just the covariance of a random variable with itself:
Var(X) = Cov(X,X)

17
Q

Correlation Coefficient

Definition

A

-we define the correlation coefficient of X and Y:
ρ = Cor(X,Y) = Cov(X,Y) / √[Var(X)Var(Y)]
-it measures the linear relationship between two random variables and takes values:
-1 ≤ ρ ≤ 1 (equivalently |ρ| ≤ 1)
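-a quick numeric illustration (Python sketch; the linear-plus-noise relationship is made up):

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = -2 * x + rng.normal(size=100_000)
print(np.corrcoef(x, y)[0, 1])  # ≈ -2/√5 ≈ -0.894, inside [-1, 1]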

18
Q

Variance of the Sum of Random Variables

A
-suppose X1,…,Xn are random variables with joint pdf:
f(x) = f(x1,x2,…,xn)
-we have:
Var(ΣXi) = Σ Var(Xi) + 2 ΣΣ Cov(Xi,Xj)
-where the first sum is from i=1 to i=n
-and the double sum is over all pairs with i < j
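-a quick check with two correlated normals (Python sketch; the covariance matrix is illustrative):

import numpy as np

rng = np.random.default_rng(0)
X = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 2.0]], size=200_000)
lhs = X.sum(axis=1).var()
rhs = 1.0 + 2.0 + 2 * 0.5  # Σ Var(Xi) + 2 Σ Cov(Xi,Xj) = 4
print(lhs, rhs)            # both ≈ 4
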
19
Q

Independence and Covariance

A

-suppose X and Y are independent random variables, then:
Cov(X,Y) = 0
-BUT zero covariance does not generally imply independence!!

20
Q

Covariance of the Sum

A
-suppose:
U = a + Σ bi Xi
-and
V = c + Σ dj Yj
-then the covariance:
Cov(U,V) = ΣΣ bi dj Cov(Xi,Yj)
-where the sums are from i=1 to i=n, and j=1 to j=m
21
Q

Moment

Definition

A

-suppose X is a random variable with pdf f, define:
E(X^n) = ∫ x^n f(x) dx
-where the integral is from -∞ to +∞
-we call E(X^n) the nth moment of X

22
Q

Variance

Moment Definition

A

Var(X) = second moment - (first moment)²

23
Q

Moment Generating Function

Definition

A

-suppose X is a random variable with pdf f
-let M(t) = E(e^(tX))
M(t) = ∫ e^(tx) f(x) dx
-where the integral is from -∞ to +∞
-we call M(t) the moment generating function because:
d^n M(t) / dt^n |t=0 = E(X^n)
-i.e. the nth derivative of M(t) with respect to t evaluated at t=0 is the expectation of X to the power n, or the nth moment of X
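-a symbolic check (Python sketch with sympy, using the exponential MGF M(t) = λ/(λ-t) as an illustrative example):

import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)                 # MGF of Exponential(lam), valid for t < lam
print(sp.diff(M, t).subs(t, 0))     # E(X)  = 1/lam
print(sp.diff(M, t, 2).subs(t, 0))  # E(X²) = 2/lam²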

24
Q

Two Random Variables With the Same Generating Function

A

-if two random variables have the same moment generating function, then they have the same distribution: Fx(x) = Fy(x) for ALMOST all x

25
Q

Moments of Sums of Independent Random Variables

A

-suppose X1,…,Xn are independent random variables, define:
Y = Σ Xi
-then:
My(t) = ∏ Mxi(t)
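-a symbolic illustration (Python sketch with sympy; the two normal MGFs are illustrative):

import sympy as sp

t = sp.symbols('t')
M1 = sp.exp(1*t + 4*t**2/2)  # MGF of N(1, 4)
M2 = sp.exp(2*t + 9*t**2/2)  # MGF of N(2, 9)
print(sp.simplify(M1 * M2))  # equals exp(3t + 13t²/2), the MGF of N(3, 13)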

26
Q

Facts About Moment Generating Functions

A
M_aY(t) = M_Y(at)
M_(Y+a)(t) = e^(at) M_Y(t)
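-a symbolic check of both facts (Python sketch with sympy, using the standard normal MGF exp(t²/2) as an illustrative M_Y):

import sympy as sp

t, a = sp.symbols('t a')
M_Y = sp.exp(t**2 / 2)   # MGF of Y ~ N(0, 1)
print(M_Y.subs(t, a*t))  # M_aY(t) = M_Y(at) = exp(a²t²/2)
print(sp.exp(a*t) * M_Y) # M_(Y+a)(t) = e^(at) M_Y(t)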