4. Expected Values Flashcards
Expectation
Discrete Case
E(X) = Σ xi*p(xi)
-limitation: if the sum has infinitely many terms and the xi take both positive and negative values, the sum could fail to converge (or its value could depend on the order of summation), so we restrict to cases where the sum converges absolutely:
Σ |xi| p(xi) < ∞
-otherwise the expectation is undefined
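-a minimal Python sketch of the discrete formula (the support and probabilities below are made-up illustration values):
```python
# Discrete expectation: E(X) = Σ xi * p(xi)
# The support xs and probabilities ps are made-up illustration values.
xs = [-1, 0, 2, 5]
ps = [0.1, 0.4, 0.3, 0.2]

assert abs(sum(ps) - 1.0) < 1e-12           # p must be a valid pmf
e_x = sum(x * p for x, p in zip(xs, ps))    # E(X) = Σ xi * p(xi)
print(e_x)                                  # -> 1.5
```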
Expectation
Continuous Case
E(X) = ∫ x*f(x) dx
- integral between -∞ and +∞
- if ∫ |x| f(x) dx = ∞ we say that the expectation is undefined
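-a quick numerical sketch of the continuous formula, assuming SciPy is available; the exponential density and λ = 2 are my illustration choices, not part of the card:
```python
import numpy as np
from scipy.integrate import quad

# Continuous expectation: E(X) = ∫ x f(x) dx over the support of X.
# Illustration: exponential density f(x) = lam * exp(-lam*x) on [0, ∞), lam = 2 (made-up value).
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)

e_x, _err = quad(lambda x: x * f(x), 0.0, np.inf)
print(e_x, 1.0 / lam)   # both ≈ 0.5
```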
Expectation of Gamma Distribution
E(X) = α/λ
Expectation of Exponential Distribution
-the exponential distribution is just the gamma distribution with parameter α=1
E(X) = 1/λ
Expectation of Normal Distribution
E(X) = µ
Expectations of Functions of Random Variables
Theorem A - Discrete Case
-let g(x) be a fixed function
E(g(X)) = Σ g(xi) p(xi)
-with limitation Σ |g(xi)| p(xi) < ∞
Expectations of Functions of Random Variables
Theorem A - Continuous Case
-let g(x) be a fixed function
E(g(X)) = ∫ g(x)*f(x) dx
-with limitation ∫ |g(x)| f(x) dx < ∞
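-a numerical sketch of the continuous case (often called the law of the unconscious statistician), assuming SciPy; the choice X ~ N(0,1) and g(x) = x² is mine, for which E(g(X)) = 1:
```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# E(g(X)) = ∫ g(x) f(x) dx: no need to derive the distribution of g(X) first.
# Illustration: X ~ N(0,1), g(x) = x^2, so E(g(X)) = Var(X) = 1.
g = lambda x: x ** 2

e_gx, _err = quad(lambda x: g(x) * norm.pdf(x), -np.inf, np.inf)
print(e_gx)   # ≈ 1.0
```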
Expectations of Functions of Random Variables
Theorem B - Discrete Case
-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = Σ g(x1,…,xn) p(x1,…,xn)
-with:
Σ |g(x1,…,xn)| p(x1,…,xn) < ∞
Expectations of Functions of Random Variables
Theorem B - Continuous Case
-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = ∫…∫ g(x1,…,xn) f(x1,…,xn) dx1….dxn
-with:
∫…∫ |g(x1,…,xn)| f(x1,…,xn) dx1…dxn < ∞
Expectations of Functions of Random Variables
Theorem C
-suppose X1,…,Xn are jointly distributed random variables with expectations E(Xi), and let Y = a + Σ biXi
-then:
E(Y) = a + Σ biE(Xi)
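-a simulation sketch of Theorem C; the coefficients and distributions are made-up, and X1, X2 are deliberately dependent to show the theorem does not require independence:
```python
import numpy as np

# Linearity of expectation: E(a + Σ bi Xi) = a + Σ bi E(Xi), even for dependent Xi.
rng = np.random.default_rng(0)
n = 1_000_000

x1 = rng.normal(loc=2.0, scale=1.0, size=n)       # E(X1) = 2
x2 = x1 + rng.exponential(scale=3.0, size=n)      # dependent on X1, E(X2) = 2 + 3 = 5

a, b1, b2 = 10.0, 0.5, -2.0
y = a + b1 * x1 + b2 * x2

print(y.mean())                    # ≈ 10 + 0.5*2 - 2*5 = 1.0
print(a + b1 * 2.0 + b2 * 5.0)     # exact value from the theorem
```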
Variance
Definition
-the variance of a random variable X is defined as:
Var(X) = E[(X - E(X))²]
Standard Deviation
Definition
σ = √Var(X)
Variance
Y= a + bX Theorem
-if Y = a + bX, then:
Var(Y) = b²Var(X)
Variance
Alternative Form Theorem
-the variance of X, if it exists, may also be computed as:
Var(X) = E(X²) - (E(X))²
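-a quick sample check of the alternative form (the gamma parameters are arbitrary illustration values):
```python
import numpy as np

# Check Var(X) = E(X^2) - (E(X))^2 on a sample.
# Gamma(shape=3, scale=2) is a made-up choice; its true variance is 3 * 2^2 = 12.
rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=2.0, size=1_000_000)

lhs = np.mean((x - x.mean()) ** 2)          # definition: E[(X - E(X))^2]
rhs = np.mean(x ** 2) - np.mean(x) ** 2     # alternative form
print(lhs, rhs)                             # equal up to floating-point error, ≈ 12
```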
Covariance
Definition
-for jointly distributed random variables X and Y with finite expectations, we define the covariance of X and Y as:
Cov(X,Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X)E(Y)
Relationship Between Variance and Covariance
-variance is just the covariance of a random variable with itself:
Var(X) = Cov(X,X)
Correlation Coefficient
Definition
-we define the correlation coefficient of X and Y:
ρ = Cor(X,Y) = Cov(X,Y) / √[Var(X)Var(Y)]
-it measures the linear relationship between two random variables and takes values:
-1 ≤ ρ ≤ 1
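-a sample-based sketch of ρ; the linear dependence below is constructed so the true correlation is 0.6 (my illustration choice):
```python
import numpy as np

# Sample version of rho = Cov(X,Y) / sqrt(Var(X) Var(Y)).
# Y = 0.6*X + 0.8*Z with Z independent of X gives Var(Y) = 1 and Cor(X,Y) = 0.6.
rng = np.random.default_rng(2)
n = 500_000

x = rng.normal(size=n)
y = 0.6 * x + 0.8 * rng.normal(size=n)

cov_xy = np.mean(x * y) - x.mean() * y.mean()
rho = cov_xy / np.sqrt(x.var() * y.var())
print(rho)                                 # ≈ 0.6, and always in [-1, 1]
```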
Variance of the Sum of Random Variables
-suppose X1,X2,…,Xn are random variables with joint pdf f(x1,x2,…,xn)
-we have:
Var(ΣXi) = Σ Var(Xi) + 2 ΣΣ Cov(Xi,Xj)
-where the first sum runs from i=1 to i=n
-and the double sum runs over all pairs with i < j
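-a numerical check of the variance-of-a-sum formula, assuming NumPy; the 3×3 covariance matrix below is a made-up illustration:
```python
import numpy as np

# Var(Σ Xi) = Σ Var(Xi) + 2 Σ_{i<j} Cov(Xi, Xj), checked on correlated normal samples.
rng = np.random.default_rng(3)
cov = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, -0.2],
                [0.3, -0.2, 1.5]])          # made-up positive definite covariance matrix
x = rng.multivariate_normal(mean=np.zeros(3), cov=cov, size=1_000_000)

s = x.sum(axis=1)
lhs = s.var()                               # variance of the sum, estimated directly

c_hat = np.cov(x, rowvar=False)             # estimated covariance matrix
rhs = np.trace(c_hat) + 2 * np.sum(np.triu(c_hat, k=1))
print(lhs, rhs)     # both ≈ 2 + 1 + 1.5 + 2*(0.5 + 0.3 - 0.2) = 5.7
```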
Independence and Covariance
-suppose X and Y are independent random variables, then:
Cov(X,Y) = 0
-BUT zero covariance does not generally imply independence!!
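-a sketch of the classic counterexample: X symmetric about 0 and Y = X² are clearly dependent, yet their covariance is 0 (the standard normal choice for X is mine):
```python
import numpy as np

# Y = X^2 is a deterministic function of X (so not independent of X),
# but Cov(X, Y) = E(X^3) - E(X) E(X^2) = 0 when X is symmetric about 0.
rng = np.random.default_rng(4)
x = rng.normal(size=1_000_000)
y = x ** 2

cov_xy = np.mean(x * y) - x.mean() * y.mean()
print(cov_xy)     # ≈ 0, despite the strong dependence
```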
Covariance of the Sum
-suppose:
U = a + Σ biXi
V = c + Σ djYj
-then the covariance is:
Cov(U,V) = ΣΣ bidj Cov(Xi,Yj)
-where the sums run from i=1 to i=n and j=1 to j=m
Moment
Definition
-suppose X is a random variable with pdf f, define:
E(X^n) = ∫ x^n f(x) dx
-where the integral is from -∞ to +∞
-we call E(X^n) the nth moment of X
Variance
Moment Definition
Var(X) = second moment - (first moment)²
Moment Generating Function
Definition
-suppose X is a random variable with pdf f
-let M(t) = E(e^(tX))
M(t) = ∫ e^(tx) f(x) dx
-where the integral is from -∞ to +∞
-we call M(t) the moment generating function because:
d^n M(t) / dt^n |_(t=0) = E(X^n)
-i.e. the nth derivative of M(t) with respect to t evaluated at t=0 is the expectation of X to the power n, or the nth moment of X
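-a symbolic sketch of this property using SymPy; the exponential MGF M(t) = λ/(λ-t) for t < λ is a standard result, used here only as an illustration:
```python
import sympy as sp

# Moments from the MGF: the nth derivative of M(t) at t = 0 is E(X^n).
# Illustration: exponential(lam) has M(t) = lam / (lam - t) for t < lam.
t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)

first_moment = sp.diff(M, t, 1).subs(t, 0)    # E(X)   = 1/lam
second_moment = sp.diff(M, t, 2).subs(t, 0)   # E(X^2) = 2/lam^2
print(sp.simplify(first_moment), sp.simplify(second_moment))
```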
Two Random Variables With the Same Moment Generating Function
-if two random variables have the same moment generating function (finite on an open interval around t=0), then they have the same distribution: F_X(x) = F_Y(x) for all x
Moments of Sums of Independent Random Variables
-suppose X1,X2,…,Xn are independent random variables, and define:
Y = Σ Xi
-then:
M_Y(t) = ∏ M_Xi(t)
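-a Monte Carlo sketch of this product rule; the exponential terms, λ = 2, and t = 0.7 are my illustration choices (note t < λ so the MGF exists):
```python
import numpy as np

# MGF of a sum of independent variables = product of their MGFs.
# Y = X1 + X2 + X3 with independent exponential(lam) Xi, whose MGF is lam/(lam - t).
rng = np.random.default_rng(5)
lam, n_terms, t = 2.0, 3, 0.7

x = rng.exponential(scale=1 / lam, size=(1_000_000, n_terms))
y = x.sum(axis=1)

empirical = np.mean(np.exp(t * y))                 # Monte Carlo estimate of M_Y(t)
product = (lam / (lam - t)) ** n_terms             # product of the individual MGFs
print(empirical, product)                          # both ≈ 3.6
```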
Facts About Moment Generating Functions
M_aY(t) = M_Y(at)
M_(Y+a)(t) = e^(at) M_Y(t)