4. Expected Values Flashcards
Expectation
Discrete Case
E(X) = Σ xi*p(xi)
-limitation: if the sum has infinitely many terms and the xi take both positive and negative values, the sum could fail to converge (or its value could depend on the order of summation), so we restrict to cases where it converges absolutely:
Σ |xi| p(xi) < ∞
-otherwise the expectation is undefined
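-a minimal Python sketch of the discrete formula, using a fair six-sided die as an assumed example:

import numpy as np

x = np.arange(1, 7)        # possible values xi
p = np.full(6, 1/6)        # probabilities p(xi), summing to 1
EX = np.sum(x * p)         # E(X) = Σ xi*p(xi)
print(EX)                  # 3.5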
Expectation
Continuous Case
E(X) = ∫ x*f(x) dx
- integral between -∞ and +∞
- if ∫ |x|*f(x) dx = ∞ we say that the expectation is undefined
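-a minimal Python sketch of the continuous formula, using the standard exponential density f(x) = e^(-x), x ≥ 0, as an assumed example:

import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)                     # density on [0, ∞)
EX, _ = quad(lambda x: x * f(x), 0, np.inf)  # ∫ x*f(x) dx
print(EX)                                    # ~1.0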
Expectation of Gamma Distribution
E(X) = α/λ
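-a numerical check of E(X) = α/λ, integrating x*f(x) for an assumed choice α=3, λ=2:

import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

alpha, lam = 3.0, 2.0
f = lambda x: lam**alpha * x**(alpha - 1) * np.exp(-lam * x) / gamma_fn(alpha)
EX, _ = quad(lambda x: x * f(x), 0, np.inf)
print(EX, alpha / lam)   # both ~1.5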
Expectation of Exponential Distribution
-the exponential distribution is just the gamma distribution with parameter α=1
E(X) = 1/λ
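-a quick check with scipy (note: scipy parametrizes the exponential by scale = 1/λ; λ=2 is an assumed example):

from scipy.stats import expon

lam = 2.0
print(expon(scale=1/lam).mean())   # 0.5 = 1/λ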
Expectation of Normal Distribution
E(X) = µ
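-a Monte Carlo check that the sample mean of normal draws is close to µ (the values of µ, σ, and the sample size are arbitrary assumptions):

import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.7, 0.4
print(rng.normal(mu, sigma, size=1_000_000).mean())   # ~1.7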
Expectations of Functions of Random Variables
Theorem A - Discrete Case
-let g be a fixed function
E(g(X)) = Σ g(xi) p(xi)
-with the limitation Σ |g(xi)| p(xi) < ∞
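-a minimal sketch of the discrete Theorem A, with g(x) = x² and a fair die as an assumed example:

import numpy as np

x = np.arange(1, 7)
p = np.full(6, 1/6)
g = lambda t: t**2
print(np.sum(g(x) * p))   # E(X²) = 91/6 ≈ 15.17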
Expectations of Functions of Random Variables
Theorem A - Continuous Case
-let g be a fixed function
E(g(X)) = ∫ g(x)*f(x) dx
-with the limitation ∫ |g(x)| f(x) dx < ∞
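-a sketch of the continuous Theorem A, with g(x) = x² and the standard exponential density as an assumed example:

import numpy as np
from scipy.integrate import quad

f = lambda x: np.exp(-x)
g = lambda x: x**2
EgX, _ = quad(lambda x: g(x) * f(x), 0, np.inf)
print(EgX)   # E(X²) = 2 for an Exp(1) variable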
Expectations of Functions of Random Variables
Theorem B - Discrete Case
-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = Σ g(x1,…,xn) p(x1,…,xn)
-with:
Σ |g(x1,…,xn)| p(x1,…,xn) < ∞
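-a sketch of the discrete Theorem B: two independent fair dice with g(x1,x2) = x1*x2, so p(x1,x2) = 1/36 (assumed example):

vals = range(1, 7)
EY = sum(x1 * x2 * (1/36) for x1 in vals for x2 in vals)
print(EY)   # 12.25 = 3.5 * 3.5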
Expectations of Functions of Random Variables
Theorem B - Continuous Case
-suppose X1,X2,…,Xn are jointly distributed random variables, let Y=g(X1,…,Xn)
E(Y) = ∫…∫ g(x1,…,xn) f(x1,…,xn) dx1….dxn
-with:
∫…∫ |g(x1,…,xn)| f(x1,…,xn) dx1…dxn < ∞
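-a sketch of the continuous Theorem B: independent Uniform(0,1) variables with g(x1,x2) = x1*x2, so f(x1,x2) = 1 on the unit square (assumed example):

from scipy.integrate import dblquad

g = lambda x1, x2: x1 * x2
f = lambda x1, x2: 1.0
# dblquad integrates over the inner variable first; its integrand takes (x2, x1)
EY, _ = dblquad(lambda x2, x1: g(x1, x2) * f(x1, x2), 0, 1, 0, 1)
print(EY)   # 0.25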
Expectations of Functions of Random Variables
Theorem C
-suppose X1,…,Xn are jointly distributed random variables with expectations E(Xi), and let Y = a + Σ biXi
-then:
E(Y) = a + Σ biE(Xi)
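-a simulation check of linearity of expectation; the constants a, bi and the distributions of X1, X2 are arbitrary assumptions:

import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
X1 = rng.exponential(scale=2.0, size=n)        # E(X1) = 2
X2 = rng.normal(loc=-1.0, scale=3.0, size=n)   # E(X2) = -1
a, b1, b2 = 5.0, 0.5, 2.0
Y = a + b1 * X1 + b2 * X2
print(Y.mean(), a + b1 * 2.0 + b2 * (-1.0))    # both ~4.0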
Variance
Definition
-the variance of a random variable X is defined as:
Var(X) = E[(X - E(X))²]
Standard Deviation
Definition
σ = √Var(X)
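-variance and standard deviation computed straight from the definitions, for a fair die as an assumed example:

import numpy as np

x = np.arange(1, 7)
p = np.full(6, 1/6)
EX = np.sum(x * p)
var = np.sum((x - EX)**2 * p)   # E[(X - E(X))²]
print(var, np.sqrt(var))        # 35/12 ≈ 2.917, σ ≈ 1.708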
Variance
Y= a + bX Theorem
-if Y = a + bX, then:
Var(Y) = b²Var(X)
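-a simulation check of Var(a + bX) = b²Var(X); the values of a, b and the distribution of X are assumptions:

import numpy as np

rng = np.random.default_rng(2)
X = rng.exponential(scale=2.0, size=1_000_000)   # Var(X) = 4
a, b = 10.0, -3.0
print(np.var(a + b * X), b**2 * np.var(X))       # both ~36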
Variance
Alternative Form Theorem
-the variance of X, if it exists, may also be computed as:
Var(X) = E(X²) - (E(X))²
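-a check that the shortcut agrees with the definition, again using a fair die as an assumed example:

import numpy as np

x = np.arange(1, 7)
p = np.full(6, 1/6)
EX, EX2 = np.sum(x * p), np.sum(x**2 * p)
print(EX2 - EX**2, np.sum((x - EX)**2 * p))   # both 35/12 ≈ 2.917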
Covariance
Definition
-for jointly distributed random variables X and Y, we define the covariance of X and Y as:
Cov(X,Y) = E[(X - E(X))(Y - E(Y))]
-which simplifies to:
Cov(X,Y) = E(XY) - E(X)E(Y)
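-a simulation sketch comparing E(XY) - E(X)E(Y) with numpy's sample covariance; the relationship Y = 2X + noise is an assumed example with Cov(X,Y) = 2:

import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=1_000_000)
Y = 2 * X + rng.normal(size=1_000_000)
print(np.mean(X * Y) - X.mean() * Y.mean())   # ~2.0
print(np.cov(X, Y)[0, 1])                     # ~2.0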