Week 5 notes Flashcards
How is the expected value of a random variable defined?
If X is a discrete random variable with possible values x1, x2, …, xi, … , and probability mass function p(x), then the expected value (or mean) of X is given by
μ = μ_X = E[X] = Σ_i xi * p(xi)
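A quick sketch of the definition in Python (the fair six-sided die pmf is an assumed example, not from the notes):

```python
from fractions import Fraction

# Assumed example: X = outcome of a fair six-sided die, p(x) = 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] = sum of xi * p(xi) over all possible values.
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 7/2
```

Using Fraction keeps the arithmetic exact, so the result is 7/2 rather than 3.4999….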
How do we find the expected value of a function of a random variable?
Theorem 1: Let X be a random variable and let g be a real-valued function. If X is a discrete random variable with possible values x1, x2, … , xi, … , and probability mass function p(x), then the expected value of Y = g(X) is given by:
E[g(X)] = Σ_i g(xi) * p(xi)
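Theorem 1 says we can average g over the original pmf without first finding the pmf of g(X). A minimal sketch, again assuming a fair-die pmf and picking g(x) = x² as an arbitrary example:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def g(x):
    return x ** 2  # an arbitrary real-valued function

# Theorem 1: E[g(X)] = sum of g(xi) * p(xi) -- no pmf of g(X) needed.
e_g = sum(g(x) * p for x, p in pmf.items())
print(e_g)  # 91/6
```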
How do we find the expected value of a linear function of a random variable?
A special case of Theorem 1: Let X be a random variable. If g is a linear function, i.e., g(x) = ax+b, then:
E[g(X)] = E[aX+b] = a*E[X] + b
Linearity of Expectation
Let X be a random variable, c, c1, c2 constants, and g, g1, g2 functions. Then the expectation E[·] satisfies the following:
1: E[c] = c
2: E[cg(X)] = cE[g(X)]
3: E[c1g1(X) + c2g2(X)] = c1E[g1(X)] + c2E[g2(X)]
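Property 3 can be checked numerically. A sketch assuming the fair-die pmf and arbitrary choices of c1, c2, g1, g2:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def E(g):
    """Expectation of g(X) via Theorem 1."""
    return sum(g(x) * p for x, p in pmf.items())

c1, c2 = 3, -2                 # arbitrary constants
g1 = lambda x: x               # arbitrary functions
g2 = lambda x: x * x

# Property 3: E[c1*g1(X) + c2*g2(X)] = c1*E[g1(X)] + c2*E[g2(X)]
lhs = E(lambda x: c1 * g1(x) + c2 * g2(x))
rhs = c1 * E(g1) + c2 * E(g2)
assert lhs == rhs
```

Setting c2 = 0 recovers property 2, and taking g constant recovers property 1.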
How is the variance of a random variable defined?
The variance of a random variable X is given by:
σ^2 = Var(X) = E[(X - μ)^2] = E[(X - E[X])^2]
where μ denotes the expected value of X. The standard deviation of X is given by:
σ = SD(X) = sqrt(Var(X))
Forms of the equation for Variance
Theorem 2
Var(X) = Σ_i (xi - μ)^2 * p(xi)
Var(X) = E[X^2] - μ^2
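The two forms in Theorem 2 give the same number, which is easy to confirm on a small pmf (fair die assumed as the example):

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

mu = sum(x * p for x, p in pmf.items())  # E[X] = 7/2

# Definitional form: Var(X) = E[(X - mu)^2]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())

# Shortcut form: Var(X) = E[X^2] - mu^2
e_x2 = sum(x ** 2 * p for x, p in pmf.items())
var_short = e_x2 - mu ** 2

assert var_def == var_short  # both give 35/12
```

The shortcut form is usually less work by hand, since E[X^2] avoids subtracting μ inside every term.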
Variance of a linear function (note: variance is not linear; the shift b drops out and a is squared)
Let X be a random variable, and a, b be constants. Then the following holds:
Var(aX+b) = (a^2)Var(X)
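A numeric check of Var(aX + b) = a²·Var(X), assuming the fair-die pmf and arbitrary a, b:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def var(pm):
    mu = sum(x * p for x, p in pm.items())
    return sum((x - mu) ** 2 * p for x, p in pm.items())

a, b = 3, 5  # arbitrary constants
# pmf of Y = aX + b: each probability moves with its transformed value
pmf_y = {a * x + b: p for x, p in pmf.items()}

assert var(pmf_y) == a ** 2 * var(pmf)  # the shift b has no effect
```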
How is the rth moment of a random variable defined?
The rth moment of a random variable X is given by:
E[X^r]
How is the rth central moment of a random variable defined?
The rth central moment of a random variable X is given by:
E[(X - μ)^r]
where μ = E[X]
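Moments and central moments are just expectations of powers, so both drop out of Theorem 1. A sketch on the assumed fair-die pmf:

```python
from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}  # fair die (assumed example)

def moment(r):
    """rth moment: E[X^r]."""
    return sum(x ** r * p for x, p in pmf.items())

def central_moment(r):
    """rth central moment: E[(X - mu)^r], with mu = E[X]."""
    mu = moment(1)
    return sum((x - mu) ** r * p for x, p in pmf.items())

print(moment(1))          # 7/2   (the first moment is the mean)
print(central_moment(2))  # 35/12 (the second central moment is the variance)
print(central_moment(1))  # 0     (the first central moment is always 0)
```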
How is the moment-generating function of a random variable defined?
The moment-generating function (mgf) of a random variable X is given by:
M_X (t) = E[e^(tX)], where t is a real number
Theorem 4
If random variable X has mgf M_X (t), then
M_X^(r) (0) = E[X^r]
In other words, the rth derivative of the mgf, evaluated at t = 0, equals the rth moment of X
For a discrete random variable, applying Theorem 1 with g(x) = e^(tx) gives:
M_X (t) = E[e^(tX)] = Σ_i e^(t*xi) * p(xi)
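Theorem 4 can be checked numerically: differentiate the mgf at t = 0 with central differences and compare against the moments. A sketch assuming the fair-die pmf (floats here, since exp() is involved):

```python
import math

values = range(1, 7)  # fair die (assumed example)
p = 1 / 6

def M(t):
    """mgf of a discrete X: M_X(t) = sum of e^(t*xi) * p(xi)."""
    return sum(math.exp(t * x) * p for x in values)

h = 1e-4
# Central-difference approximations of M'(0) and M''(0)
m1 = (M(h) - M(-h)) / (2 * h)           # should approach E[X]   = 3.5
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # should approach E[X^2] = 91/6

print(m1, m2)  # approximately 3.5 and 15.1667
```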
The mgf of a linear function
Let X be a random variable with mgf M_X (t), and let a, b be constants. If random variable Y = aX + b, then the mgf of Y is given by
M_Y (t) = e^(bt) * M_X (at)
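The identity M_Y(t) = e^(bt)·M_X(at) follows from factoring e^(t(ax+b)) = e^(bt)·e^((at)x), and it checks out numerically (fair-die pmf and a, b assumed as examples):

```python
import math

values = range(1, 7)  # fair die (assumed example)
p = 1 / 6

def M_X(t):
    return sum(math.exp(t * x) * p for x in values)

a, b = 2, 3  # arbitrary constants

def M_Y(t):
    # mgf of Y = aX + b, computed directly from Y's values
    return sum(math.exp(t * (a * x + b)) * p for x in values)

t = 0.7  # arbitrary test point
assert math.isclose(M_Y(t), math.exp(b * t) * M_X(a * t))
```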
Theorem 6
If X1, … , Xn are independent random variables with mgf's M_X1 (t), … , M_Xn (t), respectively, then the mgf of random variable Y = X1 + … + Xn is given by:
M_Y (t) = M_X1 (t) * M_X2 (t) * … * M_Xn (t)
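A check of Theorem 6 with n = 2: build the pmf of the sum of two independent fair dice by convolution (an assumed example) and compare its mgf against the product of the individual mgf's:

```python
import math

values = range(1, 7)  # one fair die (assumed example)
p = 1 / 6

def M(t):
    """mgf of a single fair die."""
    return sum(math.exp(t * x) * p for x in values)

# pmf of Y = X1 + X2, summing over the independent joint outcomes
pmf_y = {}
for x1 in values:
    for x2 in values:
        pmf_y[x1 + x2] = pmf_y.get(x1 + x2, 0) + p * p

def M_Y(t):
    return sum(math.exp(t * y) * q for y, q in pmf_y.items())

t = 0.3  # arbitrary test point
assert math.isclose(M_Y(t), M(t) * M(t))
```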
Theorem 7
The mgf M_X (t) of random variable X uniquely determines the probability distribution of X. In other words, if random variables X and Y have the same mgf, M_X (t) = M_Y (t), then X and Y have the same probability distribution.