Week 5 notes Flashcards

1
Q

How is the expected value of a random variable defined?

A

If X is a discrete random variable with possible values x1, x2, …, xi, … , and probability mass function p(x), then the expected value (or mean) of X is given by
μ = μ_X = E[X] = Σ_i xi · p(xi)
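As a quick sketch of the definition (the fair-die distribution here is an illustration, not from the notes), E[X] can be computed directly from a pmf in Python:

```python
# Expected value of a fair six-sided die via E[X] = Σ_i xi * p(xi).
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}  # uniform pmf: p(x) = 1/6

mean = sum(x * pmf[x] for x in values)
print(mean)  # 3.5
```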

2
Q

How do we find the expected value of a function of a random variable?

A

Theorem 1: Let X be a random variable and let g be a real-valued function. If X is a discrete random variable with possible values x1, x2, …, xi, …, and probability mass function p(x), then the expected value of Y = g(X) is given by:
E[g(X)] = Σ_i g(xi) · p(xi)
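A minimal check of Theorem 1, again using a fair die and g(x) = x² as an illustrative choice (neither is from the notes):

```python
# E[g(X)] = Σ_i g(xi) * p(xi) with g(x) = x^2 on a fair die.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

def g(x):
    return x ** 2

e_g = sum(g(x) * pmf[x] for x in values)
print(e_g)  # 91/6 ≈ 15.1667
```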

3
Q

How do we find the expected value of a linear function of a random variable?

A

A special case of Theorem 1: Let X be a random variable. If g is a linear function, i.e., g(x) = ax+b, then:
E[g(X)] = E[aX+b] = a*E[X] + b
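The special case can be verified numerically; the die distribution and the constants a = 2, b = 3 are illustrative assumptions:

```python
# Check E[aX + b] = a*E[X] + b on a fair die with a = 2, b = 3.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

a, b = 2, 3
direct = sum((a * x + b) * pmf[x] for x in values)  # E[aX + b] from the pmf
mean = sum(x * pmf[x] for x in values)              # E[X]
shortcut = a * mean + b                             # a*E[X] + b
print(direct, shortcut)  # both 10.0
```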

4
Q

Linearity of Expectation

A

Let X be a random variable, c, c1, c2 constants, and g, g1, g2 real-valued functions. Then the expectation E[·] satisfies the following:

1: E[c] = c
2: E[cg(X)] = cE[g(X)]
3: E[c1g1(X) + c2g2(X)] = c1E[g1(X)] + c2E[g2(X)]
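Property 3 can be sketched with an illustrative choice of g1(x) = x², g2(x) = x, c1 = 2, c2 = 3 on a fair die (none of these specifics are from the notes):

```python
# Check E[c1*g1(X) + c2*g2(X)] = c1*E[g1(X)] + c2*E[g2(X)] on a fair die.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}
c1, c2 = 2, 3

lhs = sum((c1 * x**2 + c2 * x) * pmf[x] for x in values)
rhs = (c1 * sum(x**2 * pmf[x] for x in values)
       + c2 * sum(x * pmf[x] for x in values))
print(lhs, rhs)  # equal (up to float rounding)
```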

5
Q

How is the variance of a random variable defined?

A

The variance of a random variable X is given by:
σ² = Var(X) = E[(X − μ)²] = E[(X − E[X])²]
where μ denotes the expected value of X. The standard deviation of X is given by:
σ = SD(X) = √(Var(X))
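A sketch of the definition on the same illustrative fair-die distribution:

```python
# Variance of a fair die via the definition Var(X) = E[(X - μ)^2].
import math

values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

mu = sum(x * pmf[x] for x in values)               # E[X] = 3.5
var = sum((x - mu) ** 2 * pmf[x] for x in values)  # E[(X - μ)^2]
sd = math.sqrt(var)                                # SD(X) = sqrt(Var(X))
print(var)  # 35/12 ≈ 2.9167
```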

6
Q

Forms of the equation for Variance

Theorem 2

A
Var(X) = Σ_i (xi − μ)² · p(xi)
Var(X) = E[X²] − μ²
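The two forms of Theorem 2 can be checked against each other on an illustrative fair die:

```python
# Check that Σ (xi - μ)^2 p(xi) and E[X^2] - μ^2 agree on a fair die.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

mu = sum(x * pmf[x] for x in values)
form1 = sum((x - mu) ** 2 * pmf[x] for x in values)  # definition form
form2 = sum(x**2 * pmf[x] for x in values) - mu**2   # shortcut form
print(form1, form2)  # both 35/12 ≈ 2.9167
```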
7
Q

Linearity of Variance

A

Let X be a random variable, and a, b be constants. Then the following holds:
Var(aX+b) = (a^2)Var(X)
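A sketch of this rule, with an illustrative fair die and a = 2, b = 3 (note the b drops out, since shifting a distribution does not change its spread):

```python
# Check Var(aX + b) = a^2 * Var(X) on a fair die with a = 2, b = 3.
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}
a, b = 2, 3

def variance(vals, p):
    mu = sum(x * p[x] for x in vals)
    return sum((x - mu) ** 2 * p[x] for x in vals)

var_x = variance(values, pmf)
# pmf of Y = aX + b: same probabilities, placed on the transformed values
y_pmf = {a * x + b: pmf[x] for x in values}
var_y = variance(list(y_pmf), y_pmf)
print(var_y, a**2 * var_x)  # both 35/3 ≈ 11.667
```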

8
Q

How is the rth moment of a random variable defined?

A

The rth moment of a random variable X is given by:

E[X^r]

9
Q

How is the rth central moment of a random variable defined?

A

The rth central moment of a random variable X is given by:
E[(X-mu)^r]
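Two sanity checks on an illustrative fair die: the 1st central moment is always 0, and the 2nd central moment is exactly Var(X):

```python
# Central moments of a fair die: E[(X - μ)^r].
values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}
mu = sum(x * pmf[x] for x in values)

def central_moment(r):
    return sum((x - mu) ** r * pmf[x] for x in values)

print(central_moment(1))  # ≈ 0 (always, for any distribution)
print(central_moment(2))  # 35/12 ≈ 2.9167, the variance
```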
where mu = E[X]

10
Q

How is the moment-generating function of a random variable defined?

A

The moment-generating function (mgf) of a random variable X is given by:
M_X(t) = E[e^(tX)], where t is a real number
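As an illustration (the Bernoulli distribution and p = 0.3 are assumptions, not from the notes), the mgf of a Bernoulli(p) variable is (1 − p) + p·e^t, and M_X(0) = 1 for any random variable:

```python
# mgf of a Bernoulli(p) random variable: M_X(t) = E[e^(tX)] = (1-p) + p*e^t.
import math

p = 0.3

def mgf(t):
    # X takes value 0 with probability 1-p and value 1 with probability p
    return (1 - p) * math.exp(t * 0) + p * math.exp(t * 1)

print(mgf(0.7), (1 - p) + p * math.exp(0.7))  # equal
print(mgf(0.0))  # 1.0 — E[e^(0·X)] = E[1] = 1 for any X
```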

11
Q

Theorem 4

A

If random variable X has mgf M_X(t), then
M_X^(r)(0) = E[X^r]
In other words, the rth derivative of the mgf evaluated at t = 0 gives the value of the rth moment.
For a discrete random variable, the mgf itself is computed as
M_X(t) = E[e^(tX)] = Σ_i e^(t·xi) · p(xi)
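Theorem 4 can be checked numerically on an illustrative fair die, approximating the derivatives of the mgf at t = 0 by finite differences (a rough sketch, not an exact computation):

```python
# Check M'(0) ≈ E[X] and M''(0) ≈ E[X^2] for a fair die via finite differences.
import math

values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}

def mgf(t):
    return sum(math.exp(t * x) * pmf[x] for x in values)

h = 1e-4
first = (mgf(h) - mgf(-h)) / (2 * h)             # ≈ M'(0) = E[X] = 3.5
second = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ≈ M''(0) = E[X^2] = 91/6
```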

12
Q

The linearity of the mgf

A

Let X be a random variable with mgf M_X (t), and let a, b be constants. If random variable Y = aX + b, then the mgf of Y is given by
M_Y(t) = e^(bt) · M_X(at)
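A sketch of the rule, with an illustrative fair die and a = 2, b = 3: the mgf of Y = aX + b computed directly from the pmf should match e^(bt)·M_X(at).

```python
# Check M_Y(t) = e^(bt) * M_X(at) for Y = 2X + 3 on a fair die.
import math

values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}
a, b, t = 2, 3, 0.1

def mgf_x(s):
    return sum(math.exp(s * x) * pmf[x] for x in values)

mgf_y_direct = sum(math.exp(t * (a * x + b)) * pmf[x] for x in values)
mgf_y_formula = math.exp(b * t) * mgf_x(a * t)
print(mgf_y_direct, mgf_y_formula)  # equal (up to float rounding)
```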

13
Q

Theorem 6

A

If X1, … , Xn are independent random variables with mgf’s M_X1(t), … , M_Xn(t), respectively, then the mgf of random variable Y = X1 + … + Xn is given by:
M_Y(t) = M_X1(t) · … · M_Xn(t)
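Theorem 6 can be sketched for the sum of two independent fair dice (an illustrative choice): the mgf of Y = X1 + X2, computed directly from the joint pmf, equals the product of the two individual mgfs.

```python
# Check that the mgf of the sum of two independent dice is the product of mgfs.
import math
from itertools import product

values = [1, 2, 3, 4, 5, 6]
pmf = {x: 1 / 6 for x in values}
t = 0.2

mgf_x = sum(math.exp(t * x) * pmf[x] for x in values)
# mgf of the sum, computed directly from the joint pmf (independence: p(x1)p(x2))
mgf_sum = sum(math.exp(t * (x1 + x2)) * pmf[x1] * pmf[x2]
              for x1, x2 in product(values, values))
print(mgf_sum, mgf_x ** 2)  # equal (up to float rounding)
```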

14
Q

Theorem 7

A

The mgf M_X(t) of random variable X uniquely determines the probability distribution of X. In other words, if random variables X and Y have the same mgf, M_X(t) = M_Y(t), then X and Y have the same probability distribution.
