Linear combinations of random variables Flashcards

1
Q

Mean of a function of a random variable, E(X) = μ

A

Two or more samples are often combined in some way to give a composite sample. The expected values of such combinations have predictable properties.

E(a) = a
The mean of a constant a is a.
E(aX) = a × E(X)
If each value in a probability distribution is multiplied by a,
the mean of the distribution is multiplied by a factor of a.
E(X + b) = E(X) + b
If a constant b is added to (or subtracted from) each value
in a probability distribution, the mean of the distribution is
increased (or decreased) by b.
∴ E(aX + b) = a × E(X) + b

2
Q

Variance of a function of a random variable, Var(X) = σ²

A

We have seen how the expected value of aX + b is related to the expected value of X: the
result is E(aX + b) = a × E(X) + b. There is a similar result for the variance of aX + b.

π‘‰π‘Žπ‘Ÿ(π‘Ž) = 0
The variance of a constant π‘Ž is 0.
𝑆𝑑(π‘Ž) = 0
The standard deviation of a constant π‘Ž is 0.
π‘‰π‘Žπ‘Ÿ(π‘Žπ‘‹) = π‘ŽΒ² Γ— π‘‰π‘Žπ‘Ÿ(𝑋)
If each value in a probability distribution is multiplied by π‘Ž the variance of distribution will be
multiplied by a factor ofπ‘ŽΒ²

.
𝑆𝑑(π‘Žπ‘‹) = |π‘Ž| Γ— 𝑆𝑑(𝑋)
If each value in a probability distribution is multiplied by π‘Žthe standard deviation of distribution
will be multiplied by a factor ofΘπ‘ŽΘ.
π‘‰π‘Žπ‘Ÿ(𝑋 + 𝑏) = π‘‰π‘Žπ‘Ÿ(𝑋)
If a constant value 𝑏 is added or subtracted from each value in probability distribution, the
variance of the distribution will be unchanged.
𝑆𝑑(𝑋 + 𝑏) = 𝑆𝑑(𝑋)
If a constant value 𝑏 is added or subtracted from each value in probability distribution, the
variance of the distribution will be unchanged.
∴ π‘‰π‘Žπ‘Ÿ(π‘Žπ‘‹ + 𝑏) = π‘Ž
2 Γ— π‘‰π‘Žπ‘Ÿ(𝑋)
∴ 𝑆𝑑(π‘Žπ‘‹ + 𝑏) = |π‘Ž| Γ— 𝑆𝑑(𝑋)

Note: π‘‰π‘Žπ‘Ÿ(𝑋) = 𝐸(𝑋²) – [E(X)]Β²
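The note's formula Var(X) = E(X²) − [E(X)]² gives a direct way to check Var(aX + b) = a² × Var(X) numerically. A minimal sketch, again using a made-up example distribution:

```python
# Check Var(aX + b) = a² × Var(X), computing variances via
# Var(X) = E(X²) − [E(X)]². The pmf values are assumed examples.

pmf = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}   # P(X = x); probabilities sum to 1

def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E(X²) − [E(X)]²."""
    ex2 = sum(x * x * p for x, p in pmf.items())
    return ex2 - expectation(pmf) ** 2

a, b = -3, 7
# Transform every outcome: Y = aX + b; probabilities unchanged.
shifted_scaled = {a * x + b: p for x, p in pmf.items()}

# The shift b drops out; the scale a enters squared (so the sign of a
# does not matter, matching Sd(aX) = |a| × Sd(X)).
assert abs(variance(shifted_scaled) - a ** 2 * variance(pmf)) < 1e-9
```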

3
Q

Sum and difference of independent random variables

A

If X and Y are two independent random variables, the expectation of the sum (or difference) of X and Y is the sum (or difference) of their individual expectations.

If X and Y are two independent random variables, the variance of both the sum and the difference of X and Y is the sum of their individual variances.

For T = X + Y
E(T) = E(X + Y) = E(X) + E(Y)
Var(T) = Var(X + Y) = Var(X) + Var(Y)

For T = X - Y
E(T) = E(X - Y) = E(X) - E(Y)
Var(T) = Var(X - Y) = Var(X) + Var(Y)
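These results can be verified by enumerating the joint distribution of two independent variables. This is a sketch with made-up distributions (a coin and a die), not part of the flashcards; note especially that the variances add even for the difference X − Y.

```python
# For independent discrete X and Y, build the distribution of T = X − Y
# from the joint probabilities and check E and Var against the rules.

from itertools import product

px = {0: 0.5, 1: 0.5}                                    # assumed: fair coin
py = {1: 1/6, 2: 1/6, 3: 1/6, 4: 1/6, 5: 1/6, 6: 1/6}   # assumed: fair die

def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    return sum(x * x * p for x, p in pmf.items()) - expectation(pmf) ** 2

# Independence: P(X = x and Y = y) = P(X = x) × P(Y = y).
diff = {}
for (x, p), (y, q) in product(px.items(), py.items()):
    diff[x - y] = diff.get(x - y, 0) + p * q

assert abs(expectation(diff) - (expectation(px) - expectation(py))) < 1e-9
# Variances ADD for the difference, they are not subtracted:
assert abs(variance(diff) - (variance(px) + variance(py))) < 1e-9
```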

4
Q

Linear combination of random variables

A

For Z = aX + bY:

For any random variables X and Y (independent or not):
E(aX + bY) = a × E(X) + b × E(Y)

For independent random variables X and Y:
Var(aX + bY) = a² × Var(X) + b² × Var(Y)
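The linear-combination rules can be checked the same way, by enumerating the joint distribution of two independent variables. A minimal sketch; the distributions and the constants a, b are assumed examples.

```python
# Build the distribution of Z = aX + bY for independent X, Y and check
# E(aX + bY) = a·E(X) + b·E(Y) and Var(aX + bY) = a²·Var(X) + b²·Var(Y).

from itertools import product

px = {1: 0.3, 2: 0.7}          # assumed example distributions
py = {0: 0.4, 5: 0.6}

def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    return sum(x * x * p for x, p in pmf.items()) - expectation(pmf) ** 2

a, b = 2, -1
z = {}
for (x, p), (y, q) in product(px.items(), py.items()):
    # Independence: joint probability is the product of the marginals.
    z[a * x + b * y] = z.get(a * x + b * y, 0) + p * q

assert abs(expectation(z) - (a * expectation(px) + b * expectation(py))) < 1e-9
# The coefficients enter the variance squared, so b = −1 still adds Var(Y):
assert abs(variance(z) - (a**2 * variance(px) + b**2 * variance(py))) < 1e-9
```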

5
Q

Linear functions and combinations of normally distributed random variables

A

Linear combinations of independent normal variables are also normally distributed. If X ~ N(μₓ, σₓ²) and Y ~ N(μᵧ, σᵧ²) are independent, then aX + bY ~ N(aμₓ + bμᵧ, a²σₓ² + b²σᵧ²).
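A seeded simulation illustrates the mean and variance of such a combination. This is a sketch with assumed example parameters, not part of the flashcards; it checks the moments numerically (it does not prove normality).

```python
# Simulate Z = aX + bY for independent normals X, Y and compare the sample
# mean and variance with the theoretical N(a·μx + b·μy, a²·σx² + b²·σy²).

import random
import statistics

random.seed(0)                       # deterministic run

mu_x, sigma_x = 50.0, 4.0            # assumed: X ~ N(50, 4²)
mu_y, sigma_y = 20.0, 3.0            # assumed: Y ~ N(20, 3²)
a, b = 1.0, -2.0                     # Z = X − 2Y

mean_z = a * mu_x + b * mu_y                      # theory: 10.0
var_z = a**2 * sigma_x**2 + b**2 * sigma_y**2     # theory: 52.0

samples = [a * random.gauss(mu_x, sigma_x) + b * random.gauss(mu_y, sigma_y)
           for _ in range(200_000)]

# Sample moments should sit close to the theoretical values.
assert abs(statistics.fmean(samples) - mean_z) < 0.2
assert abs(statistics.pvariance(samples) - var_z) < 2.0
```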

6
Q

The distribution of the sum of 2 independent Poisson variables

A

If you have two independent Poisson variables,
X ~ Po(λₓ) and Y ~ Po(λᵧ), then
E(X + Y) = E(X) + E(Y) = λₓ + λᵧ

Since for Poisson variables the variance equals the mean,
Var(X + Y) = Var(X) + Var(Y) = λₓ + λᵧ

The mean and variance of X + Y are equal, consistent with a Poisson distribution; indeed, given that X and Y are independent, X + Y ~ Po(λₓ + λᵧ).

Note:
A linear combination of independent Poisson variables of the form aX + bY (with a and b not both equal to 1) does not, in general, have a Poisson distribution.
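The result for the sum can be checked numerically: for independent X and Y, P(X + Y = n) is the convolution of their pmfs, and it matches the Po(λₓ + λᵧ) pmf term by term. A minimal sketch with assumed example rates:

```python
# Check that the convolution of Po(λx) and Po(λy) equals the pmf of
# Po(λx + λy) for the first few values of n. Rates are assumed examples.

from math import exp, factorial

def poisson_pmf(lam, k):
    """P(K = k) = e^(−λ) · λ^k / k!"""
    return exp(-lam) * lam ** k / factorial(k)

lam_x, lam_y = 2.0, 3.5

for n in range(10):
    # P(X + Y = n): sum over all ways to split n between X and Y,
    # multiplying the marginal probabilities (independence).
    conv = sum(poisson_pmf(lam_x, k) * poisson_pmf(lam_y, n - k)
               for k in range(n + 1))
    assert abs(conv - poisson_pmf(lam_x + lam_y, n)) < 1e-12
```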
