Linear combinations of random variables Flashcards
Mean of a function of a random variable: E(X) = μ
Two or more samples are often combined in some way to give a composite sample. The expected values for such samples have predictable properties.
E(a) = a
The mean value of a constant a is a.
E(aX) = a × E(X)
If each value in a probability distribution is multiplied by a,
the mean of the distribution will be multiplied by a factor of a.
E(X + b) = E(X) + b
If a constant value b is added to or subtracted from each value
in a probability distribution, the mean of the distribution will be
increased or decreased by b.
∴ E(aX + b) = a × E(X) + b
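The rule E(aX + b) = a × E(X) + b can be checked exactly on a small discrete distribution. A minimal Python sketch, where the pmf and the constants a, b are made up for illustration:

```python
# Sketch: verifying E(aX + b) = a*E(X) + b on a small discrete
# distribution (the pmf values and constants here are illustrative).

def expectation(pmf):
    """Expected value of a discrete random variable given as {value: prob}."""
    return sum(x * p for x, p in pmf.items())

pmf_X = {1: 0.2, 2: 0.5, 3: 0.3}   # hypothetical distribution of X
a, b = 4, 7

# Transforming each value x into a*x + b gives the pmf of aX + b;
# the probabilities are unchanged.
pmf_aXb = {a * x + b: p for x, p in pmf_X.items()}

lhs = expectation(pmf_aXb)          # E(aX + b) computed directly
rhs = a * expectation(pmf_X) + b    # a*E(X) + b from the rule
print(lhs, rhs)                     # both approximately 15.4
```

Transforming the values while keeping the probabilities is exactly what "multiplying each value by a and adding b" means for a probability distribution.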
Variance of a function of a random variable: Var(X) = σ²
We have seen how the expected value of aX + b is related to the expected value of X: the
result is E(aX + b) = a × E(X) + b. There is a similar result for the variance of aX + b.
Var(a) = 0
The variance of a constant a is 0.
SD(a) = 0
The standard deviation of a constant a is 0.
Var(aX) = a² × Var(X)
If each value in a probability distribution is multiplied by a, the variance of the distribution
will be multiplied by a factor of a².
SD(aX) = |a| × SD(X)
If each value in a probability distribution is multiplied by a, the standard deviation of the
distribution will be multiplied by a factor of |a|.
Var(X + b) = Var(X)
If a constant value b is added to or subtracted from each value in a probability distribution,
the variance of the distribution will be unchanged.
SD(X + b) = SD(X)
If a constant value b is added to or subtracted from each value in a probability distribution,
the standard deviation of the distribution will be unchanged.
∴ Var(aX + b) = a² × Var(X)
∴ SD(aX + b) = |a| × SD(X)
Note: Var(X) = E(X²) − [E(X)]²
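The variance rules can be verified the same way, using the computational formula Var(X) = E(X²) − [E(X)]² from the note. A Python sketch with an illustrative pmf:

```python
# Sketch: checking Var(aX + b) = a^2 * Var(X), using the computational
# formula Var(X) = E(X^2) - [E(X)]^2 (the pmf and constants are illustrative).

def expectation(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    mean = expectation(pmf)
    return sum(x * x * p for x, p in pmf.items()) - mean ** 2

pmf_X = {0: 0.25, 1: 0.5, 2: 0.25}   # hypothetical distribution of X
a, b = 3, 10

pmf_aXb = {a * x + b: p for x, p in pmf_X.items()}

print(variance(pmf_X))       # 0.5
print(variance(pmf_aXb))     # 4.5 = 3^2 * 0.5; adding b changes nothing
```

Note how b shifts every value equally, so the spread, and hence the variance, is untouched; only the factor a matters, and it enters squared.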
Sum and difference of independent random variables
If X and Y are two independent random variables, the expectation of the sum (or difference) of X and Y is the sum (or difference) of the two individual expectations.
If X and Y are two independent random variables, the variance of both the sum and the difference of X and Y is the sum of the two individual variances.
For T = X + Y
E(T) = E(X + Y) = E(X) + E(Y)
Var(T) = Var(X + Y) = Var(X) + Var(Y)
For T = X - Y
E(T) = E(X - Y) = E(X) - E(Y)
Var(T) = Var(X - Y) = Var(X) + Var(Y)
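The counter-intuitive part is that Var(X − Y) still adds the variances. This can be checked exactly by building the joint distribution of two small independent variables; a Python sketch with made-up pmfs:

```python
# Sketch: for independent X, Y (illustrative pmfs), E(X - Y) = E(X) - E(Y)
# while Var(X - Y) = Var(X) + Var(Y) -- the variances still ADD.
from itertools import product

def expectation(pmf):
    return sum(v * p for v, p in pmf.items())

def variance(pmf):
    m = expectation(pmf)
    return sum(v * v * p for v, p in pmf.items()) - m ** 2

pmf_X = {1: 0.5, 3: 0.5}   # hypothetical: E(X) = 2, Var(X) = 1
pmf_Y = {0: 0.5, 2: 0.5}   # hypothetical: E(Y) = 1, Var(Y) = 1

def combine(pmf_a, pmf_b, op):
    """pmf of op(X, Y) under independence: joint prob = product of probs."""
    out = {}
    for (x, px), (y, py) in product(pmf_a.items(), pmf_b.items()):
        v = op(x, y)
        out[v] = out.get(v, 0) + px * py
    return out

diff = combine(pmf_X, pmf_Y, lambda x, y: x - y)
print(expectation(diff))  # 1.0 = E(X) - E(Y) = 2 - 1
print(variance(diff))     # 2.0 = Var(X) + Var(Y) = 1 + 1
```

Subtracting Y flips its values but not its spread, which is why the variance of the difference is a sum, never a difference.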
Linear combination of random variables
For Z = aX + bY:
For any random variables X and Y:
E(aX + bY) = a × E(X) + b × E(Y)
For independent random variables X and Y:
Var(aX + bY) = a² × Var(X) + b² × Var(Y)
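Both formulas can be checked exactly on small independent distributions; a Python sketch where the pmfs and the coefficients a, b are illustrative:

```python
# Sketch: checking E(aX + bY) and Var(aX + bY) exactly for small
# independent pmfs (the distributions and a, b are made up).
from itertools import product

pmf_X = {1: 0.5, 3: 0.5}   # hypothetical: E(X) = 2, Var(X) = 1
pmf_Y = {0: 0.5, 2: 0.5}   # hypothetical: E(Y) = 1, Var(Y) = 1
a, b = 2, 3

# pmf of Z = aX + bY, using independence: P(X=x, Y=y) = P(X=x) * P(Y=y).
pmf_Z = {}
for (x, px), (y, py) in product(pmf_X.items(), pmf_Y.items()):
    z = a * x + b * y
    pmf_Z[z] = pmf_Z.get(z, 0) + px * py

mean_Z = sum(z * p for z, p in pmf_Z.items())
var_Z = sum(z * z * p for z, p in pmf_Z.items()) - mean_Z ** 2

print(mean_Z)  # 7.0  = a*E(X) + b*E(Y) = 2*2 + 3*1
print(var_Z)   # 13.0 = a^2*Var(X) + b^2*Var(Y) = 4*1 + 9*1
```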
Linear functions and combinations of normally distributed random variables
Linear combinations of independent normal variables are also normally distributed.
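So if X ~ N(10, 4) and Y ~ N(6, 9) are independent, the earlier rules give 2X − Y ~ N(14, 25). A simulation sketch in Python, assuming the standard library only (note that random.gauss takes the standard deviation, not the variance):

```python
# Sketch: simulating Z = 2X - Y for independent normals X ~ N(10, 4),
# Y ~ N(6, 9); theory says Z ~ N(2*10 - 6, 2^2*4 + (-1)^2*9) = N(14, 25).
import random
import statistics

random.seed(0)          # fixed seed so the run is reproducible
n = 200_000
# random.gauss(mu, sigma) takes the STANDARD DEVIATION: sqrt(4)=2, sqrt(9)=3.
z = [2 * random.gauss(10, 2) - random.gauss(6, 3) for _ in range(n)]

print(statistics.mean(z))      # close to 14
print(statistics.variance(z))  # close to 25
```

The simulation only confirms the mean and variance; the shape being normal is the extra fact this section asserts.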
The distribution of the sum of 2 independent Poisson variables
If you have two Poisson variables,
X ~ Po(λₓ) and Y ~ Po(λᵧ), then
E(X + Y) = E(X) + E(Y) = λₓ + λᵧ
Since the variance of a Poisson variable equals its mean,
Var(X + Y) = Var(X) + Var(Y) = λₓ + λᵧ
The mean and variance of X + Y are equal, and provided X and Y are independent, X + Y ~ Po(λₓ + λᵧ).
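A quick numerical check of this result in Python, with made-up rates λₓ = 2 and λᵧ = 3: under independence the pmf of X + Y is the convolution of the two pmfs, and it should reproduce the Po(5) pmf term by term.

```python
# Sketch: convolving two (truncated) Poisson pmfs to check that
# X ~ Po(2), Y ~ Po(3) independent gives X + Y ~ Po(5). Rates illustrative.
from math import exp, factorial

def po_pmf(lam, kmax=80):
    return [exp(-lam) * lam**k / factorial(k) for k in range(kmax)]

px, py = po_pmf(2.0), po_pmf(3.0)
p5 = po_pmf(5.0)

# pmf of X + Y under independence: discrete convolution of px and py.
conv = [sum(px[i] * py[n - i] for i in range(n + 1)) for n in range(80)]

err = max(abs(c - t) for c, t in zip(conv, p5))
print(err)  # tiny: the convolution matches Po(5) term by term
```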
Note:
A linear combination of independent Poisson variables of the form aX + bY cannot itself have a Poisson distribution unless a = b = 1.
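The reason is the mean-equals-variance property: for X ~ Po(λ), E(2X) = 2λ but Var(2X) = 4λ, so the mean and variance of 2X differ and 2X cannot be Poisson. A Python sketch checking this on a truncated pmf (the rate and cutoff are illustrative):

```python
# Sketch: why aX (a != 1) breaks the Poisson property. For X ~ Po(lam),
# E(2X) = 2*lam but Var(2X) = 4*lam, so mean != variance.
from math import exp, factorial

lam = 3.0
# pmf of 2X: value 2k carries the Po(lam) probability of k (truncated at 60).
pmf_2X = {2 * k: exp(-lam) * lam**k / factorial(k) for k in range(60)}

mean = sum(v * p for v, p in pmf_2X.items())
var = sum(v * v * p for v, p in pmf_2X.items()) - mean ** 2

print(mean)  # ~ 6.0  (= 2*lam)
print(var)   # ~ 12.0 (= 4*lam, twice the mean)
```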