Topic 5 Multiple Variables Flashcards
What is the purpose of a joint distribution in a bivariate analysis?
To examine the relationship and combined behavior of two variables.
A joint distribution helps answer:
a) Whether variables exist
b) If E[X] = E[Y]
c) To what extent two variables are related
d) Which variable is more important
c) To what extent two variables are related
What is the formula for covariance?
Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]
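Both forms can be sanity-checked numerically. A minimal Python sketch using numpy; the simulated data and variable names are illustrative, not part of the card:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # y correlated with x by construction

# Definitional form: E[(X - E[X])(Y - E[Y])]
cov_def = np.mean((x - x.mean()) * (y - y.mean()))
# Shortcut form: E[XY] - E[X]E[Y]
cov_short = np.mean(x * y) - x.mean() * y.mean()
print(cov_def, cov_short)   # both ≈ 0.5, since Cov(X, 0.5X + Z) = 0.5·Var(X)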
True/False
If X = Y, then Cov(X, Y) = Var(X)
True
What does a positive covariance indicate?
That X and Y tend to vary in the same direction.
If Cov(X, Y) < 0, then:
a) X and Y increase together
b) X and Y are independent
c) X is likely less than E[X] when Y is greater than E[Y]
d) X and Y are always equal
c) X is likely less than E[X] when Y is greater than E[Y]
What is Cov(cX, Y)?
c · Cov(X, Y)
True/False
Cov(X, c) = 0 for any constant c
True
Cov(X + Y, Z) = Cov(X, Z) + _________.
Cov(Y, Z)
What does a covariance matrix represent?
It collects the variances of each variable (on the diagonal) and the pairwise covariances (off the diagonal) of multiple random variables.
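For concreteness, numpy's np.cov builds exactly this matrix from data. A minimal sketch; the three simulated variables are illustrative:

import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = x + rng.normal(size=50_000)   # dependent on x
z = rng.normal(size=50_000)       # independent of both

C = np.cov(np.stack([x, y, z]))   # rows are variables, columns are observations
# Diagonal ≈ Var(X), Var(Y), Var(Z); off-diagonal ≈ pairwise covariances
print(C.round(2))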
What is the formula for the correlation coefficient?
ρ(X, Y) = Cov(X, Y) / (σ_X · σ_Y)
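The formula can be checked against numpy's built-in np.corrcoef. A minimal sketch with illustrative data:

import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=100_000)
y = -2.0 * x + rng.normal(size=100_000)

cov_xy = np.mean((x - x.mean()) * (y - y.mean()))
rho_manual = cov_xy / (x.std() * y.std())   # Cov(X, Y) / (σ_X · σ_Y)
rho_numpy = np.corrcoef(x, y)[0, 1]
print(rho_manual, rho_numpy)   # agree, ≈ −0.894, and always within [−1, 1]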
True/False
If ρ(X, Y) = −1, then Y is a decreasing linear function of X.
True
If X and Y are uncorrelated, what is true about their variances?
a) They are equal
b) Var(X + Y) = Var(X) + Var(Y)
c) Cov(X, Y) = 1
d) E[X + Y] = 0
b) Var(X + Y) = Var(X) + Var(Y)
If X₁, X₂, …, Xₙ are pairwise uncorrelated, then Var(X₁ + … + Xₙ) = ____________.
Var(X₁) + Var(X₂) + … + Var(Xₙ)
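A quick simulation check of this additivity. A minimal sketch; the three independent samples and their scales are illustrative:

import numpy as np

rng = np.random.default_rng(3)
# Independent (hence pairwise uncorrelated) samples with variances 1, 4, 9
xs = [rng.normal(scale=s, size=200_000) for s in (1.0, 2.0, 3.0)]

var_of_sum = np.var(sum(xs))
sum_of_vars = sum(np.var(x) for x in xs)
print(var_of_sum, sum_of_vars)   # both ≈ 1 + 4 + 9 = 14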
True/False
Corr(X, Y) = 0 always implies X and Y are independent.
False
What is Pearson’s correlation coefficient?
A measure of linear dependence between two continuous variables.
A correlation coefficient of 0.975 indicates that the two variables are ________.
highly positively correlated
What are spurious correlations?
Correlations between two variables that are statistically related but not causally linked.
True/False
High correlation always means one variable causes the other.
False — correlation ≠ causation.
The joint PMF for discrete variables X and Y is defined as P_XY(j, k) = _________.
P(X = j and Y = k)
What is R_XY in joint distributions?
a) Range of X only
b) All possible pairs (j, k) where P_XY(j, k) > 0
c) Marginal of Y
d) Independent distribution
b) All possible pairs (j, k) where P_XY(j, k) > 0
What is the formula for joint CDF of discrete variables X and Y?
F_XY(j, k) = P(X ≤ j, Y ≤ k)
True/False
The joint CDF can be expressed as the probability of an intersection of events: P((X ≤ j) ∩ (Y ≤ k))
True
What is the formula for the joint PDF for continuous variables X and Y?
f_XY(x, y) = lim as dx, dy → 0 of P(x ≤ X ≤ x + dx, y ≤ Y ≤ y + dy) / (dx · dy)
True/False
For all x and y, f_XY(x, y) must be greater than or equal to 0.
True
How do you obtain the joint PDF from the joint CDF?
Take the mixed second partial derivative of the joint CDF: f_XY(x, y) = ∂²F_XY(x, y) / ∂x∂y
How do you find the joint CDF of X and Y from the joint PDF?
Integrate the joint PDF over both dummy variables: F_XY(x, y) = ∫ from −∞ to x ∫ from −∞ to y of f_XY(u, v) dv du
The joint cumulative distribution function satisfies 0 ≤ F_XY(x, y) ≤ ___.
1
Which of the following is true?
a) F_XY(−∞, y) = 1
b) F_XY(x, −∞) = 1
c) F_XY(∞, ∞) = 1
d) F_XY(−∞, −∞) = 1
c) F_XY(∞, ∞) = 1
In joint distribution graphs, what does the shaded area under the curve represent?
The probability that the point (X, Y) falls within the shaded region.
What is a marginal distribution?
The distribution of one variable considered separately, ignoring any dependency on the other.
True/False
Marginal distributions take into account the relationship between two variables.
False
What is the marginal PMF of X from a joint PMF P_XY(j, k)?
P_X(j) = ∑ P_XY(j, k) over k
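With a joint PMF stored as a 2-D array, this marginalization is just a row or column sum. A minimal sketch; the table values are made up for illustration:

import numpy as np

# Joint PMF P_XY(j, k): rows index j (values of X), columns index k (values of Y)
P = np.array([[0.10, 0.20],
              [0.30, 0.40]])
assert np.isclose(P.sum(), 1.0)

P_X = P.sum(axis=1)   # marginal of X: sum over k
P_Y = P.sum(axis=0)   # marginal of Y: sum over j
print(P_X, P_Y)       # [0.3 0.7] and [0.4 0.6]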
What is the marginal PDF of X from a joint PDF f_XY(x, y)?
f_X(x) = ∫ from -∞ to +∞ of f_XY(x, y) dy
X and Y are independent if f_XY(x, y) = _________.
f_X(x) · f_Y(y)
Which condition confirms independence for continuous variables?
a) Cov(X, Y) = 0
b) F_XY(x, y) = F_X(x) · F_Y(y)
c) ρ(X, Y) = 1
d) Var(X + Y) = 0
b) F_XY(x, y) = F_X(x) · F_Y(y)
What is P(A|B) for events A and B with P(B) > 0?
P(A | B) = P(A ∩ B) / P(B)
The conditional CDF of X given event A is denoted by _________.
F_X|A(x)
Conditional = Joint divided by __________.
Marginal
True/False
If X and Y are independent, then f_X|Y(x | y) = f_X(x).
True
To check independence of X and Y, verify whether P_XY(j, k) = P_X(j) · ________.
P_Y(k)
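In code, independence of a discrete pair means the joint PMF equals the outer product of its marginals. A minimal sketch; the example table is made up:

import numpy as np

P = np.array([[0.10, 0.20],   # candidate joint PMF P_XY
              [0.30, 0.40]])
P_X = P.sum(axis=1)           # marginal of X
P_Y = P.sum(axis=0)           # marginal of Y

# Independent iff P_XY(j, k) = P_X(j) · P_Y(k) holds in every cell
print(np.allclose(P, np.outer(P_X, P_Y)))   # False here → X and Y are dependent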
True/False
If two variables are uncorrelated, they are always independent.
False
What is an example of uncorrelated variables that are not independent?
For example, X ∼ N(0, 1) and Y = X²: Cov(X, Y) = E[X³] − E[X]·E[X²] = 0, so they are uncorrelated, yet Y is completely determined by X, so the joint PDF is not the product of the marginals.
What is the formula for conditional expectation E[X | Y = y]?
E[X | Y = y] = ∑ xᵢ · P_X|Y(xᵢ | y)
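Computed from a joint PMF table, the conditional PMF is a normalized column, and E[X | Y = y] is its weighted average. A minimal sketch; the values of X and the table are illustrative:

import numpy as np

x_vals = np.array([0.0, 1.0])
P = np.array([[0.10, 0.20],    # joint PMF: rows = values of X, columns = values of Y
              [0.30, 0.40]])

P_Y = P.sum(axis=0)            # marginal of Y
P_X_given_Y = P / P_Y          # conditional = joint / marginal, column by column
E_X_given_Y = x_vals @ P_X_given_Y   # ∑ xᵢ · P_X|Y(xᵢ | y), one value per y
print(E_X_given_Y)             # [0.75, 0.667]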
The law of total probability says P(A) = ∑ P(A | Bᵢ) · ________.
P(Bᵢ)
How is the marginal PMF P_X(x) expressed using conditional probability?
P_X(x) = ∑ P_X|Y(x | yⱼ) · P_Y(yⱼ)
What is the Law of Total Expectation for discrete Y?
E[X] = ∑ E[X | Y = yⱼ] · P_Y(yⱼ)
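Continuing the small table from above, the law can be verified directly. A minimal sketch with the same illustrative joint PMF:

import numpy as np

x_vals = np.array([0.0, 1.0])
P = np.array([[0.10, 0.20],
              [0.30, 0.40]])
P_Y = P.sum(axis=0)
E_X_given_Y = x_vals @ (P / P_Y)       # conditional means, one per value of Y

print(E_X_given_Y @ P_Y)               # ∑ E[X | Y = yⱼ] · P_Y(yⱼ) ≈ 0.7
print(x_vals @ P.sum(axis=1))          # direct E[X] from the marginal of X — same 0.7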
What does the Law of Total Expectation express?
That the overall expectation of X can be computed by weighting conditional expectations with their probabilities.
What is the Law of the Unconscious Statistician (LOTUS) for two discrete variables?
E[g(X, Y)] = ∑ g(xᵢ, yⱼ) · P_XY(xᵢ, yⱼ) over (xᵢ, yⱼ)
What is the Law of Iterated Expectations?
E[X] = E[E[X | Y]]
What is the Law of Total Variance?
Var(X) = E[Var(X | Y)] + Var(E[X | Y])
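A simulation check of the decomposition, using a simple mixture where Y picks which normal X is drawn from. A minimal sketch; the two component distributions are illustrative:

import numpy as np

rng = np.random.default_rng(4)
n = 500_000
y = rng.integers(0, 2, size=n)                   # Y ∈ {0, 1}, each with probability 1/2
x = rng.normal(loc=np.where(y == 0, 0.0, 3.0),   # X | Y=0 ~ N(0, 1)
               scale=np.where(y == 0, 1.0, 2.0)) # X | Y=1 ~ N(3, 4)

# Equal weights below are valid because P(Y=0) = P(Y=1) = 1/2
within = np.mean([x[y == k].var() for k in (0, 1)])    # ≈ E[Var(X | Y)] = 2.5
between = np.var([x[y == k].mean() for k in (0, 1)])   # ≈ Var(E[X | Y]) = 2.25
print(x.var(), within + between)                       # both ≈ 4.75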
The conditional variance Var(X | Y = y) is equal to E[X² | Y = y] minus ________.
(E[X | Y = y])²
If X and Y are independent, what is E[XY]?
E[XY] = E[X] · E[Y]
If N is a random number of i.i.d. terms X₁, X₂, …, with N independent of the Xᵢ's, then E[X₁ + ⋯ + X_N] = ________.
E[X] · E[N]
Conditional expectation for continuous variables is:
E[X | Y = y] = ∫ x · ________ dx
f_X|Y(x | y)
What is the formula for the transformed joint PDF using the Jacobian?
f_ZW(z, w) = f_XY(h₁(z, w), h₂(z, w)) · |J|, where x = h₁(z, w), y = h₂(z, w) is the inverse transformation and |J| is the absolute value of its Jacobian determinant
What does the Jacobian determinant measure in variable transformation?
How areas (or volumes) are scaled under the transformation.
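A symbolic check for the concrete change of variables Z = X + Y, W = X − Y, a minimal sketch using sympy; this particular transformation is chosen only for illustration:

import sympy as sp

z, w = sp.symbols('z w')
h1 = (z + w) / 2    # inverse transform: x = h₁(z, w)
h2 = (z - w) / 2    # inverse transform: y = h₂(z, w)

J = sp.Matrix([[sp.diff(h1, z), sp.diff(h1, w)],
               [sp.diff(h2, z), sp.diff(h2, w)]])
print(sp.Abs(J.det()))   # 1/2: the factor |J| that multiplies f_XY in f_ZW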
If Z = X + Y, then what is f_Z(z)?
f_Z(z) = ∫ from −∞ to ∞ of f_XY(w, z − w) dw
If X and Y are independent, f_Z(z) is the ________ of f_X and f_Y.
convolution
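The convolution can be evaluated numerically on a grid; for two independent Uniform(0, 1) variables it produces the familiar triangular density. A minimal sketch; the grid step is chosen for illustration:

import numpy as np

dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f_X = np.ones_like(grid)            # Uniform(0, 1) density
f_Y = np.ones_like(grid)

f_Z = np.convolve(f_X, f_Y) * dx    # discrete approximation of ∫ f_X(w) f_Y(z − w) dw
z = np.arange(len(f_Z)) * dx
print(f_Z[np.searchsorted(z, 1.0)]) # ≈ 1.0: the triangular density peaks at z = 1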
What is the formula for Var(aX + bY)?
a²Var(X) + b²Var(Y) + 2abCov(X, Y)
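A quick simulation check of this identity. A minimal sketch; the correlated pair and the constants a, b are illustrative:

import numpy as np

rng = np.random.default_rng(8)
x = rng.normal(size=300_000)
y = 0.6 * x + rng.normal(size=300_000)   # correlated with x
a, b = 2.0, -1.0

lhs = np.var(a * x + b * y)
rhs = a**2 * np.var(x) + b**2 * np.var(y) + 2 * a * b * np.cov(x, y)[0, 1]
print(lhs, rhs)   # both ≈ 2.96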
True/False
The sum of two independent Poisson variables is also Poisson-distributed.
True
If I ∼ Pois(λ) and J ∼ Pois(μ) are independent, then I + J ∼ _________.
Pois(λ + μ)
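Since a Pois(λ + μ) variable has mean and variance both equal to λ + μ, a simulation gives a quick check. A minimal sketch; λ and μ are illustrative:

import numpy as np

rng = np.random.default_rng(5)
lam, mu = 2.0, 5.0
s = rng.poisson(lam, 100_000) + rng.poisson(mu, 100_000)   # independent draws

print(s.mean(), s.var())   # both ≈ λ + μ = 7, as expected for Pois(7)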
As Poisson parameters increase, what happens to the PMF?
a) It flattens
b) It becomes more spread
c) It shifts right
d) It becomes a delta function
c) It shifts right
If X ∼ Gamma(a₁, β) and Y ∼ Gamma(a₂, β) are independent, what is X + Y?
Gamma(a₁ + a₂, β)
If X ∼ N(μ₁, σ₁²) and Y ∼ N(μ₂, σ₂²) are independent, then X + Y ∼ ?
N(μ₁+ μ₂, σ₁² + σ₂²)
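Note that the variances add, not the standard deviations. A simulation check, a minimal sketch with illustrative parameters:

import numpy as np

rng = np.random.default_rng(6)
s = rng.normal(1.0, 2.0, 200_000) + rng.normal(-3.0, 1.5, 200_000)   # independent

# X + Y ~ N(1 + (−3), 2² + 1.5²) = N(−2, 6.25)
print(s.mean(), s.var())   # ≈ −2 and ≈ 6.25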
True/False
The sum of two independent Normal variables is also Normally distributed.
True
What is the condition for two variables X and Y to be jointly normal?
If aX + bY is normally distributed for all real a, b.
True/False
If X and Y are independent normal variables, they are jointly normal.
True
What is the PDF of a standard bivariate normal distribution with correlation ρ?
f_XY(x, y) = (1 / (2π√(1 − ρ²))) · exp[−(1 / (2(1 − ρ²))) · (x² − 2ρxy + y²)]
What does Theorem 3 state about uncorrelated bivariate normal variables?
If X and Y are bivariate normal and uncorrelated, then they are independent.
What is E[Y | X = x] for bivariate normal variables?
E[Y | X = x] = μ_Y + ρ(σ_Y / σ_X)(x − μ_X)
What is Var(Y | X = x) for bivariate normal variables?
(1 − ρ²)σ_Y²
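Both conditional-moment formulas can be checked by sampling a bivariate normal and looking at Y near a fixed X = x₀. A minimal sketch; all parameters are illustrative:

import numpy as np

rng = np.random.default_rng(7)
mu_x, mu_y, sigma_x, sigma_y, rho = 0.0, 1.0, 1.0, 2.0, 0.8
cov = [[sigma_x**2, rho * sigma_x * sigma_y],
       [rho * sigma_x * sigma_y, sigma_y**2]]
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=500_000)

x0 = 1.0
near = xy[np.abs(xy[:, 0] - x0) < 0.05, 1]    # Y-values from samples with X ≈ x0
print(near.mean(), mu_y + rho * (sigma_y / sigma_x) * (x0 - mu_x))   # both ≈ 2.6
print(near.var(), (1 - rho**2) * sigma_y**2)                          # both ≈ 1.44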