Topic 5 Multiple Variables Flashcards

1
Q

What is the purpose of a joint distribution in a bivariate analysis?

A

To examine the relationship and combined behavior of two variables.

2
Q

A joint distribution helps answer:
a) Whether variables exist
b) If E[X] = E[Y]
c) To what extent two variables are related
d) Which variable is more important

A

c) To what extent two variables are related

3
Q

What is the formula for covariance?

A

Cov(X, Y) = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y]
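
A quick numerical check of this identity, as a minimal sketch; the simulated data below (and the 0.5 factor linking y to x) are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.5 * x + rng.normal(size=100_000)   # y is partly driven by x

# Cov(X, Y) via the definition E[(X - E[X])(Y - E[Y])]
cov_def = np.mean((x - x.mean()) * (y - y.mean()))

# Cov(X, Y) via the shortcut E[XY] - E[X]E[Y]
cov_shortcut = np.mean(x * y) - x.mean() * y.mean()

print(cov_def, cov_shortcut)   # both ≈ 0.5 for this construction
```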

4
Q

True/False
If X = Y, then Cov(X, Y) = Var(X)

A

True

5
Q

What does a positive covariance indicate?

A

That X and Y tend to vary in the same direction.

6
Q

If Cov(X, Y) < 0, then:
a) X and Y increase together
b) X and Y are independent
c) X is likely less than E[X] when Y is greater than E[Y]
d) X and Y are always equal

A

c) X is likely less than E[X] when Y is greater than E[Y]

7
Q

What is Cov(cX, Y)?

A

c · Cov(X, Y)

8
Q

True/False
Cov(X, c) = 0 for any constant c

A

True

9
Q

Cov(X + Y, Z) = Cov(X, Z) + _________.

A

Cov(Y, Z)

10
Q

What does a covariance matrix represent?

A

It collects the variances of the individual variables (on its diagonal) and the pairwise covariances between them (off the diagonal).
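
As an illustrative sketch (the data are simulated and arbitrary), np.cov returns exactly this matrix, with variances on the diagonal and covariances off it:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = 2 * x + rng.normal(size=50_000)
z = rng.normal(size=50_000)              # unrelated to x and y

data = np.vstack([x, y, z])              # one row per random variable
cov_matrix = np.cov(data)                # 3x3: Var on the diagonal, Cov off-diagonal

print(np.round(cov_matrix, 2))
# ≈ [[1, 2, 0],
#    [2, 5, 0],
#    [0, 0, 1]]
```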

11
Q

What is the formula for the correlation coefficient?

A

ρ(X, Y) = Cov(X, Y) / (σ_X * σ_Y)
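
A minimal sketch checking that np.corrcoef agrees with Cov(X, Y)/(σ_X σ_Y) on simulated data (the data and the −1 slope are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=50_000)
y = -x + 0.5 * rng.normal(size=50_000)       # strongly negatively related to x

cov_xy = np.cov(x, y)[0, 1]
rho_manual = cov_xy / (x.std(ddof=1) * y.std(ddof=1))
rho_numpy = np.corrcoef(x, y)[0, 1]

print(rho_manual, rho_numpy)                 # both ≈ -0.89
```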

12
Q

True/False
If ρ(X, Y) = −1, then Y is a decreasing linear function of X.

A

True

13
Q

If X and Y are uncorrelated, what is true about their variances?
a) They are equal
b) Var(X + Y) = Var(X) + Var(Y)
c) Cov(X, Y) = 1
d) E[X + Y] = 0

A

b) Var(X + Y) = Var(X) + Var(Y)

14
Q

If X₁, X₂, …, Xₙ are pairwise uncorrelated, then Var(X₁ + … + Xₙ) = ____________.

A

Var(X₁) + Var(X₂) + … + Var(Xₙ)
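
A simulation sketch of this additivity for three independent (hence pairwise uncorrelated) variables; the particular distributions are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Three independent variables with different distributions
x1 = rng.normal(0, 2, size=n)       # Var = 4
x2 = rng.uniform(0, 1, size=n)      # Var = 1/12
x3 = rng.exponential(3, size=n)     # Var = 9

total = x1 + x2 + x3
print(total.var())                          # ≈ 13.08
print(x1.var() + x2.var() + x3.var())       # ≈ 13.08 as well
```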

15
Q

True/False
Corr(X, Y) = 0 always implies X and Y are independent.

A

False

16
Q

What is Pearson’s correlation coefficient?

A

A measure of linear dependence between two continuous variables.

17
Q

A correlation coefficient of 0.975 indicates that the two variables are ________.

A

highly correlated

18
Q

What are spurious correlations?

A

Correlations between two variables that are statistically related but not causally linked.

19
Q

True/False
High correlation always means one variable causes the other.

A

False — correlation ≠ causation.

20
Q

The joint PMF for discrete variables X and Y is defined as P_XY(j, k) = _________.

A

P(X = j and Y = k)

21
Q

What is R_XY in joint distributions?
a) Range of X only
b) All possible pairs (j, k) where PXY(j, k) > 0
c) Marginal of Y
d) Independent distribution

A

b) All possible pairs (j, k) where PXY(j, k) > 0

22
Q

What is the formula for joint CDF of discrete variables X and Y?

A

F_XY(j, k) = P(X ≤ j, Y ≤ k)

23
Q

True/False
Joint CDF can be expressed as an intersection of events: P((X ≤ j) ∩ (Y ≤ k))

A

True

24
Q

What is the formula for the joint PDF for continuous variables X and Y?

A

f_XY(x, y) = P(x ≤ X ≤ x + dx, y ≤ Y ≤ y + dy) / (dx * dy)

25
Q

True/False
For all x and y, fXY(x, y) must be greater than or equal to 0.

A

True

26
Q

How do you obtain the joint PDF from the joint CDF?

A

Take the mixed second partial derivative of the joint CDF: f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y

27
Q

How to find joint CDF of X and Y from joint PDF?

A

Integrate the joint PDF over both variables using dummy variables: F_XY(x, y) = ∫ from -∞ to x ∫ from -∞ to y of f_XY(u, v) dv du
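
As a numerical sketch of that double integral, assuming an illustrative joint PDF (two independent Exponential(1) variables), scipy's dblquad recovers the joint CDF:

```python
import numpy as np
from scipy.integrate import dblquad

# Illustrative joint PDF: X and Y independent Exponential(1), so f_XY(u, v) = e^(-u) e^(-v) for u, v >= 0
def f_xy(u, v):
    return np.exp(-u) * np.exp(-v)

def F_xy(x, y):
    # F_XY(x, y) = double integral of f_XY(u, v) over 0 <= u <= x, 0 <= v <= y
    val, _ = dblquad(lambda v, u: f_xy(u, v), 0, x, lambda u: 0, lambda u: y)
    return val

print(F_xy(1.0, 2.0))                              # ≈ 0.5466
print((1 - np.exp(-1.0)) * (1 - np.exp(-2.0)))     # closed form for this PDF, same value
```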

28
Q

The joint cumulative distribution function satisfies 0 ≤ FXY(x, y) ≤ ___.

A

1

29
Q

Which of the following is true?
a) FXY(−∞, y) = 1
b) FXY(x, −∞) = 1
c) FXY(∞, ∞) = 1
d) FXY(−∞, −∞) = 1

A

c) FXY(∞, ∞) = 1

30
Q

In joint distribution graphs, what does the shaded area under the curve represent?

A

The probability that the point (X, Y) falls within the shaded region.

31
Q

What is a marginal distribution?

A

The distribution of one variable considered separately, ignoring any dependency on the other.

32
Q

True/False
Marginal distributions take into account the relationship between two variables.

A

False. A marginal distribution ignores the other variable entirely.

33
Q

What is the marginal PMF of X from a joint PMF PXY(j, k)?

A

P_X(j) = ∑ P_XY(j, k) over k
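
A small sketch with a made-up 2×3 joint PMF table: summing over k (the columns) recovers the marginal PMF of X, and summing over j recovers that of Y:

```python
import numpy as np

# Illustrative joint PMF P_XY(j, k): rows index j (values of X), columns index k (values of Y)
P_XY = np.array([[0.10, 0.20, 0.10],
                 [0.30, 0.15, 0.15]])

P_X = P_XY.sum(axis=1)   # sum over k for each j  -> marginal PMF of X
P_Y = P_XY.sum(axis=0)   # sum over j for each k  -> marginal PMF of Y

print(P_X)   # [0.4 0.6]
print(P_Y)   # [0.4  0.35 0.25]
```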

34
Q

What is the marginal PDF of X from a joint PDF fXY(x, y)?

A

f_X(x) = ∫ from -∞ to +∞ of f_XY(x, y) dy

35
Q

X and Y are independent if fXY(x, y) = _________.

A

fX(x) · fY(y)

36
Q

Which condition confirms independence for continuous variables?
a) Cov(X, Y) = 0
b) FXY(x, y) = FX(x) · FY(y)
c) ρ(X, Y) = 1
d) Var(X + Y) = 0

A

b) FXY(x, y) = FX(x) · FY(y)

37
Q

What is P(A|B) for events A and B with P(B) > 0?

A

P(A | B) = P(A ∩ B) / P(B)

38
Q

The conditional CDF of X given event A is denoted by _________.

A

F_X|A(x) = P(X ≤ x | A)

39
Q

Conditional = Joint divided by __________.

A

Marginal

40
Q

True/False
If X and Y are independent, then fX|Y(x|y) = fX(x).

A

True

41
Q

To check independence of X and Y, verify whether PXY(j,k) = PX(j) · ________.

A

PY(k)

42
Q

True/False
If two variables are uncorrelated, they are always independent.

A

False. Uncorrelated does not imply independent.

43
Q

What is an example of uncorrelated variables that are not independent?

A

For example, X uniform on {−1, 0, 1} with Y = X²: then E[XY] = E[X³] = 0 = E[X]E[Y], so Cov(X, Y) = 0, yet Y is a deterministic function of X, so the joint PMF is not the product of the marginals.
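
A quick check of that example in code; the uniform support {−1, 0, 1} is the standard construction:

```python
import numpy as np

# X takes values -1, 0, 1 with probability 1/3 each; Y = X^2
x_vals = np.array([-1, 0, 1])
p = np.array([1/3, 1/3, 1/3])
y_vals = x_vals ** 2

E_X = np.sum(p * x_vals)              # 0
E_Y = np.sum(p * y_vals)              # 2/3
E_XY = np.sum(p * x_vals * y_vals)    # E[X^3] = 0

print(E_XY - E_X * E_Y)               # 0.0 -> uncorrelated

# But not independent: P(X = 0 and Y = 1) = 0, while P(X = 0) * P(Y = 1) = 1/3 * 2/3
print(0, (1/3) * (2/3))
```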

44
Q

What is the formula for conditional expectation E[X | Y = y]?

A

E[X | Y = y] = ∑ xᵢ · P_X|Y(xᵢ | y)

45
Q

The law of total probability says P(A) = ∑ P(A|Bi) · ________.

A

P(Bi)

46
Q

How is the marginal PMF PX(x) expressed using conditional probability?

A

P_X(x) = ∑ P_X|Y(x | yⱼ) · P_Y(yⱼ)

47
Q

What is the Law of Total Expectation for discrete Y?

A

E[X] = ∑ E[X | Y = yⱼ] · P_Y(yⱼ)
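
A tiny worked example (the joint PMF below is made up) computing E[X] both from the marginal of X and by weighting the conditional expectations:

```python
import numpy as np

# Illustrative joint PMF over X in {0, 1, 2} (rows) and Y in {0, 1} (columns)
P_XY = np.array([[0.10, 0.05],
                 [0.20, 0.25],
                 [0.10, 0.30]])
x_vals = np.array([0, 1, 2])

P_Y = P_XY.sum(axis=0)                                     # marginal PMF of Y
E_X_given_Y = (x_vals[:, None] * P_XY).sum(axis=0) / P_Y   # E[X | Y = y] for each y

E_X_total = np.sum(E_X_given_Y * P_Y)            # law of total expectation
E_X_direct = np.sum(x_vals * P_XY.sum(axis=1))   # directly from the marginal of X

print(E_X_total, E_X_direct)                     # both 1.25
```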

48
Q

What does the Law of Total Expectation express?

A

That the overall expectation of X can be computed by weighting conditional expectations with their probabilities.

49
Q

What is the Law of the Unconscious Statistician (LOTUS) for two discrete variables?

A

E[g(X, Y)] = ∑ g(xᵢ, yⱼ) · P_XY(xᵢ, yⱼ) over (xᵢ, yⱼ)

50
Q

What is the Law of Iterated Expectations?

A

E[X] = E[E[X | Y]]

51
Q

What is the Law of Total Variance?

A

Var(X) = E[Var(X | Y)] + Var(E[X | Y])
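
A simulation sketch of this decomposition, using an arbitrary two-stage model in which Y picks a group and X is Normal within that group:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

# Y chooses a group; within group y, X ~ Normal(means[y], stds[y]^2)
means = np.array([0.0, 3.0])
stds = np.array([1.0, 2.0])
p_y = np.array([0.5, 0.5])

y = rng.choice([0, 1], size=n, p=p_y)
x = rng.normal(means[y], stds[y])

# Right-hand side from the known conditional moments
E_var = np.sum(p_y * stds**2)                            # E[Var(X | Y)]  = 2.5
var_E = np.sum(p_y * means**2) - np.sum(p_y * means)**2  # Var(E[X | Y])  = 2.25

print(x.var())          # ≈ 4.75
print(E_var + var_E)    # 4.75
```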

52
Q

The conditional variance Var(X | Y = y) is equal to E[X² | Y = y] minus ________.

A

(E[X | Y = y])²

53
Q

If X and Y are independent, what is E[XY]?

A

E[XY] = E[X] · E[Y]

54
Q

If N is a random variable counting i.i.d. Xᵢ's (independent of N), then E[∑Xᵢ] = ________.

A

E[X] · E[N]

55
Q

Conditional expectation for continuous variables is:
E[X | Y = y] = ∫ x · ________ dx

A

fX|Y(x | y)

56
Q

What is the formula for transformed joint PDF using Jacobian?

A

f_ZW(z, w) = f_XY(h₁(z, w), h₂(z, w)) · |J|
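
A symbolic sketch with sympy for one concrete choice of transformation, Z = X + Y and W = X − Y; the inverse maps h₁, h₂ and the resulting |J| = 1/2 are specific to this choice:

```python
import sympy as sp

z, w = sp.symbols('z w')

# Inverse transformation: x = h1(z, w), y = h2(z, w) for Z = X + Y, W = X - Y
h1 = (z + w) / 2
h2 = (z - w) / 2

J = sp.Matrix([h1, h2]).jacobian([z, w])   # matrix of partial derivatives
print(sp.Abs(J.det()))                     # 1/2

# So f_ZW(z, w) = f_XY((z + w)/2, (z - w)/2) * 1/2
```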

57
Q

What does the Jacobian determinant measure in variable transformation?

A

How areas (or volumes) are scaled under the transformation.

58
Q

If Z = X + Y, then what is fZ(z)?

A

f_Z(z) = ∫ from -∞ to ∞ of f_XY(w, z − w) dw

59
Q

If X and Y are independent, fZ(z) is the ________ of fX and fY.

A

convolution
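
For independent discrete variables the same idea applies to PMFs, and np.convolve computes the convolution directly; the two PMFs below are made up:

```python
import numpy as np

# PMFs of two independent discrete variables X and Y on supports {0, 1, 2} and {0, 1}
p_x = np.array([0.2, 0.5, 0.3])
p_y = np.array([0.6, 0.4])

p_z = np.convolve(p_x, p_y)   # PMF of Z = X + Y on support {0, 1, 2, 3}
print(p_z)                    # [0.12 0.38 0.38 0.12]
print(p_z.sum())              # 1.0
```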

60
Q

What is the formula for Var(aX + bY)?

A

a²Var(X) + b²Var(Y) + 2abCov(X, Y)

61
Q

True/False
The sum of two independent Poisson variables is also Poisson-distributed.

A

True

62
Q

If I ∼ Pois(λ) and J ∼ Pois(μ) are independent, then I + J ∼ _________.

A

Pois(λ + μ)
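
A quick simulation sketch (λ = 2 and μ = 3 are arbitrary): the sum of the two samples has mean and variance close to λ + μ, as a Pois(λ + μ) variable should:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
lam, mu = 2.0, 3.0

i = rng.poisson(lam, size=n)
j = rng.poisson(mu, size=n)
s = i + j

# For a Poisson(lam + mu) variable, the mean and variance both equal lam + mu = 5
print(s.mean(), s.var())   # both ≈ 5.0
```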

63
Q

As Poisson parameters increase, what happens to the PMF?
a) It flattens
b) It becomes more spread
c) It shifts right
d) It becomes a delta function

A

c) It shifts right

64
Q

If X ∼ Gamma(a₁, β) and Y ∼ Gamma(a₂, β) are independent, what is X + Y?

A

Gamma(a₁ + a₂, β)

65
Q

If X ∼ N(μ₁, σ₁²) and Y ∼ N(μ₂, σ₂²) are independent, then X + Y ∼ ?

A

N(μ₁+ μ₂, σ₁² + σ₂²)

66
Q

True/False
The PDF of the sum of two Normal variables is also a Normal distribution.

A

True (for independent or jointly normal X and Y)

67
Q

What is the condition for two variables X and Y to be jointly normal?

A

If aX + bY is normally distributed for all real a, b.

68
Q

True/False
If X and Y are independent normal variables, they are jointly normal.

A

True

69
Q

What is the PDF of a standard bivariate normal distribution with correlation ρ?

A

f_XY(x, y) = (1 / (2π√(1 − ρ²))) · exp[−(1 / (2(1 − ρ²))) · (x² − 2ρxy + y²)]

70
Q

What does Theorem 3 state about uncorrelated bivariate normal variables?

A

If X and Y are bivariate normal and uncorrelated, then they are independent.

71
Q

What is E[Y | X = x] for bivariate normal variables?

A

E[Y | X = x] = μ_Y + ρ(σ_Y / σ_X)(x − μ_X)

72
Q

What is Var(Y | X = x) for bivariate normal variables?

A

(1 − ρ²)σ_Y²
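
A simulation sketch of both conditional-moment formulas, with arbitrary parameters, conditioning on samples where X is near a chosen x:

```python
import numpy as np

rng = np.random.default_rng(6)
mu_x, mu_y = 1.0, 2.0
sigma_x, sigma_y, rho = 2.0, 3.0, 0.7

cov = np.array([[sigma_x**2, rho * sigma_x * sigma_y],
                [rho * sigma_x * sigma_y, sigma_y**2]])
xy = rng.multivariate_normal([mu_x, mu_y], cov, size=1_000_000)
x, y = xy[:, 0], xy[:, 1]

x0 = 3.0
near = np.abs(x - x0) < 0.05          # condition on X ≈ x0

print(y[near].mean())                 # ≈ mu_y + rho * (sigma_y / sigma_x) * (x0 - mu_x) = 4.1
print(y[near].var())                  # ≈ (1 - rho**2) * sigma_y**2 = 4.59
```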