test2qna Flashcards

1
Q

What is a random variable?

A

A random variable is a function that assigns a real number to each outcome in a sample space.

2
Q

What are the two types of random variables?

A

Discrete random variables take countable (often finite) values, while continuous random variables take values in an interval (or collection of intervals).

3
Q

What is the probability mass function (PMF)?

A

For a discrete random variable X, the PMF p(x) gives P(X = x) for each value x.

4
Q

What is the probability density function (PDF)?

A

For a continuous random variable X, the PDF f(x) satisfies P(a ≤ X ≤ b) = ∫ₐᵇ f(x) dx, with f(x) ≥ 0 and ∫₋∞∞ f(x) dx = 1.

5
Q

What is the cumulative distribution function (CDF)?

A

The CDF F(x) = P(X ≤ x) gives the probability that a random variable X takes on a value less than or equal to x.

6
Q

What are the properties of the CDF?

A

The CDF is non-decreasing, right-continuous, and satisfies limₓ→₋∞ F(x) = 0 and limₓ→∞ F(x) = 1.

7
Q

How is the expected value (mean) defined?

A

For a discrete random variable, E[X] = Σ x · P(X=x); for a continuous variable, E[X] = ∫₋∞∞ x · f(x) dx.

8
Q

How is the variance defined?

A

Variance is Var(X) = E[(X – E[X])²] = E[X²] – (E[X])².

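The identity can be checked numerically. A minimal Python sketch, using a made-up three-point distribution (the PMF values are illustrative, not from the card):

```python
# Check Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2 for a small
# discrete distribution. The PMF below is a made-up example.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

mean = sum(x * p for x, p in pmf.items())               # E[X]
second_moment = sum(x**2 * p for x, p in pmf.items())   # E[X^2]

var_shortcut = second_moment - mean**2                  # E[X^2] - (E[X])^2
var_definition = sum((x - mean)**2 * p for x, p in pmf.items())  # E[(X - E[X])^2]

assert abs(var_shortcut - var_definition) < 1e-12
```

Both forms give the same number; the shortcut form is usually easier to compute by hand.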
9
Q

What is a moment generating function (MGF)?

A

The MGF of X is Mₓ(t) = E[e^(tX)], which, by differentiating with respect to t and evaluating at t = 0, can be used to derive moments (mean, variance, etc.).

10
Q

What is the Law of the Unconscious Statistician (LOTUS)?

A

LOTUS states that E[g(X)] = Σ g(x)P(X=x) for discrete variables or E[g(X)] = ∫ g(x) f(x) dx for continuous variables—no need to find the distribution of g(X) first.

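A small Python sketch of LOTUS, with a hypothetical four-point distribution and g(x) = x², comparing the LOTUS sum against first deriving the distribution of g(X):

```python
# LOTUS: E[g(X)] = sum of g(x) * P(X = x) over the support of X,
# without finding the distribution of g(X). Hypothetical PMF for illustration.
pmf = {-1: 0.25, 0: 0.25, 1: 0.25, 2: 0.25}
g = lambda x: x * x

# LOTUS: sum g(x) P(X=x) directly.
lotus = sum(g(x) * p for x, p in pmf.items())

# Long way: build the PMF of Y = g(X) first, then take E[Y].
pmf_y = {}
for x, p in pmf.items():
    pmf_y[g(x)] = pmf_y.get(g(x), 0.0) + p   # distinct x values can collide in Y
long_way = sum(y * p for y, p in pmf_y.items())

assert abs(lotus - long_way) < 1e-12
```

Note that x = −1 and x = 1 collapse to the same value of Y, which is exactly the bookkeeping LOTUS lets you skip.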
11
Q

What is a characteristic function?

A

The characteristic function φ_X(t) = E[e^(itX)] uniquely determines the distribution of X and is useful for studying convergence in distribution.

12
Q

What is linearity of expectation?

A

For any random variables X and Y and constants a, b, E[aX + bY] = aE[X] + bE[Y] (no independence required).

13
Q

What is a Bernoulli random variable?

A

A Bernoulli random variable X takes two values: 0 (failure) with probability 1-p and 1 (success) with probability p, where 0 ≤ p ≤ 1.

14
Q

What is the PMF of a Bernoulli random variable?

A

P(X=0) = 1 - p and P(X=1) = p.

15
Q

What are the expectation and variance of a Bernoulli random variable?

A

E[X] = p and Var(X) = p(1 - p).

16
Q

What is a Binomial random variable?

A

A Binomial random variable X counts the number of successes in n independent Bernoulli trials with success probability p.

17
Q

What is the PMF of a Binomial random variable?

A

P(X=i) = C(n, i) · p^i · (1-p)^(n-i), for i = 0, 1, …, n.

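This PMF is easy to compute with Python's `math.comb`. A sketch with example parameters n = 10 and p = 0.3, checking that the probabilities sum to 1 and that the mean is np:

```python
from math import comb

def binom_pmf(i, n, p):
    """P(X = i) = C(n, i) * p^i * (1-p)^(n-i)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

n, p = 10, 0.3   # example parameters
probs = [binom_pmf(i, n, p) for i in range(n + 1)]

assert abs(sum(probs) - 1) < 1e-12              # a valid PMF sums to 1
mean = sum(i * q for i, q in enumerate(probs))
assert abs(mean - n * p) < 1e-12                # matches E[X] = np
```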
18
Q

What are the expectation and variance of a Binomial(n, p) distribution?

A

E[X] = np and Var(X) = np(1 - p).

19
Q

What is a Poisson random variable?

A

A Poisson random variable X takes values 0, 1, 2, … with PMF: P(X=i) = (e^(–λ) · λ^i) / i! where λ > 0 is the rate parameter.

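A Python sketch of this PMF, with an example rate λ = 4 and the infinite support truncated where the tail is negligible, checking that E[X] = Var(X) = λ:

```python
from math import exp, factorial

def poisson_pmf(i, lam):
    """P(X = i) = e^(-lam) * lam^i / i!"""
    return exp(-lam) * lam**i / factorial(i)

lam = 4.0   # example rate
# Truncate the infinite support; for lam = 4 the tail beyond 100 is negligible.
probs = [poisson_pmf(i, lam) for i in range(100)]

total = sum(probs)
mean = sum(i * p for i, p in enumerate(probs))
second = sum(i * i * p for i, p in enumerate(probs))

assert abs(total - 1) < 1e-9
assert abs(mean - lam) < 1e-9               # E[X] = lam
assert abs(second - mean**2 - lam) < 1e-9   # Var(X) = lam
```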
20
Q

What are the expectation and variance of a Poisson(λ) distribution?

A

E[X] = λ and Var(X) = λ.

21
Q

What is an Exponential random variable?

A

An Exponential random variable with parameter λ models waiting times and has PDF: f(x) = λe^(–λx) for x ≥ 0.

22
Q

What are the expectation and variance of an Exponential(λ) distribution?

A

E[X] = 1/λ and Var(X) = 1/λ².

23
Q

What is the memoryless property of the Exponential distribution?

A

P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0: given that you have already waited s units without the event occurring, the remaining waiting time has the same Exponential distribution as a fresh start.

24
Q

What is the Normal distribution?

A

A normal (or Gaussian) random variable X ~ N(µ, σ²) has PDF: f(x) = (1/√(2πσ²))e^(–(x–µ)²/(2σ²)), defined for all x ∈ ℝ.

25
Q

What are the expectation and variance of a Normal(µ, σ²) distribution?

A

E[X] = µ and Var(X) = σ².

26
Q

How do you standardize a normal variable?

A

For X ~ N(µ, σ²), Z = (X – µ)/σ transforms X to the standard normal distribution, Z ~ N(0, 1).

27
Q

What is the Gamma distribution?

A

A Gamma random variable with parameters α (shape) and θ (scale) has PDF: f(x) = (1/(Γ(α)θ^α)) x^(α–1)e^(–x/θ) for x > 0.

28
Q

What are the expectation and variance of a Gamma(α, θ) distribution?

A

E[X] = αθ and Var(X) = αθ².

29
Q

What is the relationship between the Exponential and Gamma distributions?

A

The exponential distribution is a special case of the Gamma distribution with α = 1 and θ = 1/λ (when the exponential parameter is λ).

30
Q

What is the sum of independent Poisson random variables?

A

If X ~ Poisson(λ₁) and Y ~ Poisson(λ₂) are independent, then X + Y ~ Poisson(λ₁ + λ₂).

31
Q

What is the Central Limit Theorem (CLT)?

A

The CLT states that the sum (or average) of a large number of independent, identically distributed random variables (with finite mean and variance) approximates a normal distribution, regardless of the original distribution.

32
Q

What is the transformation rule for a continuous random variable?

A

If Y = g(X) is a one-to-one transformation with inverse g⁻¹, then the PDF of Y is given by f_Y(y) = f_X(g⁻¹(y)) · |d g⁻¹(y)/dy|.

33
Q

Why is the standard normal distribution important in statistics?

A

The standard normal distribution is the basis for many inferential techniques, such as hypothesis testing and confidence interval estimation, and it arises naturally via the CLT.

34
Q

What defines a continuous random variable, and what two properties must its probability density function (PDF) satisfy?

A

A continuous random variable X is defined by a nonnegative PDF f(x), where P(X ∈ B) = ∫_B f(x) dx for any subset B ⊆ ℝ. The PDF must satisfy: (1) f(x) ≥ 0 for all x, and (2) ∫₋∞∞ f(x) dx = 1.

35
Q

How do you find the constant C for the PDF f(x) = C(6x − x²) on (0, 6)?

A

Solve ∫₀⁶ C(6x − x²) dx = 1. Integration yields C[3x² − x³/3]₀⁶ = 36C = 1, so C = 1/36.

36
Q

For f(x) = (1/36)(6x − x²), what is the probability X > 3?

A

Compute P(X > 3) = ∫₃⁶ (1/36)(6x − x²) dx. Result: (1/36)[3x² − x³/3]₃⁶ = 1/2.

37
Q

Battery lifetime follows f(x) = 100/x² for x > 100. What's the probability a battery fails within 150 hours, and how is this used for 2 out of 5 batteries?

A

Probability of failure within 150 hours: ∫₁₀₀¹⁵⁰ 100/x² dx = 1/3. For exactly 2 of 5 failures, use the binomial formula: C(5, 2) · (1/3)² · (2/3)³ ≈ 0.329.

38
Q

How does scaling a continuous RV X to Y = 5X affect its PDF?

A

If X has PDF f_X(x), then Y = 5X has PDF f_Y(y) = (1/5) f_X(y/5). This follows from differentiating the CDF relation F_Y(y) = F_X(y/5).

39
Q

Calculate E[X] for f(x) = 3x² on [0, 1].

A

E[X] = ∫₀¹ x · 3x² dx = 3 ∫₀¹ x³ dx = 3 · 1/4 = 3/4.

40
Q

Derive the variance for f(x) = 3x² on [0, 1].

A

First compute E[X²] = ∫₀¹ x² · 3x² dx = 3/5. Then Var(X) = E[X²] − (E[X])² = 3/5 − (3/4)² = 3/80.

41
Q

If Var(X) = 3.5, what is Var(2X − 7)?

A

Variance scales with a²: Var(2X − 7) = 2² · Var(X) = 4 × 3.5 = 14. The constant −7 does not affect variance.

42
Q

Uniform RV X ~ Uniform(3, 8): What is P(X < 5)?

A

For a uniform (α, β) variable, P(a < X < b) = (b − a)/(β − α). Here, P(3 < X < 5) = (5 − 3)/(8 − 3) = 2/5.

43
Q

What are the expectation and variance formulas for a uniform RV on (α, β)?

A

Expectation: E[X] = (α + β)/2. Variance: Var(X) = (β − α)²/12. Example: For (3, 8), E[X] = 5.5 and Var(X) = 25/12.

44
Q

Passengers arrive uniformly between 7:00-7:30. What’s the probability of waiting less than 5 minutes for a bus?

A

Buses arrive every 15 minutes. Waiting less than 5 minutes occurs if the arrival falls in [10, 15) or [25, 30). Probability: 5/30 + 5/30 = 1/3.

45
Q

Why does E[X₁ + ⋯ + Xₙ] = E[X₁] + ⋯ + E[Xₙ], even for dependent variables?

A

Linearity of expectation holds regardless of dependence. Example: for 10 dice rolls, E[sum] = 10 × 3.5 = 35. Dependence affects variance, not expectation.

46
Q

How is the expected number of white balls selected in a hypergeometric experiment derived?

A

Use indicator variables: let Xᵢ = 1 if the i-th white ball is chosen. Then E[X] = Σᵢ₌₁ᵐ E[Xᵢ] = m · n/N, since P(Xᵢ = 1) = n/N.

47
Q

In the hat-matching problem, why is the expected number of matches 1?

A

Let Iᵢ = 1 if person i gets their own hat. E[Iᵢ] = 1/N, so E[X] = Σᵢ₌₁ᴺ 1/N = 1. Linearity simplifies complex dependencies.

48
Q

What does covariance measure, and how is it calculated?

A

Covariance measures joint variability: Cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]E[Y]. If X and Y are independent, Cov(X, Y) = 0.

49
Q

How is the variance of a sum of RVs affected by covariance?

A

Var(X₁ + ⋯ + Xₙ) = Σ Var(Xᵢ) + 2 Σ_{i<j} Cov(Xᵢ, Xⱼ). Under independence, the covariance terms vanish, leaving Σ Var(Xᵢ).

50
Q

If Y = a + bX, why is ρ(X, Y) = 1 for b > 0?

A

Correlation is ρ(X, Y) = Cov(X, Y) / √(Var(X) · Var(Y)). Here, Cov(X, Y) = b · Var(X) and Var(Y) = b² · Var(X), so ρ = b/|b| = 1.

51
Q

For 10 fair dice rolls, how is the total variance calculated?

A

Single die variance: Var(Xᵢ) = 35/12. Total variance: 10 × 35/12 = 175/6 ≈ 29.17. Independence ensures variances add directly.
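
A Python sketch computing the single-die variance directly from its PMF and checking that independent variances add:

```python
from statistics import fmean

# Variance of one fair die from its PMF: Var(X) = E[X^2] - (E[X])^2.
faces = [1, 2, 3, 4, 5, 6]
mean = fmean(faces)                                # E[X] = 3.5
var_one = fmean([x * x for x in faces]) - mean**2  # 91/6 - 49/4 = 35/12

# For 10 independent rolls, variances add directly.
var_total = 10 * var_one

assert abs(var_one - 35 / 12) < 1e-12
assert abs(var_total - 175 / 6) < 1e-12
```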