Probability Flashcards

1
Q

What does it mean for X_n to converge to X almost surely?

A

P(X_n -> X as n -> infinity) = 1

2
Q

What does it mean for X_n to converge to X in probability?

A

For all e>0, P(|X_n - X| < e) -> 1 as n -> infinity

3
Q

What does it mean for X_n to converge to X in distribution?

A

For every x such that F is continuous at x, F_n(x) -> F(x) as n -> infinity

4
Q

Rank convergence in probability, in distribution and almost surely in order from strongest to weakest

A
  • almost surely
  • in probability
  • in distribution
5
Q

Prove that almost sure convergence implies convergence in probability

A
  • Suppose X_n -> X a.s., and fix e>0
  • Define the event A_N = {|X_n - X| < e for all n >= N}
  • Almost sure convergence gives P(U_N A_N) = 1
  • The events A_N increase with N, so P(A_N) -> P(U_N A_N) = 1 as N -> infinity
  • A_N is contained in the event {|X_N - X| < e}
  • So P(|X_N - X| < e) -> 1, i.e. X_n -> X in probability
6
Q

Prove that convergence in probability implies convergence in distribution

A
  • Fix x such that F is continuous at x, and fix e>0
  • F_n(x) = P(X_n <= x)
    <= P(X <= x+e or |X_n - X| > e)
    <= P(X <= x+e) + P(|X_n - X| > e)
    -> F(x+e) as n -> infinity
  • So F_n(x) <= F(x+e) + e for large enough n
  • Similarly, working with 1 - F_n(x), F_n(x) >= F(x-e) - e for large enough n
  • Since F is continuous at x, letting e -> 0 gives F_n(x) -> F(x)
7
Q

Prove that if X_n -> c, constant, in distribution, then X_n -> c in probability

A
  • Fix e>0. The limit distribution is a point mass at c, so its CDF F is continuous at c-e and c+e/2, with F(c-e) = 0 and F(c+e/2) = 1
  • Hence lim n->∞ F_Xn(c-e) = 0 and lim n->∞ F_Xn(c+e/2) = 1

lim n->∞ P(|Xn - c| >= e)
= lim n->∞ P(Xn <= c-e or Xn >= c+e)
= lim n->∞ [P(Xn <= c-e) + P(Xn >= c+e)]   (disjoint events)
= 0 + lim n->∞ P(Xn >= c+e)
<= lim n->∞ P(Xn > c+e/2)
= lim n->∞ [1 - F_Xn(c+e/2)]
= 1 - 1 = 0

Since probabilities are non-negative, lim n->∞ P(|Xn - c| >= e) = 0, i.e. Xn -> c in probability.

8
Q

State the weak law of large numbers

A
  • X_1, X_2, … iid with finite mean µ
  • S_n = X_1 + X_2 + … + X_n
  • Then S_n / n converges in probability to µ as n -> infinity
9
Q

Prove the weak law of large numbers

A

Assume additionally Var(X_1) = σ^2 < ∞. Then

P(|S_n/n - µ| >= e)
<= Var(S_n/n) / e^2   (Chebyshev's inequality)
= σ^2 / (n e^2)   (since Var(S_n/n) = σ^2/n)
-> 0 as n -> infinity
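A quick empirical sketch of the WLLN (not part of the original card; X_i uniform on {1,...,6} with mean µ = 3.5 is an assumed example): the chance that the sample mean misses µ by at least e shrinks as n grows, as the Chebyshev bound predicts.

```python
import random

# Empirical sketch of the WLLN: X_i ~ Uniform{1,...,6}, mu = 3.5
# (an assumed example, not from the deck).
random.seed(0)
mu, e, trials = 3.5, 0.25, 2000

def fraction_outside(n):
    """Fraction of trials with |S_n/n - mu| >= e."""
    bad = 0
    for _ in range(trials):
        s = sum(random.randint(1, 6) for _ in range(n))
        if abs(s / n - mu) >= e:
            bad += 1
    return bad / trials

small_n, large_n = fraction_outside(10), fraction_outside(500)
print(small_n, large_n)  # the n = 500 fraction should be far smaller
```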

10
Q

State Markov’s inequality

A
  • X random variable taking non-negative values
  • Then for any z > 0, P(X >= z) <= E[X] / z
11
Q

Prove Markov’s inequality

A
  • Let X_z = z I{X ≥ z}.
  • So X_z takes the value 0 whenever
    X is in [0, z) and the value z whenever X is in [z, ∞).
  • So X ≥ X_z always
  • Then E[X] ≥ E[X_z] = z E[I{X ≥ z}] = z P(X ≥ z).
  • Rearrange
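A numerical sanity check of the inequality (a sketch; X ~ Exp(1), so E[X] = 1, is an assumed example): the empirical tail P(X >= z) should sit below the empirical bound E[X]/z.

```python
import random

# Sanity check of Markov's inequality with an assumed example X ~ Exp(1).
random.seed(2)
n, z = 100_000, 3.0
xs = [random.expovariate(1.0) for _ in range(n)]
tail = sum(1 for x in xs if x >= z) / n   # estimate of P(X >= z)
bound = (sum(xs) / n) / z                 # estimate of E[X] / z
print(tail, bound)  # the tail should not exceed the Markov bound
```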
12
Q

State Chebyshev’s inequality

A
  • Y random variable with finite mean and variance
  • Then for any e>0,
    P(|Y - E[Y]| >= e) <= var(Y) / e^2
13
Q

Prove Chebyshev’s inequality

A

P(|Y - E[Y]| >= e) = P((Y - E[Y])^2 >= e^2)
(by Markov’s) <= E[(Y - E[Y])^2] / e^2
= Var(Y) / e^2

14
Q

State the Strong Law of Large Numbers

A
  • X_1, X_2, … iid with finite mean µ
  • S_n = X_1 + X_2 + … + X_n
  • Then S_n / n converges almost surely to µ as n -> infinity
15
Q

What are the differences between the WLLN and the SLLN?

A
  • The Chebyshev proof of the WLLN uses finite variance; the SLLN needs only a finite mean
  • WLLN gives convergence in probability; SLLN gives the stronger convergence almost surely
16
Q

State the Central Limit Theorem

A
  • X_1, X_2, … iid with mean µ and variance σ^2 > 0
  • S_n = X_1 + X_2 + … + X_n
  • Then (S_n - nµ) / (σ√n) converges in distribution to N(0,1)
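A simulation sketch of the CLT (X_i ~ Uniform(0,1) is an assumed example, so µ = 1/2 and σ^2 = 1/12): the standardised sums should have sample mean near 0 and sample variance near 1.

```python
import random, math

# Sketch of the CLT with an assumed example X_i ~ Uniform(0,1).
random.seed(1)
n, reps = 400, 5000
mu, sigma = 0.5, math.sqrt(1 / 12)

zs = []
for _ in range(reps):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (sigma * math.sqrt(n)))  # standardised sum

z_mean = sum(zs) / reps
z_var = sum(z * z for z in zs) / reps - z_mean ** 2
print(round(z_mean, 2), round(z_var, 2))  # should be near 0 and 1
```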
17
Q

State the uniqueness theorem for PGFs

A

If X and Y have the same PGF, then they have the same distribution.

18
Q

State the convergence theorem for PGFs

A

G_Xn(s) → G_X(s) as n → ∞, for all s ∈ [0, 1], if and only if p_Xn(k) → p_X(k) as n → ∞, for all k = 0, 1, 2, …

19
Q

State the uniqueness theorem for MGFs

A

If X and Y are random variables with the same moment generating function,
which is finite on [−t0, t0] for some t0 > 0, then X and Y have the same distribution.

20
Q

State the continuity theorem for MGFs

A
  • Suppose M_Y and M_X1, M_X2, … are all finite on [−t0, t0] for some t0 > 0
  • If M_Xn(t) → M_Y(t) as n → ∞, for all t ∈ [−t0, t0],
  • then X_n → Y in distribution as n → ∞
21
Q

State the change of variables formula for changing from X,Y to U,V

A

  • Let D, R ⊂ R^2
  • Let T : D → R be a bijection with continuously differentiable inverse S : R → D
  • Let (X, Y) be a D-valued pair of random variables with joint density function f_X,Y
  • Then (U, V) = T(X, Y) is jointly continuous with joint density function

f_U,V(u, v) = f_X,Y(S(u, v)) |J(u, v)|, where J(u, v) = x_u y_v − x_v y_u is the Jacobian of the inverse map S

22
Q

If f_X,Y is the joint density of X and Y, what is the conditional distribution of X given Y?

A

f_X|Y(x | y) = f_X,Y(x, y) / f_Y(y)

23
Q

If X_1, … X_n are independent, with MGFs M_X1, … M_Xn, what is the mgf of X_1+…+X_n?

A

The product of the MGFs

24
Q

What is the pdf of the exponential distribution?

A

λe^(-λx), x >= 0

25
Q

What is the pdf of the gamma distribution?

A

λ^r / Γ(r) * x^(r-1) e^(-λx), x >= 0

26
Q

What is the pdf of the normal distribution?

A

(2πσ^2)^(-1/2) * exp[-(x − µ)^2 / (2σ^2)]

27
Q

What is the pmf of the Poisson distribution?

A

e^(-λ) λ^x / x!, x = 0, 1, …

28
Q

What is a transition matrix P?

A

A square matrix with all entries non-negative and each row summing to 1 (i.e. a stochastic matrix).

29
Q

How would you find the pdf of X+Y given their joint pdf f_X,Y ?

A
  • Change variables to U = X+Y, V = X
  • The inverse map (X, Y) = (V, U − V) has Jacobian of absolute value 1
  • So f_U,V(u, v) = f_X,Y(v, u − v)
  • Then integrate over v to get the marginal density of U = X+Y
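The steps above can be sketched numerically (an assumed example: X, Y iid Exp(1), whose sum has the Gamma(2,1) density u·e^(−u)):

```python
import math

# Numerical sketch of the marginalisation step for the assumed example
# X, Y iid Exp(1): integrating f_{X,Y}(v, u - v) over v should recover
# the Gamma(2,1) density u * exp(-u).
def f_joint(x, y):
    """Joint density of two independent Exp(1) variables."""
    return math.exp(-x - y) if x >= 0 and y >= 0 else 0.0

def f_sum(u, steps=10_000):
    """Trapezoid-rule approximation to the integral over v in [0, u]."""
    h = u / steps
    total = 0.5 * (f_joint(0.0, u) + f_joint(u, 0.0))
    total += sum(f_joint(k * h, u - k * h) for k in range(1, steps))
    return total * h

u = 2.0
print(f_sum(u), u * math.exp(-u))  # the two values should nearly agree
```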
30
Q

What is a Markov chain?

A

Let X = (X0, X1, X2, …) be a sequence of random variables taking values in I such that, for any n ≥ 0 and any i_0, i_1, …, i_{n+1} ∈ I,

P(X_{n+1} = i_{n+1} | X_n = i_n, …, X_0 = i_0) = P(X_{n+1} = i_{n+1} | X_n = i_n)

31
Q

What does it mean for a Markov chain to be time-homogeneous?

A

p_ij = P(Xn+1 = j | Xn = i)
depends only on i and j, not on n

32
Q

What are the transition probabilities of a Markov chain?

A

p_ij = P(Xn+1 = j | Xn = i)

33
Q

What is the Markov property?

A

The future is independent of the past, given the present

34
Q

What is the n-step transition probability of a Markov chain?

A

p_ij^(n) = P(X_{r+n} = j | X_r = i), for any r >= 0

35
Q

If λ is the distribution of X_0, what is the distribution of X_n?

A

λ P^n
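A minimal sketch of the computation λ P^n, using a hypothetical two-state transition matrix (illustrative numbers only, not from the deck):

```python
# Distribution of X_n computed as lambda P^n for a hypothetical
# two-state chain.
P = [[0.9, 0.1],
     [0.2, 0.8]]
lam = [1.0, 0.0]  # X_0 = state 0 with probability 1

def step(dist):
    """One application of P on the right: dist -> dist P."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

dist = lam
for _ in range(3):  # distribution of X_3 is lambda P^3
    dist = step(dist)
print(dist)  # a probability vector: entries sum to 1
```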

36
Q

What is a communicating class?

A

The equivalence classes of the relation “i communicates with j” (i.e. p_ij^(m) > 0 and p_ji^(n) > 0 for some m, n >= 0)

37
Q

What does it mean for a communicating class C to be closed?

A

p_ij = 0 whenever i is in C and j is not in C

38
Q

What does it mean for a communicating class to be open?

A

Not closed

39
Q

What is an absorbing state?

A

A state i with p_ii = 1 (equivalently, {i} is a closed class on its own)

40
Q

What does it mean for a Markov chain to be irreducible?

A

The whole state space forms a single communicating class, i.e. for all states i, j there is some n with p_ij^(n) > 0 (not that every entry of P is non-zero)

41
Q

What is the period of a state i?

A

The GCD of {n >= 1 : p_ii^(n) > 0}

42
Q

What does it mean for a state to be periodic?

A

period > 1

43
Q

What does it mean for a state to be aperiodic?

A

period = 1

44
Q

What is the hitting probability of A starting from state i?

A

h_i^A = P_i(X_n ∈ A for some n >= 0)

45
Q

What does it mean for a state to be transient?

A

P_i(X_n = i for infinitely many n) = 0

46
Q

What does it mean for a state to be recurrent?

A

P_i(X_n = i for infinitely many n) = 1

47
Q

What is the mean return time to a state i?

A

m_i := E_i[min{n >= 1 : X_n = i}]

48
Q

What does it mean for i to be null recurrent?

A

i is recurrent but m_i = ∞

49
Q

What does it mean for i to be positive recurrent?

A

i is recurrent and m_i < ∞

50
Q

What does it mean for π to be a stationary distribution of a Markov chain?

A

π is a probability distribution on the state space satisfying π = πP

51
Q

When does P have a stationary distribution?

A

For P irreducible: if and only if the chain is positive recurrent

52
Q

What is the stationary distribution?

A

π_i = 1/m_i (for an irreducible, positive recurrent chain)

53
Q

What is the convergence to equilibrium theorem?

A
  • Suppose P is irreducible and aperiodic, with stationary distribution π
  • Then P(Xn = j) → π_j as n → ∞, for any initial distribution
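A sketch of the theorem in action, for a hypothetical irreducible, aperiodic two-state chain (illustrative numbers, not from the deck): started deterministically in one state, the distribution of X_n approaches π.

```python
# Convergence to equilibrium for a hypothetical two-state chain.
P = [[0.7, 0.3],
     [0.4, 0.6]]
pi = [4 / 7, 3 / 7]  # solves pi = pi P for this matrix

def step(dist):
    """dist -> dist P."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

dist = [1.0, 0.0]  # start deterministically in state 0
for _ in range(50):
    dist = step(dist)
print(dist)  # should be very close to pi = [4/7, 3/7]
```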