Chapter 6: Convergence of Random Variables Flashcards

1
Q

Modes of convergence

On formula sheet

A

(X_n), X random variables.

1) X_n →d X, “convergence in distribution”:
for every x ∈ ℝ such that P(X = x) = 0, we have P(X_n ≤ x) → P(X ≤ x) as n → ∞.

2) X_n →P X, “convergence in probability”:
for any a > 0, lim_{n→∞} P[ |X_n − X| > a ] = 0.

3) X_n →a.s. X, “almost sure convergence”:
P[X_n → X as n → ∞] = 1,
i.e. P[ω ∈ Ω : X_n(ω) → X(ω) as n → ∞] = 1.

4) X_n →L^p X, “convergence in L^p”:
E[ |X_n − X|^p ] → 0 as n → ∞.

Modes 3 and 4 are stronger; modes 1 and 2 are weaker.
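(Not from the notes: a minimal Python/NumPy sketch of how one might check convergence in probability numerically. The sequence X_n = X + N(0, 1/n) is a made-up toy illustration, not the course example.)

import numpy as np

rng = np.random.default_rng(0)

def estimate_prob_exceeds(n, a, trials=100_000):
    # Monte Carlo estimate of P(|X_n - X| > a) for the toy sequence
    # X_n = X + N(0, 1/n), whose difference X_n - X is N(0, 1/n).
    diffs = rng.normal(0.0, 1.0 / np.sqrt(n), size=trials)
    return np.mean(np.abs(diffs) > a)

for n in [10, 100, 1000]:
    print(n, estimate_prob_exceeds(n, a=0.1))  # estimates shrink towards 0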

2
Q

Example 6.1.1

Let U ~ Unif[0,1] and set
X_n = n² 1_{U < 1/n} =
{ n² if U < 1/n,
{ 0 otherwise.

CONVERGENCE IN DISTRIBUTION

A

X_n takes only two possible values: n² (which is unlikely when n is large, since P(X_n = n²) = P(U < 1/n) = 1/n) or 0.

Therefore the candidate limit is X = 0.

1) Convergence in distribution. The limit CDF is
P(X ≤ x) =
{ 1 if x ≥ 0,
{ 0 if x < 0.

So consider two cases:
• If x < 0, then P(X_n ≤ x) = 0 = P(X ≤ x).
• If x ≥ 0, then P(X_n = 0) = 1 − (1/n) (since P(X_n = n²) = P(U < 1/n) = 1/n), so

1 − (1/n) = P(X_n = 0) ≤ P(X_n ≤ x) ≤ 1.

Letting n → ∞, the sandwich rule gives P(X_n ≤ x) → 1 = P(X ≤ x).

So in all cases P(X_n ≤ x) → P(X ≤ x),
hence X_n →d X.
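(Not from the notes: a short Python/NumPy simulation of this card’s example, estimating P(X_n ≤ x) for x = 0 to see it approach 1 = P(X ≤ x).)

import numpy as np

rng = np.random.default_rng(1)

def sample_Xn(n, trials=100_000):
    # X_n = n^2 * 1{U < 1/n} with U ~ Unif[0,1]
    U = rng.uniform(size=trials)
    return np.where(U < 1.0 / n, float(n) ** 2, 0.0)

for n in [10, 100, 1000]:
    print(n, np.mean(sample_Xn(n) <= 0.0))  # roughly 1 - 1/n, tending to 1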

3
Q

Example 6.1.1

Let U ~ Unif[0,1] and set
X_n = n² 1_{U < 1/n} =
{ n² if U < 1/n,
{ 0 otherwise.

CONVERGENCE IN PROBABILITY

A

2) Fix a > 0. Then 0 < a < n² eventually (i.e. for all large n), and for such n

P( |X_n − 0| > a )
= P( |X_n| > a )
≤ P( X_n = n² ) = 1/n
(by the uniform distribution, since P(U < 1/n) = 1/n).

So as n → ∞,
P( |X_n − 0| > a ) → 0, hence X_n →P 0.
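(Not from the notes: the same simulation idea applied to this card, estimating P(|X_n − 0| > a) for a = 1; the estimates should come out roughly 1/n.)

import numpy as np

rng = np.random.default_rng(2)

for n in [10, 100, 1000]:
    U = rng.uniform(size=100_000)
    Xn = np.where(U < 1.0 / n, float(n) ** 2, 0.0)
    print(n, np.mean(np.abs(Xn) > 1.0))  # roughly 1/n, tending to 0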

4
Q

Example 6.1.1

Let U ~ Unif[0,1] and set
X_n = n² 1_{U < 1/n} =
{ n² if U < 1/n,
{ 0 otherwise.

ALMOST SURE CONVERGENCE

A

If X_m = 0 for some m ∈ ℕ, then X_n = 0 for all n ≥ m.

(If X_m = 0 then U ≥ 1/m, which implies U ≥ 1/n for all n ≥ m,
and hence X_n = 0 for all n ≥ m.)

For a fixed sample point, the path starts at the value n², but as soon as n is large enough that U ≥ 1/n we have X_n = 0, and the path continues at 0 (once it hits 0 it stays there).

[Diagram: a sample path that grows like n², then hits 0 and stays at 0.]

So, for any m,
1 ≥ P( lim_{n→∞} X_n = 0 ) ≥ P( X_m = 0 ) = 1 − (1/m) → 1

(if the event on the right occurs, the event on the left certainly occurs).

So by the sandwich rule,
P( lim_{n→∞} X_n = 0 ) = 1, i.e. X_n →a.s. 0.
(Here lim X_n is a number, not a sequence.)
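(Not from the notes: a tiny Python sketch of a single sample path, showing that once the path hits 0 it stays there.)

import numpy as np

rng = np.random.default_rng(3)

# One sample point omega corresponds to one draw of U; the whole
# path X_1(omega), X_2(omega), ... is then determined by that U.
U = rng.uniform()
path = [n ** 2 if U < 1.0 / n else 0.0 for n in range(1, 51)]
print(path)  # once n >= 1/U the path is 0 forever (visible here unless U < 1/50)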

5
Q

Example 6.1.1

Let U ~ Unif[0,1] and set
X_n = n² 1_{U < 1/n} =
{ n² if U < 1/n,
{ 0 otherwise.

CONVERGENCE IN L^p

A

Take p = 1, i.e. consider convergence in L¹.

E(|X_n − 0|) = E(|X_n|)
= 0·P(|X_n| = 0) + n²·P(|X_n| = n²)
(but X_n ≥ 0 for all n, so)
= 0·P(X_n = 0) + n²·P(X_n = n²)
= n²·P(X_n = n²)
= n²·(1/n) = n ↛ 0
(does not converge to 0).

So X_n ↛L¹ X, i.e. (X_n) does not converge to 0 in L¹.
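(Not from the notes: a Monte Carlo check that E|X_n| ≈ n, so the L¹ distance to 0 grows rather than vanishing.)

import numpy as np

rng = np.random.default_rng(4)

for n in [10, 100, 1000]:
    U = rng.uniform(size=1_000_000)
    Xn = np.where(U < 1.0 / n, float(n) ** 2, 0.0)
    print(n, Xn.mean())  # close to n^2 * (1/n) = n, so E|X_n| does not tend to 0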

6
Q

LEMMA 6.1.2 relationships between types of convergence

A

Lemma 6.1.2. Let X_n, X be random variables.

1. If X_n →P X then X_n →d X.
2. If X_n →a.s. X then X_n →P X.
3. If X_n →L^p X then X_n →P X.
4. Let 1 ≤ p < q. If X_n →L^q X then X_n →L^p X.

In all other cases (i.e. those not automatically implied by the above), convergence in one mode does not imply convergence in another.

DIAGRAM (for q > p):

L^q → L^p → P → d
(a.s. → P)
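(Proof sketch for implication 3, not on the card but standard: by Markov’s inequality, for any a > 0,
P[ |X_n − X| > a ] = P[ |X_n − X|^p > a^p ] ≤ E[ |X_n − X|^p ] / a^p → 0,
so L^p convergence forces convergence in probability.)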

7
Q

NOTES: relationships between types of convergence

A

Remark 6.1.3. For convergence of real numbers, it was shown in MAS221 that if a_n → a and a_n → b then a = b, which is known as uniqueness of limits. For random variables, the situation is a little more complicated: if X_n →P X and X_n →P Y then X = Y almost surely. By Lemma 6.1.2, this result also applies to →L^p and →a.s. However, if we have only X_n →d X and X_n →d Y, then we can only conclude that X and Y have the same distribution function. Proving these facts is one of the challenge exercises, 6.9.

In short: exact uniqueness of limits can fail (we may have X ≠ Y with X_n →P X and X_n →P Y), but then P[X = Y] = 1; and X_n →d X together with X_n →d Y implies only that F_X = F_Y.

8
Q

Theorem 6.2.1 (Monotone Convergence Theorem)

A

Let (X_n) be a sequence of random variables and suppose that:

1. X_{n+1} ≥ X_n, almost surely, for all n.
2. X_n ≥ 0, almost surely, for all n.

Then there exists a random variable X such that X_n →a.s. X.
Further, E[X_n] → E[X].

(Note: it is possible that P(X = ∞) > 0 and E(X) = ∞; since the values are non-negative, we never run into “∞ − ∞”.)

9
Q

EXAMPLE:

A

Let (X_n) be a sequence of independent random variables, with distribution given by
X_i =
{ 2^{−i} with probability 1/2,
{ 0 with probability 1/2.

Let Y_n = Σ_{i=1}^n X_i.

Then Y_{n+1} = Y_n + X_{n+1} ≥ Y_n, so Y_{n+1} ≥ Y_n (1),
and Y_1 = X_1 ≥ 0, so Y_n ≥ Y_1 ≥ 0 (2),
for all n.

So (Y_n) is an increasing sequence, almost surely.

By the MCT, (1) and (2) give: there exists a random variable Y such that Y_n →a.s. Y and E(Y_n) → E(Y).

By linearity of E and geometric summation,
E(Y_n) = E( Σ_{i=1}^n X_i )
= Σ_{i=1}^n [ (1/2)·2^{−i} + (1/2)·0 ]
= Σ_{i=1}^n 2^{−(i+1)}
= (1/2)·(1/2 − (1/2)^{n+1})/(1 − 1/2)
= 1/2 − (1/2)^{n+1} → 1/2 as n → ∞,

so we deduce that E[Y] = 1/2.

We’ll investigate this example further in exercise 6.5. In fact, X_i corresponds to the ith digit in the binary expansion of Y.

[Diagram: the possible values of Y_1 (•), Y_2 (○) and Y_3 (◘) marked on the number line from 0 to 1.]

Define Z_i =
{ 1 if X_i = 2^{−i},
{ 0 if X_i = 0.
Then 0.Z_1Z_2… is the binary expansion of Y, so that Y has the Unif[0,1] distribution.
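(Not from the notes: a Python simulation of Y truncated at n = 30 digits, checking that E[Y] ≈ 1/2 and that Y looks Unif[0,1].)

import numpy as np

rng = np.random.default_rng(5)

# Y_n = sum_{i=1}^n X_i, where X_i = 2^{-i} with prob 1/2 and 0 otherwise.
# The digits Z_i = 2^i * X_i are iid Bernoulli(1/2), the binary digits of Y.
trials, n = 100_000, 30
Z = rng.integers(0, 2, size=(trials, n))
Y = (Z * 2.0 ** -np.arange(1, n + 1)).sum(axis=1)
print(Y.mean())            # close to 1/2 = E[Y]
print(np.mean(Y <= 0.25))  # close to 0.25, consistent with Unif[0,1]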
