Chapter 2: Probability Spaces And Random Variables Flashcards

1
Q

Sample space

A

Ω

Set containing every possible outcome

2
Q

Event

A

A collection of possible outcomes

I.e. a subset of Ω (a subset of the sample space)

3
Q

Definition 2.1.1

F, set of subsets

Sigma field

A

Let F be a set of subsets of Ω. We say F is a σ-field if:

1) ∅ ∈ F and Ω ∈ F
2) if A ∈ F then Ω\A ∈ F (closed under complements)
3) if A_1, A_2, … ∈ F then ∪_{i=1}^∞ A_i ∈ F (closed under countable unions)
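Not in the notes: a minimal Python sketch that checks these three conditions for a finite Ω (for a finite collection, closure under pairwise unions already gives closure under countable unions; the helper name is_sigma_field is my own).

```python
def is_sigma_field(omega, F):
    """Check Definition 2.1.1 for a finite sample space omega.

    F is a collection of frozensets.
    """
    omega = frozenset(omega)
    F = set(F)
    if frozenset() not in F or omega not in F:       # condition 1
        return False
    if any(omega - A not in F for A in F):           # condition 2: complements
        return False
    if any(A | B not in F for A in F for B in F):    # condition 3: unions
        return False
    return True

omega = {"H", "T"}
F = {frozenset(), frozenset({"H"}), frozenset({"T"}), frozenset(omega)}
print(is_sigma_field(omega, F))   # True
```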

4
Q

P(Ω)

A

P(Ω) denotes the power set of Ω, i.e. the set of all subsets of Ω. Taking F = P(Ω) always gives a σ-field.

5
Q

Sub-sigma-field

A

G ⊆ F, where G is itself a σ-field

Then G is a sub-σ-field of F

I.e. G carries a more restrictive set of information than F does

6
Q

Measurable subset

A

A subset A of Ω is measurable (a measurable event) if A ∈ F

I.e. A is F-measurable

I.e. A is an event

7
Q

Probability space

A

(Ω, F, P)

Sample space
Sigma field
Probability measure

8
Q

Empty set

A

Test nothing happens

9
Q

Examples of sigma fields for chosen experiments

Coin toss

A

One toss: H or T

Ω = {H,T}

F = {∅, {H}, {T}, Ω}

This is the set of all possible subsets of Ω. In order, the events are:

∅: a test that we get nothing (this has chance 0)
{H}: a test that we get a head
{T}: a test that we get a tail
Ω: a test that we get a head or a tail

10
Q

Examples of sigma fields for chosen experiments

Coin toss with 2 coins

A

Ω = {HH, HT, TH, TT}

Here F (the set of subsets) only needs to contain the events we are interested in.
Suppose that all we care about is whether we get two heads.

So define F as the set of subsets we care about.

For example, if we are interested in whether both coins are heads, we have
F = {∅, {HH}, Ω\{HH}, Ω}

I.e. the complement of {HH} is Ω\{HH}, "not HH",

and ∅ and Ω are included so that F is a σ-field.

In order, the events are:
∅: test nothing (chance 0)
{HH}: test we get HH
Ω\{HH}: test we get "not HH"
Ω: test we toss two coins (anything can happen)
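A tiny sketch (outcomes encoded here as the strings "HH", "HT", "TH", "TT", which is my own choice) that builds this four-event σ-field explicitly:

```python
# Build sigma({HH}) = {∅, {HH}, Ω\{HH}, Ω} by hand for the two-coin example.
omega = frozenset({"HH", "HT", "TH", "TT"})
A = frozenset({"HH"})                     # the event "both coins are heads"
F = {frozenset(), A, omega - A, omega}    # empty set, A, "not A", everything

for event in sorted(F, key=len):
    print(set(event) or "∅")
```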

11
Q

Probability measure

A

P is a function P: F → [0,1] such that

1) P[Ω] = 1
2) if A_1, A_2, … ∈ F are pairwise disjoint
(i.e. A_i ∩ A_j = ∅ for all i, j with i ≠ j)
then
P[ ∪_{i=1}^∞ A_i ] = Σ_{i=1}^∞ P[A_i] *

* the probability of the union is the probability that any one of the events happens

  • in particular, for two disjoint events A_1 and A_2 the probability of the union is the sum of the individual probabilities:
    P[A_1 ∪ A_2] = P[A_1] + P[A_2]
  • this last condition is called σ-additivity
    (the union in * is an element of F by Definition 2.1.1 of a σ-field)
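A small sketch of a probability measure on a finite sample space (the fair-coin weights are my own illustration); additivity over disjoint events is checked directly:

```python
weights = {"H": 0.5, "T": 0.5}     # point masses, summing to 1

def P(event):
    """P[A] = sum of the point masses of the outcomes in A."""
    return sum(weights[x] for x in event)

# P[Omega] = 1 and additivity over the disjoint events {H} and {T}
assert P({"H", "T"}) == 1.0
assert P({"H"}) + P({"T"}) == P({"H", "T"})
print(P({"H"}), P(set()), P({"H", "T"}))   # 0.5 0 1.0
```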
12
Q

Probability space

A

Triple (Ω, F, P)
Ω is the sample space,
F is a σ-field,
and P is a probability measure.

Examples of F:
* F = {∅, Ω}: this is a "no information" σ-field, as we only have Ω, the event that anything happens (probability 1), and ∅, the event that nothing happens (always probability 0)

* if we care about one event, e.g. whether the outcome is in a subset A, we use the σ-field
F = {∅, A, Ω\A, Ω}

These F are σ-fields: we can check the conditions of Definition 2.1.1

13
Q

Single coin toss probability measure

A

Ω = {H,T}

F = {∅, {H}, {T}, Ω}

Define P[{H}] = P[{T}] = 0.5 for a fair coin toss, i.e. the probability of the set containing H equals the probability of the set containing T.

Also define P[Ω] = 1 and P[∅] = 0.

(In general we may want to choose F to be smaller than the power set P(Ω).)

14
Q

Lemma 2.1.5 intersections of sigma fields

A

Let I be any set. For each i ∈ I let F_i be a σ-field. Then

F = ∩_{i∈I} F_i

is a σ-field.

I.e. F contains exactly the information that is common to all the F_i.

Proof: check the conditions of Definition 2.1.1.

1) Each F_i is a σ-field, so ∅ and Ω are elements of F_i for every i, and hence ∅, Ω ∈ ∩_i F_i.

2) If A ∈ F = ∩_i F_i then A ∈ F_i for each i. Since each F_i is a σ-field, Ω\A ∈ F_i for each i. Hence Ω\A ∈ ∩_i F_i.

3) If A_j ∈ F for all j then A_j ∈ F_i for all i and j. Since each F_i is a σ-field, ∪_j A_j ∈ F_i for every i. Hence ∪_j A_j ∈ ∩_i F_i.
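A finite-case illustration (not in the notes): intersecting the two dice σ-fields that appear in a later flashcard leaves only the "no information" σ-field {∅, Ω}, which is again a σ-field, as the lemma says.

```python
omega = frozenset(range(1, 7))

# "odd or even" and "<= 3 or > 3" sigma-fields on a dice roll
G1 = {frozenset(), frozenset({1, 3, 5}), frozenset({2, 4, 6}), omega}
G2 = {frozenset(), frozenset({1, 2, 3}), frozenset({4, 5, 6}), omega}

common = G1 & G2                        # intersection of the two collections
print([sorted(A) for A in common])      # only [] and [1, 2, 3, 4, 5, 6] survive
```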

15
Q

Corollary 2.1.6

Intersections of σ-fields

A

If F_1 and F_2 are σ-fields then so is F_1 ∩ F_2

(a special case of Lemma 2.1.5)

16
Q

Def 2.1.7

Events in sigma fields

A

Let E_1, E_2, … be subsets of Ω.
σ(E_1, E_2, …) is the smallest σ-field containing E_1, E_2, …

(called the σ-field generated by E_1, E_2, …)

17
Q

Define big_curly_F

How do we construct a σ-field without writing it down explicitly?

A

For a given sample space Ω, take finite or countably many subsets E_1, E_2, … of Ω (the events we are interested in).

Let big_curly_F be the set of all σ-fields on Ω that contain E_1, E_2, …
(This set is non-empty, since P(Ω) is one such σ-field.)

Enumerate
big_curly_F = {F_i : i ∈ I}

and by applying Lemma 2.1.5 obtain the σ-field

F = ∩_{i∈I} F_i,

which contains all the events we want.

(Each of these F_i contains the events we want, and maybe some others.)

This F is the smallest σ-field that has E_1, E_2, … as events:
F is contained inside any σ-field which has those events, because it is the intersection of all such σ-fields.

So, by the lemma, the smallest σ-field containing all the E_i's EXISTS.
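For a finite Ω the same construction can be carried out mechanically by closing under complements and unions. A rough sketch (the function name generate_sigma_field is my own):

```python
def generate_sigma_field(omega, *events):
    """Smallest sigma-field on a finite omega containing the given events.

    Repeatedly closes under complements and pairwise unions; for a finite
    omega this terminates and equals sigma(E_1, ..., E_k).
    """
    omega = frozenset(omega)
    F = {frozenset(), omega} | {frozenset(E) for E in events}
    changed = True
    while changed:
        changed = False
        for A in list(F):
            for B in list(F):
                for C in (omega - A, A | B):
                    if C not in F:
                        F.add(C)
                        changed = True
    return F

# sigma(A) = {∅, A, Ω\A, Ω}, matching the next flashcard
print(sorted(map(sorted, generate_sigma_field({1, 2, 3, 4, 5, 6}, {1, 3}))))
```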

18
Q

With Ω as any set and A a subset of Ω

A

{∅, A, Ω\A, Ω} is σ(A)

19
Q

If F_1, F_2 ,.. are sigma fields then

A

Write σ(F_1, F_2, …) for the smallest σ-field containing every event of every F_i, i.e. the smallest σ-field with respect to which all of those events are measurable.

20
Q

Properties of events in F

A

If A ∈ F then Ω\A ∈ F, and since Ω = A ∪ (Ω\A) is a disjoint union,
1 = P[Ω] = P[A ∪ (Ω\A)] = P[A] + P[Ω\A]

If A, B ∈ F and A ⊆ B then B = A ∪ (B\A) is a disjoint union,
which gives

P[B] = P[B\A] + P[A]
Since P[B\A] ≥ 0, this implies
P[A] ≤ P[B]

21
Q

Lemma 2.1.8

A

Let A_1, A_2, … ∈ F where F is a σ-field. Then ∩_{i=1}^∞ A_i ∈ F.

(Countable intersections stay in F.)

Proof: we can write ∩_{i=1}^∞ A_i = Ω \ ( ∪_{i=1}^∞ (Ω\A_i) ), and each piece on the right-hand side is in F by the σ-field axioms.

* In general, uncountable unions and intersections of measurable sets need not be measurable, and the lemma may not hold for uncountable families. This restriction keeps F from being too large: the bigger F is, the harder it is to define a probability measure on it.

22
Q

The empty set

A

The empty set ∅ is an element of the σ-field F and is the test that nothing happens.

23
Q

Set of all subsets of Ω

A

The set of all subsets of Ω is a sigma field

24
Q

What is the smallest σ-field containing the union of σ-fields F_1, F_2, …?

A

σ(F_1,F_2,…)

25
Q

Random variables

pre-image

A

X: Ω → R. For each outcome ω ∈ Ω,
X(ω) is a property of that outcome.

X^{-1}(A) = {ω ∈ Ω : X(ω) ∈ A}

is the PRE-IMAGE (not an inverse) of A under X,

used to find the set of outcomes ω ∈ Ω that X maps into a given set A.

We write X^{-1}((a,b)) for the pre-image of an interval (a,b).

26
Q

EXAMPLE

Ω={1,2,3,4,5,6}
and F=P(Ω).

Property X(ω)=
{0 if ω is odd
{1 if ω is even

ω ∈ Ω

A
X^{-1}({0}) = {1,3,5}
X^{-1}({1}) = {2,4,6}
X^{-1}({0,1}) = {1,2,3,4,5,6}
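A small sketch of computing these pre-images directly (the helper name preimage is my own):

```python
omega = range(1, 7)
X = lambda w: 0 if w % 2 == 1 else 1     # 0 if odd, 1 if even

def preimage(X, A, omega):
    """Return {ω ∈ Ω : X(ω) ∈ A}."""
    return {w for w in omega if X(w) in A}

print(preimage(X, {0}, omega))       # {1, 3, 5}
print(preimage(X, {1}, omega))       # {2, 4, 6}
print(preimage(X, {0, 1}, omega))    # {1, 2, 3, 4, 5, 6}
```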
27
Q

DEF 2.2.1:

Definition of a random variable/measurable function

A

Let G be a σ-field. A function X: Ω → R is G-measurable if for all subintervals I ⊆ R we have X^{-1}(I) ∈ G.

For a probability space (Ω, F, P) we say that X: Ω → R is a RANDOM VARIABLE if X is F-measurable.

28
Q

measurable notes:

A
  • X is measurable (with respect to the σ-field G), i.e. X ∈ mG
  • P[X ∈ A] = P[X^{-1}(A)] (this makes sense because X^{-1}(A) ∈ F)

* P[a < X < b] = P[X^{-1}((a,b))] = P[{ω ∈ Ω : X(ω) ∈ (a,b)}]

* P[X = a] = P[X^{-1}({a})] = P[{ω ∈ Ω : X(ω) = a}]

29
Q

EXAMPLE: toss a coin twice

A
Ω = {HH, HT, TH, TT}
If F = P(Ω), then every function X: Ω → R is F-measurable.

G = {∅, {HH}, {HT, TH, TT}, Ω}
(this is the σ-field carrying the information "did we get two heads?")

If we are interested in the number of tails:

X: Ω → R given by X(ω) = total number of tails thrown.
Then X is NOT G-measurable.

I.e. if all we knew was whether or not we got HH, we could not work out the exact number of tails.

30
Q

We need to be able to deduce the info from the given info

A

Take the pre-image of an interval, e.g. [0,1]: X^{-1}([0,1]) = {HH, HT, TH}, which is not an element of G, so X is not G-measurable.

31
Q

information and sigma-fields:

A

σ-field: G chooses which information we care about.
X is G-measurable, i.e. X ∈ mG: X depends only on the information in G (the information we care about).

Suppose our experiment produces an outcome ω, but we do not know which ω ∈ Ω it was. Each event "G" in G represents a piece of information: whether or not ω ∈ G (i.e. whether or not the event "G" occurred). If this information allows us to deduce the exact value of X(ω), and if this is the case for every ω ∈ Ω, then X is G-measurable.

32
Q

G and “G”

A

In the notes the σ-field G is written as a curly letter, similar to a script g (as is F).

In these flashcards the σ-field is written G and an event belonging to it is written "G".

33
Q

σ-field generated by random variables

A

X is a RV

σ(X) is the σ-field generated by X, containing the information given by X

34
Q

LEMMA 2.2.5 σ(X)

A

X is σ(X)-measurable

σ(X) is a σ-field, containing information given by X

35
Q

σ(X)

A

to construct:

include ∅ and Ω
look at preimages of X
look at complements
look at unions

36
Q

remember

A

operations on random variables produce random variables

37
Q

Lemma 2.2.2:
If X is a discrete RV:

suppose we have (Ω, F, P) and a RV X:Ω to R and we want to check that X is measurable wrt some smaller σ-field

A

Let G be a σ-field on Ω. Let X: Ω → R and suppose X takes values in {x_1, x_2, …} (a countable set).

Then

X is measurable with respect to G (X ∈ mG)

if and only if

for all j, {X = x_j} ∈ G
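A rough sketch of using Lemma 2.2.2 on a finite Ω (it anticipates the dice example a couple of flashcards below; the helper name is_measurable is my own):

```python
# A discrete X is G-measurable iff every level set {X = x_j} belongs to G.
def is_measurable(X, G, omega):
    """X: dict omega -> value; G: collection of frozensets; finite omega."""
    values = set(X[w] for w in omega)
    return all(frozenset(w for w in omega if X[w] == x) in G for x in values)

omega = range(1, 7)
X1 = {w: 0 if w % 2 else 1 for w in omega}         # parity of the roll
G1 = {frozenset(), frozenset({1, 3, 5}), frozenset({2, 4, 6}), frozenset(omega)}
G2 = {frozenset(), frozenset({1, 2, 3}), frozenset({4, 5, 6}), frozenset(omega)}

print(is_measurable(X1, G1, omega))   # True  : parity is visible in G1
print(is_measurable(X1, G2, omega))   # False : G2 only tells us if the roll <= 3
```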

38
Q

PROOF lemma 2.2.2:
countable set measurable wrt G

each element in G

A

(Sketch.) If X ∈ mG then, since {x_j} = [x_j, x_j] is a subinterval of R, we have {X = x_j} = X^{-1}([x_j, x_j]) ∈ G for every j.

Conversely, suppose {X = x_j} ∈ G for all j. For any subinterval I ⊆ R,
X^{-1}(I) = ∪_{j : x_j ∈ I} {X = x_j},
a countable union of elements of G, hence an element of G. So X ∈ mG.

39
Q

example: take Ω = {1,2,3,4,5,6}, rolling a dice
F = P(Ω)

consider
G_1 = {∅, {1,3,5}, {2,4,6}, Ω}

G_2 = {∅, {1,2,3}, {4,5,6}, Ω}

A

G_1 = {∅, {1,3,5}, {2,4,6}, Ω}
"tests whether the roll ω is even or odd"

G_2 = {∅, {1,2,3}, {4,5,6}, Ω}
"tests whether the roll ω is ≤ 3 or > 3"

Here G_1 contains the information of whether the roll is even or odd, etc. We can check that both are σ-fields.

DEFINE
X_1(ω) =
{0 if ω is odd
{1 if ω is even

X_2(ω) =
{0 if ω ≤ 3
{1 if ω > 3.

X_1 tests whether the roll is even;
X_2 tests whether the roll is bigger than 3.

We expect that X_1 is measurable wrt G_1 but not wrt G_2, etc.

  • X_1^{-1}({0}) = {1,3,5}
  • X_1^{-1}({1}) = {2,4,6}
    Both are in G_1 but not in G_2, so
    X_1 is measurable wrt G_1 but not wrt G_2.

Similarly
* X_2^{-1}({0}) = {1,2,3}
* X_2^{-1}({1}) = {4,5,6}
Both are in G_2 but not in G_1, so
X_2 is measurable wrt G_2 but not wrt G_1.
40
Q

Example: consider generated σ-field and smallest contained events

A

G_3 = σ({1,3}, {2}, {4}, {5}, {6}) has 32 elements, but

the information it gives cannot tell 1 from 3. X_1 does not need to:
X_1^{-1}({0}) = {1,3,5} = {1,3} ∪ {5};
as {1,3} is an element of G_3 and {5} is too, the union {1,3} ∪ {5} is also an element of G_3.

X_1^{-1}({1}) = {2,4,6} = {2} ∪ {4} ∪ {6};
each is an element of G_3, and so the union is an element of G_3,
since G_3 is a σ-field (generated as in Definition 2.1.7) and hence closed under countable unions.

We thus have, by Lemma 2.2.2:

X_1 is measurable wrt G_3, and X_2 is also.

41
Q

A σ-field is associated to…

A

A σ-field is associated to each function X: Ω → R, namely σ(X), the σ-field generated by X (see Definition 2.2.4).

42
Q

DEF 2.2.4 σ-field generated by X

A

The σ-field generated by X, denoted σ(X), is

σ(X) = σ( X^{-1}(I) : I is a subinterval of R )

43
Q

the smallest σ-field that contains events

F=P(Ω)

Ω = {1,2,3,4,5,6}

X(ω) =
{1 if ω is odd
{2 if ω is even

A
X^{-1}(1) = {1,3,5}
X^{-1}(2) = {2,4,6}

thus the smallest σ-field that contains these events is

σ(X) = {∅, {1,3,5}, {2,4,6}, Ω}

i.e. we don't need any more elements, as all unions and complements are already contained

44
Q

the smallest σ-field that contains events

F=P(Ω)

Ω = {1,2,3,4,5,6}

Y(ω) =
{1 if ω=1
{2 if ω=2
{3 if ω=3

A
Y^{-1}(1) = {1}
Y^{-1}(2) = {2}
Y^{-1}(3) = {3}

thus the smallest σ-field that contains these events is

σ(Y) = {∅,
{1}, {2}, {3}, {2,3,4,5,6}, {1,3,4,5,6}, {1,2,4,5,6},
{1,3}, {1,2}, {2,3}, {2,4,5,6}, {3,4,5,6}, {1,4,5,6},
{1,2,3}, {4,5,6},
Ω}

(16 elements)

taking unions and complements so as to include the 3 events, the empty set, the sample space, and all their complements and unions
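A quick sketch that double-checks the count above: closing {1}, {2}, {3} (together with ∅ and Ω) under complements and unions on Ω = {1,…,6} produces exactly 16 sets.

```python
omega = frozenset(range(1, 7))
F = {frozenset(), omega, frozenset({1}), frozenset({2}), frozenset({3})}
changed = True
while changed:
    changed = False
    for A in list(F):
        for B in list(F):
            for C in (omega - A, A | B):   # complement and pairwise union
                if C not in F:
                    F.add(C)
                    changed = True

print(len(F))                              # 16, matching sigma(Y) above
```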

45
Q

LEMMA 2.2.5: σ(X) measurable

A

Let X:Ω to R. Then X is σ(X) measurable

46
Q

Let X:Ω to R. Then X is σ(X) measurable

PROOF:

A

By Definition 2.2.4, σ(X) contains X^{-1}(I) for every subinterval I ⊆ R. So for every subinterval I we have X^{-1}(I) ∈ σ(X), which is exactly what Definition 2.2.1 requires for X to be σ(X)-measurable.

47
Q

Define σ(X_1, X_2,…)

A

σ(X_1, X_2, …) = σ( X_i^{-1}(I) : i ∈ N, I is a subinterval of R )

this is the σ-field containing the information given by X_1, X_2, …

48
Q

2.2.4 gives us:

When considering a collection of RVs it is often better to work with a sub-σ-field G ⊆ F.

Let α ∈ R and let X, Y, and X_1, X_2, … be G-measurable functions Ω → R

A

Then

α, αX, X+Y, XY, 1/X

are all G-measurable.

Further, if X_∞ given by
X_∞ (ω) = limit as n tends to infinity of X_n(ω)
exists for all ω,
then X_∞ is G-measurable

49
Q

EXAMPLE: if X∈mG then

A

Then (X^2 + X)/2 ∈ mG.

If Y = e^X then Y ∈ mG,

because Y(ω) = Σ_{n=0}^∞ X(ω)^n / n!

We know this limit exists since e^x is defined for all real x. Consider the partial sums
Y_N(ω) = Σ_{n=0}^N X(ω)^n / n! ∈ mG by (1), and Y(ω) = lim_{N→∞} Y_N(ω) exists, so Y ∈ mG.

IF X ∈ mG and g: R → R is any SENSIBLE FUNCTION then Y = g(X) ∈ mG,

i.e. all powers, trig functions, e^x, polynomials, monotone functions, and all piecewise linear functions.

50
Q

RECALL FOR INDEPENDENT EVENTS

A

Events E_1, E_2

are independent if P( E_1 ∩ E_2) = P(E_1) P(E_2)

i.e. the chance that E_1 and E_2 both happen is
the chance of E_1 × the chance of E_2

51
Q

2.2.7 INDEPENDENCE

we define independence of events using sigma fields

A

Sub-σ-fields G_1, G_2 of F are independent if

(notation: F is curly F, G is curly G; "G" denotes an element of G)

P("G"_1 ∩ "G"_2) = P("G"_1) P("G"_2) for all "G"_1 ∈ G_1 and "G"_2 ∈ G_2

Events E_1 and E_2 are independent if

σ(E_1) and σ(E_2) are independent.

Random variables X_1 and X_2 are independent if σ(X_1) and σ(X_2) are independent.

52
Q

2.3 INFINITE SAMPLE SPACE

for probability space

(Ω, F, P)

A

Recall: if Ω = {x_1, x_2, …, x_n} is finite, we normally take F = P(Ω).
Since F contains every subset of Ω,
any σ-field on Ω is a sub-σ-field of F. We have seen how it is possible to construct other σ-fields on Ω too.

In this case we can define a probability measure on Ω
by choosing some sequence (a_1, a_2, …, a_n) such that a_i ∈ [0,1] and Σ_{i=1}^n a_i = 1,
and setting P[{x_i}] = a_i.

This naturally extends to defining P[A] for any subset A ⊆ Ω, by setting more generally

P[A] = Σ_{i=1}^n P[{x_i}] · 1{x_i ∈ A}

If Ω = {x_1, x_2, …} is countable we can replace n with infinity: the finite sum becomes an infinite series, which is bounded above by 1.
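A small sketch (fair-die weights chosen purely for illustration) of defining P from point masses a_i on a finite Ω:

```python
from fractions import Fraction

omega = [1, 2, 3, 4, 5, 6]
a = {x: Fraction(1, 6) for x in omega}   # the a_i sum to 1

def P(A):
    """P[A] = sum of a_i over the x_i in A."""
    return sum(a[x] for x in A)

print(P({2, 4, 6}))      # 1/2, the probability the roll is even
print(P(set(omega)))     # 1
print(P(set()))          # 0
```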

53
Q

EXAMPLE:

toss a coin countably many times. Outcome is ω= ω_1ω_2…. = (ω_1, ω_2,…)

where each ω_i is in {H,T}
the set of ω is uncountable :
Ω= {H,T}^N

Define X_n(ω) = ω_n = result of the nth toss

σ(X_1)
σ(X_1, X_2)

A
Define X_n(ω) = ω_n = result of the nth toss,
X_n: Ω → {H,T}.
({H,T} is not a subset of R, but we can encode H and T as, say, 1 and 0.)

We want F = σ(X_1, X_2, …)
= σ( X_i^{-1}(I) : i ∈ N, I is a subinterval of R ),

which contains all the information generated by the coin tosses.

NOTE
σ(X_1) = {∅, {H…}, {T…}, Ω}

i.e. the events "first toss is H, followed by anything" and "first toss is T, followed by anything".

σ(X_1, X_2) = σ( {HH…}, {HT…}, {TH…}, {TT…} )
= {∅, Ω,
{HH…}, {HT…}, {TH…}, {TT…},
{H…}, {T…}, {HH…}∪{TH…}, {HT…}∪{TT…},
{HH…, TT…}, {HT…, TH…},
{HH…}^c, {HT…}^c, {TH…}^c, {TT…}^c}

i.e. all unions and complements of the events determined by the first two tosses
(here {H…} means "first toss H", {HH…}∪{TH…} means "second toss H", and so on).

If two outcomes ω have the same 1st and 2nd tosses, they fall into the same subsets of σ(X_1, X_2).

A random variable that depends on anything more than the 1st and 2nd tosses will not be σ(X_1, X_2)-measurable.

54
Q

HOW TO DEFINE P?

A

When Ω is infinite we will be given a probability measure P together with its properties, rather than constructing P explicitly.

55
Q

EXAMPLE HERE GIVEN: P: F to [0,1]

1) X_n are indep RVs
2) P(X_n = H) = P(X_n = T) = 0.5 for all n in N

A

From these properties we can work with P without it ever being explicitly constructed. We don't need to know exactly which subsets of Ω are in F, since 2.2.6 allows us to work with measurable functions of the X_n.

If we try to take F = P(Ω), there is no probability measure P satisfying both 1) and 2).

56
Q

2.3.2 Ω = {H,T}^N, a sequence of fair independent coin tosses

P(X_i = T) = P(X_i = H) = 0.5 for all i

P(X_1 = ω_1, X_2 = ω_2, X_3 = ω_3,….)

P[X_1 =H]
P[ eventually throw a head]
P[never throw a head ]

A

P(X_1 = ω_1, X_2 = ω_2, X_3 = ω_3, …)
= P(X_1 = ω_1) P(X_2 = ω_2) P(X_3 = ω_3) …
= 0.5 × 0.5 × 0.5 × … = 0

so any single specified sequence of outcomes has probability 0, e.g. the probability that we never throw a head.

So

P[X_1 = H] = P[{ω ∈ Ω : ω_1 = H}] = 0.5

P[eventually throw a head] = P[for some n, X_n = H] = 1 − P[for all n, X_n = T] = 1 − 0 = 1.
This event has probability 1 but is not equal to the whole sample space.

P[never throw a head] = P[for all n, X_n = T] = 0.5 × 0.5 × … = 0
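A finite simulation can only look at the first n tosses, but it illustrates the same point: the chance of seeing no head among n fair tosses is 0.5^n, which vanishes as n grows. A rough Monte Carlo sketch:

```python
import random

random.seed(0)

def no_head_in(n, trials=100_000):
    """Estimated probability that the first n fair tosses are all tails."""
    return sum(all(random.random() < 0.5 for _ in range(n)) for _ in range(trials)) / trials

for n in (1, 5, 10, 20):
    print(n, no_head_in(n), 0.5 ** n)   # estimate vs exact 0.5**n
```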

57
Q

DEF 2.3.2

ALMOST SURELY

A

If the event E has P(E) = 1 we say E occurs ALMOST SURELY.

E.g. the coin will eventually throw a head, almost surely.

Similarly, "Y ≤ 1 almost surely" means P[Y ≤ 1] = 1.

58
Q

DEFINE

The proportions of heads and tails in the first n tosses X_1, …, X_n are q_n^H and q_n^T

A

q_n^H = (1/n) Σ_{i=1}^n 1{X_i = H}

q_n^T = (1/n) Σ_{i=1}^n 1{X_i = T}

q_n^H + q_n^T = 1

The random variables 1{X_i = H} are i.i.d. with
E[1{X_i = H}] = 0.5,

hence by Theorem 1.1.1 (the strong law of large numbers) we have
P[q_n^H → 0.5 as n → ∞] = 1
and likewise
P[q_n^T → 0.5 as n → ∞] = 1.

The event that, in the limit, half the tosses are heads and half are tails,
E = {lim_{n→∞} q_n^H = 0.5 and lim_{n→∞} q_n^T = 0.5},
occurs almost surely,

since P[q_n^H → 0.5 and q_n^T → 0.5] = 1.
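A quick Monte Carlo sketch of the running proportion q_n^H in one simulated sequence of fair tosses, drifting towards 0.5 as n grows:

```python
import random

random.seed(1)
heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5           # 1 if this toss is a head
    if n in (10, 100, 1_000, 10_000, 100_000):
        print(n, heads / n)                  # q_n^H for increasing n
```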

59
Q

EXPECTATION

A

if X is a continuous RV: E[X] = ∫_{-∞}^{∞} x f_X(x) dx   (f_X is the pdf of X)

if X is a discrete RV: E[X] = Σ_{x ∈ R_X} x P[X = x]
(R_X is the range of X)

60
Q

WHEN X IS AN F-MEASURABLE FUNCTION FROM Ω TO R, RVs MIGHT NOT BE DISCRETE OR CONTINUOUS.

A

we use Lebesgue integration to define E(X) in this case

61
Q

E[X] defined:

A

E[X] is defined for all X such that either:

1) X ≥ 0, in which case it is possible that E[X] = ∞
2) X is general but E[|X|] < ∞
* so we never get E[X] = −∞
* we use the modulus to avoid "∞ minus ∞"

*** if X ≥ 0 and P[X = ∞] > 0, this implies E[X] = ∞
(the chance of X being infinite outweighs all of the finite possibilities and makes E[X] infinite)

62
Q

PROPOSITION 2.4.1

Expectations for RVs X and Y

A

For random variables X, Y such that E[X] and E[Y] exist:

LINEARITY
if a, b ∈ R then E[aX + bY] = aE[X] + bE[Y]

INDEPENDENCE
if X, Y are independent then E[XY] = E[X]E[Y]

ABSOLUTE VALUES
|E[X]| ≤ E[|X|]

MONOTONICITY
if X ≤ Y then E[X] ≤ E[Y]
63
Q

indicator function for events

A

for event A in F,
1_A: Ω to R

1_A(ω) =
{1 if ω is in A
{0 if ω isn’t in A

E[1_A] = 1·P(A) + 0·P(A^c)
= P(A)

  • sometimes write 1_{event} = 1{event}
64
Q

LEMMA 2.4.2

indicator function for events

A

Let A ∈ G
(remember G is a curly G, similar to g)

Then the function 1_A is G-measurable.

Proof:
The range of 1_A is {0,1}, a countable set.
The pre-images are:
1_A^{-1}({0}) = Ω\A ∈ G
1_A^{-1}({1}) = A ∈ G
so by Lemma 2.2.2, 1_A is G-measurable.
65
Q

CONDITIONING

A

Breaking a RV into cases.

For example, given any random variable X we write:
X = X·1{X ≥ 1} + X·1{X < 1}

(for each ω, only one of the RHS terms can be non-zero)

66
Q

For example, given any random variable X we write:
X = X·1{X ≥ 1} + X·1{X < 1}

(for each ω, only one of the RHS terms can be non-zero)

We use this to prove an inequality:

A

Putting |X| in place of X and then taking expectations gives:

E[|X|] = E[|X| 1{|X|≥1}] + E[|X| 1{|X|<1}]
≤ E[X^2 1{|X|≥1}] + 1
≤ E[X^2] + 1

(to bound the second term we can only use |x| ≤ x^2 when |x| ≥ 1, which is why we split into cases)

i.e. we want to prove
E[|X|] ≤ E[X^2] + 1

In detail: |X| = |X| 1{|X|≥1} + |X| 1{|X|<1}

BY MONOTONICITY:

FIRST TERM
|X| 1{|X|≥1} ≤ X^2 1{|X|≥1}
since when |X| ≥ 1 we have |X| ≤ |X|^2 = X^2

SECOND TERM
|X| 1{|X|<1} ≤ 1{|X|<1}
since when |X| < 1 we have |X| ≤ 1

Hence
|X| = |X| 1{|X|≥1} + |X| 1{|X|<1}
≤ X^2 · 1{|X|≥1} + 1 · 1{|X|<1}
≤ X^2 + 1

By monotonicity (and linearity) of expectation:
E[|X|] ≤ E[1 + X^2] = 1 + E[X^2]
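A quick numerical sanity check of the inequality (the standard normal distribution is an arbitrary choice for illustration):

```python
import random

random.seed(2)
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]

mean_abs = sum(abs(x) for x in samples) / len(samples)   # estimates E[|X|]
mean_sq = sum(x * x for x in samples) / len(samples)     # estimates E[X^2]

print(mean_abs, mean_sq + 1.0)   # left side is about 0.80, right side about 2.0
```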

67
Q

DEF 2.4.3 L^p

A

Let p∈ [1, infinity)
we say that X ∈ L^p
if E[|X|^p] is less than infinity
we usually care about cases p = 1 or p = 2

68
Q

L^1 and L^2

A

* By definition, L^1 is the set of random variables such that E[|X|] < ∞, i.e. the mean is finite.

* L^2 is the set of random variables such that E[|X|^2] < ∞; equivalently, X has finite variance,
Var(X) = E[X^2] − E[X]^2.

* From (2.7), if X ∈ L^2 then X ∈ L^1.
69
Q

DEF 2.4.4

bounded

A

We say that a random variable X is bounded if there exists a deterministic constant c ∈ R such that |X| ≤ c.

If X is bounded then, using monotonicity:
E[|X|^p] ≤ E[c^p] = c^p < ∞,

meaning that X ∈ L^p for all p.