Module 9 Vocab Flashcards
Random Variable (textbook def)
9.1
a random variable on (Ω, Pr) is a function
X: Ω → ℝ
Random Variable (Alex def)
9.1
a function that maps each outcome in the sample space to a real number
The set of values taken by X
9.1
Val(X) = {x ∈ ℝ | ∃w ∈ Ω, X(w) = x}
Another way to say “the set of values taken by X”
9.1
the values returned by X
X = x
9.1
the set of all outcomes that are mapped to the real number “x”
Pr[X = x]
9.1
the probability of the outcomes that are mapped to “x”
probability of event X = x
Distribution of r.v. X
9.1
f: Val(X) → [0,1] where f(x) = Pr[X = x]
Ex in green die space: f(1) = Pr[X=1] = 1/6
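A minimal sketch of this card in Python, using exact fractions; the names `omega`, `X`, and `f` are mine, and the green-die space is the 1–6 example from the card:

```python
from fractions import Fraction

# Sample space of one fair (green) die: outcomes 1..6, each with probability 1/6.
omega = {w: Fraction(1, 6) for w in range(1, 7)}

def X(w):
    return w  # the r.v. maps each outcome to the number shown

# Distribution f: Val(X) -> [0, 1], f(x) = Pr[X = x].
f = {}
for w, pr in omega.items():
    f[X(w)] = f.get(X(w), Fraction(0)) + pr

print(f[1])             # Pr[X = 1] = 1/6
print(sum(f.values()))  # the probabilities add up to 1
```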
Probabilities add up to 1 (symbols)
9.1
Σ_{x∈Val(X)} Pr[X = x] = 1
Ex: coin-flip space: Pr[X=H] + Pr[X=T] = 1/2 + 1/2 = 1
Probabilities add up to 1 (words)
9.1
“if you sum the probabilities of the events X = x, over all the x in the values taken by the r.v. X, the answer is 1”
Events (X=x) for x∈Val(X) are…
9.1
pairwise disjoint
∪_{x∈Val(X)} (X = x) =
9.1
Ω
Σ_{x∈Val(X)} Pr[X = x] =
9.1
Pr[Ω] = 1
Σ_{x∈Val(X)} Pr[X = x] = 1
9.1
“if you sum the probabilities of the events X = x, over all the x in the values taken by the r.v. X, the answer is 1”
Uniform Distribution
9.1
f: {v_1,…,v_n} → [0,1], f(v_i) = 1/n for i = 1,…,n
Given (Ω, Pr), a r.v. U: Ω → ℝ is uniform with these values (v_1,…,v_n) when:
9.1
Val(U) = {v_1,…,v_n}
and Pr[U = v_i] = 1/n for i = 1,…,n
Val(U) = {v_1,…,v_n}
and Pr[U = v_i] = 1/n for i = 1,…,n
9.1
uniform r.v.
f: {v_1,…,v_n} → [0,1], f(v_i) = 1/n for i = 1,…,n
9.1
uniform distribution
Bernoulli r.v. with parameter p
9.1
Given (Ω, Pr), an r.v. X: Ω → ℝ
with Val(X) = {0, 1}
and Pr[X = 1] = p
Val(X) = {0, 1}
and Pr[X = 1] = p
9.1
Bernoulli r.v. with parameter p
Distribution of Bernoulli r.v.
9.1
f: {0,1} → [0, 1]
f(1) = p f(0) = 1 - p
f: {0,1} → [0, 1] f(1) = p f(0) = 1 - p
9.1
Bernoulli distribution with parameter p
How many ways can 3 dice sum up to 5?
9.1
Stars and bars: C((5-3) + 3 - 1, 3 - 1) = C(4, 2) = 6
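The card's count can be double-checked by brute force; this sketch compares the stars-and-bars formula with direct enumeration (the names `count` and `formula` are mine):

```python
from itertools import product
from math import comb

# Brute force: ordered triples (d1, d2, d3), each die showing 1..6, summing to 5.
count = sum(1 for dice in product(range(1, 7), repeat=3) if sum(dice) == 5)

# Stars and bars: give each of the 3 dice its mandatory 1, then distribute
# the remaining 5 - 3 = 2 pips freely: C((5-3) + 3 - 1, 3 - 1) = C(4, 2).
formula = comb((5 - 3) + 3 - 1, 3 - 1)

print(count, formula)  # 6 6
```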
Stars and bars with restriction of each die
9.1
From “n” subtract however many units are used to meet the condition.
Ex. if there are 5 dice (each showing at least 1), then n - 5
Ex. if there are 10 dice, then n - 10
Expectation/Expected Value
9.2
average (mean) value returned by a r.v.
E[X]
9.2
the expectation of a r.v. X
Expected Value of X (symbols)
9.2
E[X]
E[X] way 1
9.2
Σ_{x∈Val(X)} x ⋅ Pr[X = x]
value returned times its weight
E[X] way 2
9.2
Σ_{w∈Ω} X(w) ⋅ Pr[w]
value mapped to by outcome times probability of the outcome
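The two ways of computing E[X] can be checked against each other on the fair-die space; a sketch with names of my choosing (`e_values`, `e_outcomes`):

```python
from fractions import Fraction

# Fair-die space: outcomes 1..6, Pr[w] = 1/6; X(w) = w.
omega = {w: Fraction(1, 6) for w in range(1, 7)}

def X(w):
    return w

# Way 2: sum over outcomes of X(w) * Pr[w].
e_outcomes = sum(X(w) * pr for w, pr in omega.items())

# Way 1: sum over values of x * Pr[X = x].
dist = {}
for w, pr in omega.items():
    dist[X(w)] = dist.get(X(w), Fraction(0)) + pr
e_values = sum(x * pr for x, pr in dist.items())

print(e_values, e_outcomes)  # 7/2 7/2
```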
E[D]
9.2
3.5
remember: D is the number shown by a fair die
Expectation of a uniform r.v.
9.2
that takes the values v_1,…,v_n:
(v_1 + … + v_n) / n
Expectation of the Bernoulli r.v.
9.2
recall Val(X) = {0, 1} and Pr[X=1] = p and Pr[X=0] = 1 - p
E[X] = 1 ⋅ Pr[X=1] + 0 ⋅ Pr[X=0]
= 1⋅ p + 0 ⋅ (1 - p)
= p
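The derivation on this card, checked numerically for one arbitrary choice of parameter (p = 1/3 is mine, any p in [0, 1] works):

```python
from fractions import Fraction

# Bernoulli distribution with parameter p; p = 1/3 is an arbitrary choice.
p = Fraction(1, 3)
dist = {1: p, 0: 1 - p}

# E[X] = 1 * Pr[X=1] + 0 * Pr[X=0] = p
e = sum(x * pr for x, pr in dist.items())
print(e)  # 1/3
```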
p
9.2
Expectation of the Bernoulli r.v.
E[C] = c
9.2
expectation of a constant r.v. proposition
Expectation of a constant r.v.
9.2
consider the r.v. C: Ω → ℝ such that for all outcomes w∈Ω we have C(w) = c
E[C] = c
Proof of equivalence of the two defs of expectation biconditional statement
9.2
w ∈ (X = x) iff X(w) = x
What is X=x?
an event
How to use uniform r.v.
state “n” and “v_i = i for i = #,…,#”
remember Pr[X =v_i] = 1/n for i=1,…,n
How to use Bernoulli r.v.
state 2 outcomes and give p
Anagrams formula
where a_i is the number of times letter b_i appears:
(a_1 + … + a_n)! / (a_1! × … × a_n!)
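The anagram formula can be verified by brute force on a small word; "BANANA" is my example choice, not from the course:

```python
from collections import Counter
from itertools import permutations
from math import factorial

word = "BANANA"  # example word of my choosing: A appears 3 times, N twice, B once
counts = Counter(word)

# (a_1 + ... + a_n)! / (a_1! * ... * a_n!)
denom = 1
for a in counts.values():
    denom *= factorial(a)
formula = factorial(len(word)) // denom

# Brute force: number of distinct orderings of the letters.
brute = len(set(permutations(word)))

print(formula, brute)  # 60 60
```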
Sum of two r.v.’s
(X+Y)(w) = X(w) + Y(w)
add the value that X maps the outcome to and the value that Y maps the outcome to
Example of sum of two r.v.’s
for an outcome w where G(w) = 2 and P(w) = 5: S(w) = G(w) + P(w) = 2 + 5 = 7
Scalar multiplication of a r.v.
(cX)(w) = c X(w)
Tricky thing to remember about LOE +
for something like A - B you can do A + (-1)B
Linearity of Expectation
only for r.v.’s on the same space
version 1: E[c_1X_1 +…+ c_nX_n] = c_1E[X_1] +…+ c_nE[X_n]
version 2: E[X1 +…+ Xn] = E[X1] +…+ E[Xn]
LOE example rolling fair die independently r times
E[W] = E[D1] +…+ E[Dr]
each roll, on its own, has the same distribution as a single fair-die roll, so E[D_i] = E[D] = 3.5 for i = 1,…,r (linearity of expectation does not even need independence)
Thus E[W] = 3.5r
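The LOE answer can be confirmed by computing E[W] directly over all outcomes; r = 3 is my arbitrary small choice so the brute force stays cheap:

```python
from fractions import Fraction
from itertools import product

r = 3  # number of fair-die rolls (small, arbitrary choice)

# E[W] computed directly: each of the 6^r outcomes is equally likely,
# and W maps an outcome to the sum of its rolls.
e_w = sum(Fraction(sum(rolls), 6**r) for rolls in product(range(1, 7), repeat=r))

print(e_w)  # 21/2, i.e. 3.5 * r
```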
What do you need before you use an indicator r.v.?
an event!
Indicator r.v.
(3 things)
let I_A be the indicator r.v. for event A
I_A(w) = 1 if w ∈ A and 0 if w ∉ A
IT’S BERNOULLI (with parameter p = Pr[A])
E[indicator r.v. of event A] =
Pr[I_A = 1] = Pr[A]
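A sketch of E[I_A] = Pr[A] on the fair-die space; the event A = "roll is even" is my example choice:

```python
from fractions import Fraction

# Fair-die space; A = "the roll is even" (an arbitrary example event).
omega = {w: Fraction(1, 6) for w in range(1, 7)}
A = {2, 4, 6}

def I_A(w):
    return 1 if w in A else 0  # indicator r.v. of A

e_indicator = sum(I_A(w) * pr for w, pr in omega.items())
pr_A = sum(omega[w] for w in A)

print(e_indicator, pr_A)  # 1/2 1/2
```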
An outcome with k H’s has probability
p^k ⋅ q^(n−k)
where q = 1 - p
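Since the flips are independent, the probability of one specific sequence is the product of per-flip probabilities; this sketch checks the p^k ⋅ q^(n−k) formula for one sequence (the bias p = 2/5 and the sequence "HHTT" are my arbitrary choices):

```python
from fractions import Fraction

p = Fraction(2, 5)  # arbitrary heads probability
q = 1 - p

# One specific sequence of n = 4 independent flips with k = 2 heads.
seq = "HHTT"
pr_seq = Fraction(1)
for flip in seq:
    pr_seq *= p if flip == "H" else q

n, k = len(seq), seq.count("H")
print(pr_seq, p**k * q**(n - k))  # 36/625 36/625
```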
The expectation of the indicator r.v. is equal to
the probability of its event
In order to apply the same Pr[H_i] to all the H_i’s, what do we need to know?
the flips/tosses/blahs are I-N-D-E-P-E-N-D-E-N-T
On average means we need which concept?
Expectation!
Remember if events are independent what can we do to their probabilities?
multiply them!
Pr[success followed by failure] =
p(1-p)
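A final sketch: enumerating the four outcomes of two independent trials shows that "success followed by failure" has probability p(1−p); p = 1/4 is my arbitrary choice:

```python
from fractions import Fraction
from itertools import product

p = Fraction(1, 4)  # arbitrary success probability

# Two independent trials (1 = success, 0 = failure):
# independence lets us multiply the per-trial probabilities.
pr = {(a, b): (p if a else 1 - p) * (p if b else 1 - p)
      for a, b in product([1, 0], repeat=2)}

print(pr[(1, 0)], p * (1 - p))  # 3/16 3/16
```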