Module 10 Vocab Flashcards
R.v.’s X⊥Y if:
10.1
defined on the same Ω
∀x ∈ Val(X)
∀y ∈ Val(Y)
(X=x)⊥(Y=y)
Example of independent r.v.’s in green-purple space
10.1
G returns number shown on green die, P returns purple
For any g,p ∈ [1..6]:
Pr[(G=g)∩(P=p)] = (1/6)(1/6) = 1/36 = Pr[G=g] ⋅ Pr[P=p]
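The product rule on this card can be checked by brute-force enumeration. A minimal sketch, assuming the uniform two-dice space from the card (the names omega and pr are illustrative, not from the card):

```python
from fractions import Fraction

# Sample space: all 36 equally likely (green, purple) outcomes.
omega = [(g, p) for g in range(1, 7) for p in range(1, 7)]

def pr(event):
    # Probability of an event (a predicate on outcomes) under the uniform measure.
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

# Check Pr[(G=g) ∩ (P=p)] = Pr[G=g] · Pr[P=p] for every value pair.
for g in range(1, 7):
    for p in range(1, 7):
        joint = pr(lambda w: w[0] == g and w[1] == p)
        assert joint == pr(lambda w: w[0] == g) * pr(lambda w: w[1] == p) == Fraction(1, 36)
```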
Pairwise and Mutual Independence
10.1
analogous to events
Independence of Indicators and Events (words)
10.1
“for events A,B in same space, their indicator r.v.’s are independent iff the events themselves are independent”
Independence of Indicators and Events (symbols)
10.1
Let A,B be two events in (Ω,Pr)
I_A ⊥ I_B iff A⊥B
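One direction of this iff can be sanity-checked on a small space. A sketch using two fair coin flips (the events A, B and the helper indicator are illustrative choices, not from the card):

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, all 4 outcomes equally likely.
omega = list(product('HT', repeat=2))

def pr(event):
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] == 'H'   # first flip is heads
B = lambda w: w[1] == 'H'   # second flip is heads

def indicator(event, w):
    return 1 if event(w) else 0

# A ⊥ B as events:
assert pr(lambda w: A(w) and B(w)) == pr(A) * pr(B)

# I_A ⊥ I_B as r.v.'s: check every value pair (a, b) in {0,1} x {0,1}.
for a in (0, 1):
    for b in (0, 1):
        joint = pr(lambda w: indicator(A, w) == a and indicator(B, w) == b)
        assert joint == pr(lambda w: indicator(A, w) == a) * pr(lambda w: indicator(B, w) == b)
```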
How do you prove iff statements?
10.1
Prove one way
Then prove the other way
Independence of indicators and events (extension)
10.1
I_A ⊥ I_B iff A⊥B
can be extended to mutual independence of an arbitrary number of r.v.’s
By def, I_A ⊥ I_B tells us
10.1
(I_A = 1) ⊥ (I_B = 1)
(I_A = any #) ⊥ (I_B = any #)
By def of indicator r.v. we know
10.1
(I_A = 1) = A and (I_B = 1) = B
Expectation of indicator r.v.
E[I_X] =
10.1
Pr[I_X = 1] = Pr[X]
How many cases does one indicator r.v. have?
10.1
Two: 0 and 1
i.e., the event x and its complement x̄
IID Bernoulli trials “performed independently” assume…
10.1
that the Bernoulli r.v.’s are mutually independent
Constant r.v.’s are ________ of any r.v.
10.1
independent
Independence of Constant r.v.’s
10.1
for any (Ω, Pr) and any r.v. X: Ω→ℝ and any c∈ℝ, the constant r.v. C: Ω→ℝ defined by ∀w∈Ω C(w) = c is independent of X
What do we know about Val(C) of constant r.v. C:Ω→ℝ
10.1
Val(C) = {c} so we also know (C = c) = Ω
μ
10.2
mean = expectation = E[X]
Deviation of X from its mean
10.2
X - μ, also a r.v.
Reminder about a r.v. whose values are constant
10.2
has that same constant as its expectation
E[μ] = μ
Variance of a r.v.
10.2
Var[X] = E[(X-μ)²]
where μ = E[X]
Standard Deviation of a r.v.
10.2
σ[X] = √Var[X]
μ
10.2
Mean
σ
10.2
Standard Deviation
σ²
10.2
Variance
Variance (alternative formula)
10.2
Let X be a r.v. defined on (Ω, Pr)
Var[X] = E[X²] - μ²
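Both variance formulas can be verified numerically on a fair die. A sketch, assuming the uniform distribution on {1,…,6} (the names dist and expect are illustrative):

```python
from fractions import Fraction

# Fair six-sided die: Val(X) = {1,...,6}, each with probability 1/6.
dist = {x: Fraction(1, 6) for x in range(1, 7)}

def expect(f):
    # E[f(X)] under the distribution above.
    return sum(p * f(x) for x, p in dist.items())

mu = expect(lambda x: x)                      # E[X] = 7/2
var_def = expect(lambda x: (x - mu) ** 2)     # Var[X] = E[(X-μ)²]
var_alt = expect(lambda x: x ** 2) - mu ** 2  # Var[X] = E[X²] - μ²
assert var_def == var_alt == Fraction(35, 12)
```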
where ∀w∈Ω, X²(w) = (X(w))²
E[X + 2μ]
10.2
By linearity of expectation (LOE) and expectation of constant r.v.’s
= E[X] + 2μ
if X is a r.v. that returns positive values, then the ________ of X² is the values ________ with the ________ probabilities
10.2
distribution
squared
same
If Val(D)=[1..6], then Val(D²) =
10.2
Val(D²) = {1, 4, 9, 16, 25, 36}
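This card and the previous one together say: squaring a positive-valued r.v. squares the values and keeps the probabilities. A minimal sketch for the fair die (the names dist_d and dist_d2 are illustrative):

```python
from fractions import Fraction

# Distribution of a fair die D, and of D²: squared values, same probabilities.
dist_d = {d: Fraction(1, 6) for d in range(1, 7)}
dist_d2 = {d * d: p for d, p in dist_d.items()}

assert set(dist_d2) == {1, 4, 9, 16, 25, 36}
assert all(p == Fraction(1, 6) for p in dist_d2.values())
```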
Distribution of # shown by fair die
10.2
Val(X) = {1, 2, 3, 4, 5, 6}, each value with probability 1/6
Bernoulli r.v. Expectation
10.3
E[X] = p
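E[X] = p follows directly from the definition of expectation. A sketch with an illustrative value of p (not from the card), which also checks the next card's fact that X² = X for a 0/1-valued r.v.:

```python
from fractions import Fraction

# Bernoulli(p): X = 1 with probability p, X = 0 with probability 1 - p.
p = Fraction(1, 3)  # illustrative value, not from the card
dist = {1: p, 0: 1 - p}

# E[X] = 1·p + 0·(1-p) = p
expectation = sum(x * q for x, q in dist.items())
assert expectation == p

# Since Val(X) = {0, 1}, squaring changes nothing: X²(w) = X(w).
assert all(x * x == x for x in dist)
```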
X²(w) = 1 iff
10.3
X(w) = 1