Module 10 Vocab Flashcards
R.v.’s X⊥Y if:
10.1
defined on the same Ω
∀x ∈ Val(X)
∀y ∈ Val(Y)
(X=x)⊥(Y=y)
Example of independent r.v.’s in green-purple space
10.1
G returns number shown on green die, P returns purple
For any g,p ∈ [1..6]:
Pr[(G=g)∩(P=p)] = (1/6)(1/6) = 1/36 = Pr[G=g] ⋅ Pr[P=p]
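A small sketch of this card (an assumption-free brute-force check, not from the module): enumerate the 36 equally likely outcomes and confirm the product rule for every pair (g, p).

```python
# Brute-force check of independence of the green and purple dice by
# enumerating the 36 equally likely outcomes (g, p).
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), range(1, 7)))  # (g, p) pairs, |Ω| = 36

def pr(event):
    """Probability of an event, given as a predicate on outcomes."""
    return Fraction(sum(1 for w in outcomes if event(w)), len(outcomes))

for g in range(1, 7):
    for p in range(1, 7):
        joint = pr(lambda w: w == (g, p))
        # Pr[(G=g)∩(P=p)] = Pr[G=g] ⋅ Pr[P=p] = 1/36
        assert joint == pr(lambda w: w[0] == g) * pr(lambda w: w[1] == p)
        assert joint == Fraction(1, 36)
```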
Pairwise and Mutual Independence
10.1
analogous to events
Independence of Indicators and Events (words)
10.1
“for events A,B in same space, their indicator r.v.’s are independent iff the events themselves are independent”
Independence of Indicators and Events (symbols)
10.1
Let A,B be two events in (Ω,Pr)
I_A ⊥ I_B iff A⊥B
How do you prove iff statements?
10.1
Prove one way
Then prove the other way
Independence of indicators and events (extension)
10.1
I_A ⊥ I_B iff A⊥B
can be extended to mutual independence of an arbitrary number of r.v.’s
By def, I_A ⊥ I_B tells us
10.1
(I_A = 1) ⊥ (I_B = 1)
(I_A = any #) ⊥ (I_B = any #)
By def of indicator r.v. we know
10.1
(I_A = 1) = A and (I_B = 1) = B
Expectation of indicator r.v.
E[I_X] =
10.1
Pr[I_X = 1] = Pr[X]
How many cases does one indicator r.v. have?
10.1
two: 0 and 1
(equivalently, A and Ā — the event and its complement)
IID Bernoulli trials “performed independently” assume…
10.1
that the Bernoulli r.v. are mutually independent
Constant r.v.’s are ________ of any r.v.
10.1
independent
Independence of Constant r.v.’s
10.1
for any (Ω, Pr) and any r.v. X: Ω→ℝ and any c∈ℝ, the constant r.v. C: Ω→ℝ defined by ∀w∈Ω C(w) = c is independent of X
What do we know about Val(C) of constant r.v. C:Ω→ℝ
10.1
Val(C) = {c} so we also know (C = c) = Ω
μ
10.2
mean = expectation = E[X]
Deviation of X from its mean
10.2
X - μ, also a r.v.
Reminder about a r.v. whose values are constant
10.2
has that same constant as its expectation
E[μ] = μ
Variance of a r.v.
10.2
Var[X] = E[(X-μ)²]
where μ = E[X]
Standard Deviation of a r.v.
10.2
σ[X] = √Var[X]
μ
10.2
Mean
σ
10.2
Standard Deviation
σ²
10.2
Variance
Variance (alternative formula)
10.2
Let X be a r.v. defined on (Ω, Pr)
Var[X] = E[X²] - μ²
where ∀w∈Ω, X²(w) = (X(w))²
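The two variance formulas on these cards can be checked against each other on the module's running example, the fair die (a sketch in exact arithmetic, not from the module):

```python
from fractions import Fraction

# Fair-die distribution: values 1..6, each with probability 1/6.
dist = {x: Fraction(1, 6) for x in range(1, 7)}

def E(f):
    """Expectation of f(X) under the distribution."""
    return sum(p * f(x) for x, p in dist.items())

mu = E(lambda x: x)                     # μ = E[X] = 7/2
var_def = E(lambda x: (x - mu) ** 2)    # Var[X] = E[(X-μ)²]
var_alt = E(lambda x: x ** 2) - mu**2   # Var[X] = E[X²] - μ²
assert var_def == var_alt == Fraction(35, 12)
```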
E[X + 2μ]
10.2
By LOE and expectation of constants
= E[X] + 2μ
if X is an r.v. that returns positive values, then the —– of X² is the values —– with the —- probabilities
10.2
distribution
squared
same
If Val(D)=[1..6], then Val(D²) =
10.2
Val(D²) = {1, 4, 9, 16, 25, 36}
Distribution of # shown by fair die
10.2
Val(X) = [1..6], each value with probability 1/6
Bernoulli r.v. Expectation
10.3
E[X] = p
X²(w) = 1 iff
10.3
X(w) = 1
X²(w) = 0 iff
10.3
X(w) = 0
If X is Bernoulli…
10.3
so is X²
If X is Bernoulli, then E[X²] =
10.3
E[X²] = p
Variance of Bernoulli r.v. with parameter Pr[X=1] = p
10.3
Var[X] = p(1-p)
Bernoulli Expectation and Var
10.3
Bernoulli r.v.’s with parameters p and 1-p have different expectations but the same variance
Ex: p = 1/3 and q = 1-p = 2/3 both give Var = p(1-p) = 2/9
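The p = 1/3 vs. q = 2/3 example can be verified directly from Var[X] = p(1-p) (a minimal sketch):

```python
from fractions import Fraction

def bern_var(p):
    # Var[X] = E[X²] - μ² with E[X] = E[X²] = p for a Bernoulli r.v.
    return p * (1 - p)

p, q = Fraction(1, 3), Fraction(2, 3)
assert p != q                                        # different expectations
assert bern_var(p) == bern_var(q) == Fraction(2, 9)  # same variance
```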
Var[cX] =
10.3
c² Var[X]
c² Var[X] =
10.3
Var[cX]
if X⊥Y, then Var[X + Y]
10.3
Var[X+Y] = Var[X] + Var[Y]
Var[X+Y] = Var[X] + Var[Y] iff
10.3
E[XY] = E[X] ⋅ E[Y]
X⊥Y ⇒
10.3
E[XY] = E[X] ⋅ E[Y]
Variance for sum of 2 dice in green-purple space
10.3
S = G + P and G⊥P, so
Var[S] = Var[G] + Var[P]
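This card can be confirmed by enumeration of the green-purple space (a sketch, assuming the uniform measure on the 36 outcomes):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), range(1, 7)))  # the 36 (g, p) pairs

def E(rv):
    return sum(rv(w) for w in outcomes) * Fraction(1, len(outcomes))

def Var(rv):
    mu = E(rv)
    return E(lambda w: (rv(w) - mu) ** 2)

G = lambda w: w[0]          # number on green die
P = lambda w: w[1]          # number on purple die
S = lambda w: w[0] + w[1]   # S = G + P
# Since G⊥P, variance distributes over the sum:
assert Var(S) == Var(G) + Var(P) == Fraction(35, 6)
```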
Binomial r.v. (definition)
10.4
parameters: n∈ℕ and p∈[0,1]
Val(B) = [0..n] and ∀k ∈ [0..n]
Pr[B=k] = (n choose k) p^k (1-p)^(n-k)
Binomial r.v. with parameters:
n∈ℕ and p∈[0,1]
10.4
Val(B) = [0..n] and
∀k ∈ [0..n] Pr[B=k] = (n choose k) p^k (1-p)^(n-k)
Distribution of Binomial r.v. B
10.4
also called binomial with parameters n and p
Example of binomial r.v.
10.4
the probability of the event “k successes observed” during n IID Bernoulli trials with probability of success p, where B is the r.v. that returns the number of successes observed
What are expectation and variance completely determined by?
10.4
the distribution
— and — are completely determined by the distribution
10.4
expectation and variance
Don’t forget: indicator r.v.’s are
10.4
BERNOULLI!
reminder: expectation of indicator r.v.
E[I_A] =
10.4
Pr[A] = p
Expectation of Binomial r.v.
10.4
E[B] = np
Pairwise Independence and Variance
10.4
If r.v.’s X1,…,Xn are pairwise independent,
then Var[X1+…+Xn] = Var[X1] +…+ Var[Xn]
If r.v.’s X1,…,Xn are pairwise independent, then
10.4
Var[X1+…+Xn] = Var[X1] +…+ Var[Xn]
If the events are mutually independent, then their indicator r.v.’s…
10.4
are also mutually independent
Reminder: variance of Bernoulli r.v.’s with parameter p =
10.4
p(1-p)
Variance for Binomial r.v.
10.4
Var[B] = np(1-p)
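The cards E[B] = np and Var[B] = np(1-p) can be checked against the pmf with illustrative parameters n = 10, p = 1/4 (an assumption; any n and p work):

```python
from fractions import Fraction
from math import comb

def binom_pmf(n, p, k):
    # Pr[B = k] = (n choose k) p^k (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, Fraction(1, 4)
pmf = [binom_pmf(n, p, k) for k in range(n + 1)]
assert sum(pmf) == 1                                  # a valid distribution
mean = sum(k * pmf[k] for k in range(n + 1))
var = sum(k * k * pmf[k] for k in range(n + 1)) - mean**2
assert mean == n * p                                  # E[B] = np
assert var == n * p * (1 - p)                         # Var[B] = np(1-p)
```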
X,Y are two r.v.’s on (Ω, Pr), their product, XY, is the r.v.
Self-Paced
(XY)(w) = X(w) ⋅ Y(w)
∀w∈Ω
Product of two r.v.’s
Self-Paced
(XY)(w) = X(w) ⋅ Y(w)
∀w∈Ω
Product of two r.v.’s (words)
Self-Paced
For every outcome ω in the sample space, you multiply the value that X maps ω to
by the value that Y maps ω to
in order to obtain the value that the product XY maps ω to
X_H ⋅ X_T(HH) = X_H ⋅ X_T(TT) =
Self-Paced
0
X_H ⋅ X_T(HT) = X_H ⋅ X_T(TH) =
Self-Paced
1
X_H * X_T is a…
Self-Paced
Bernoulli r.v. with parameter 1/2
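These three cards can be checked by enumerating the two-coin space (a sketch, assuming X_H counts heads and X_T counts tails over two fair flips):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))  # HH, HT, TH, TT, each prob 1/4

X_H = lambda w: w.count("H")  # number of heads
X_T = lambda w: w.count("T")  # number of tails
prod_rv = {w: X_H(w) * X_T(w) for w in outcomes}
assert prod_rv[("H", "H")] == prod_rv[("T", "T")] == 0
assert prod_rv[("H", "T")] == prod_rv[("T", "H")] == 1
# Bernoulli with parameter 1/2: Pr[X_H ⋅ X_T = 1] = 2/4
p = Fraction(sum(1 for w in outcomes if prod_rv[w] == 1), len(outcomes))
assert p == Fraction(1, 2)
```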
Correlated r.v.’s
Self-Paced
X,Y r.v. on (Ω,Pr)
E[XY] ≠ E[X] ⋅ E[Y]
Uncorrelated r.v.’s
Self-Paced
X,Y r.v. on (Ω,Pr)
E[XY] = E[X] ⋅ E[Y]
Variance distributes over the sum of — r.v.’s
Self-Paced
uncorrelated
Negative Correlation
Self-Paced
higher values of one r.v. tend to go with lower values of the other (Cov(X,Y) < 0)
Covariance
Self-Paced
Cov(X,Y) = E[XY] - E[X]E[Y]
Cov(X,Y) =
Self-Paced
E[XY] - E[X]E[Y]
two r.v.’s are — iff their covariance is —
Self-Paced
uncorrelated, 0
for two r.v.’s:
all — r.v.’s are —
but not all — r.v.’s are —-
Self-Paced
independent, uncorrelated
uncorrelated, independent
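A standard counterexample for the second blank (assumed here, not from the module): X uniform on {-1, 0, 1} with Y = X² is uncorrelated with X yet clearly dependent on it.

```python
from fractions import Fraction

outcomes = [-1, 0, 1]
pr_w = Fraction(1, 3)          # uniform measure on Ω
X = lambda w: w
Y = lambda w: w * w            # Y = X²

E = lambda rv: sum(pr_w * rv(w) for w in outcomes)
# Uncorrelated: E[XY] = E[X³] = 0 = E[X] ⋅ E[Y]
assert E(lambda w: X(w) * Y(w)) == E(X) * E(Y) == 0

# ...but not independent: Pr[(X=0)∩(Y=0)] ≠ Pr[X=0] ⋅ Pr[Y=0]
p_joint = sum(pr_w for w in outcomes if X(w) == 0 and Y(w) == 0)
p_x0 = sum(pr_w for w in outcomes if X(w) == 0)
p_y0 = sum(pr_w for w in outcomes if Y(w) == 0)
assert p_joint != p_x0 * p_y0
```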
Correlation is a measure of — —
Self-Paced
linear dependence
If A⊥B
Pr[A|B] =
Pr[B|A] =
Self-Paced
Pr[A|B] = Pr[A]
Pr[B|A] = Pr[B]
Alt Var[X+Y] =
Self-Paced
Var[X+Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
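Note the covariance term enters with a plus sign. A sketch with a deliberately dependent pair in the two-dice space (X = G, Y = G + P is an illustrative choice, not from the module):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), range(1, 7)))
w_pr = Fraction(1, len(outcomes))        # uniform measure on 36 outcomes

E = lambda rv: sum(w_pr * rv(w) for w in outcomes)
Var = lambda rv: E(lambda w: rv(w) ** 2) - E(rv) ** 2

X = lambda w: w[0]            # X = G
Y = lambda w: w[0] + w[1]     # Y = G + P, dependent on X
cov = E(lambda w: X(w) * Y(w)) - E(X) * E(Y)
assert cov != 0               # X and Y are correlated, so the term matters
# Var[X+Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
assert Var(lambda w: X(w) + Y(w)) == Var(X) + Var(Y) + 2 * cov
```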
If X1,…,Xn are pairwise uncorrelated, then:
Var[X1+…+Xn] =
Self-Paced
Var[X1] +…+ Var[Xn]
Expectation of a product of two r.v.’s
E[XY] =
Self-Paced
Σ_{x∈Val(X)} Σ_{y∈Val(Y)} x ⋅ y ⋅ Pr[(X=x)∩(Y=y)]
Intersection and Independence
If X⊥Y
Pr[(X=x)∩(Y=y)] = Pr[X=x] ⋅ Pr[Y=y]
What 3 things do you need to do in order to use binomial r.v.?
- make it clear what the n Bernoulli trials are
- explain why they are IID
- specify p