Module 10 Vocab Flashcards

(76 cards)

1
Q

R.v.’s X⊥Y if:

10.1

A

defined on the same Ω
∀x ∈ Val(X)
∀y ∈ Val(Y)
(X=x)⊥(Y=y)

2
Q

Example of independent r.v.’s in green-purple space

10.1

A

G returns number shown on green die, P returns purple
For any g,p ∈ [1..6]:
Pr[(G=g)∩(P=p)] = (1/6)(1/6) = 1/36 = Pr[G=g] ⋅ Pr[P=p]
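The dice card above can be checked by brute-force enumeration. A minimal sketch (the names `outcomes` and `pr` are illustrative, not from the cards):

```python
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of the green-purple space and
# confirm Pr[(G=g) ∩ (P=p)] = 1/36 = Pr[G=g] · Pr[P=p] for every pair.
outcomes = [(g, p) for g in range(1, 7) for p in range(1, 7)]

def pr(event):
    # Probability of an event (a predicate on outcomes) under the uniform measure.
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

for g in range(1, 7):
    for p in range(1, 7):
        joint = pr(lambda o: o == (g, p))
        assert joint == Fraction(1, 36)
        assert joint == pr(lambda o: o[0] == g) * pr(lambda o: o[1] == p)
```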

3
Q

Pairwise and Mutual Independence

10.1

A

analogous to events

4
Q

Independence of Indicators and Events (words)

10.1

A

“for events A,B in same space, their indicator r.v.’s are independent iff the events themselves are independent”

5
Q

Independence of Indicators and Events (symbols)

10.1

A

Let A,B be two events in (Ω,Pr)
I_A ⊥ I_B iff A⊥B
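The iff can be verified concretely on the two-dice space. A sketch with illustrative events not taken from the cards (A = "green die is even", B = "purple die shows 6"):

```python
from fractions import Fraction

# Check A ⊥ B, and that the indicator values are independent in every
# case (a, b) ∈ {0,1}² — i.e., I_A ⊥ I_B.
outcomes = [(g, p) for g in range(1, 7) for p in range(1, 7)]

def pr(S):
    # Probability of an event S ⊆ Ω under the uniform measure.
    return Fraction(len(S), len(outcomes))

A = {o for o in outcomes if o[0] % 2 == 0}   # green die even
B = {o for o in outcomes if o[1] == 6}       # purple die shows 6
assert pr(A & B) == pr(A) * pr(B)            # A ⊥ B

for a in (0, 1):
    for b in (0, 1):
        Ea = A if a == 1 else set(outcomes) - A   # event (I_A = a)
        Eb = B if b == 1 else set(outcomes) - B   # event (I_B = b)
        assert pr(Ea & Eb) == pr(Ea) * pr(Eb)     # hence I_A ⊥ I_B
```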

6
Q

How do you prove iff statements?

10.1

A

Prove one way
Then prove the other way

7
Q

Independence of indicators and events (extension)

10.1

A

I_A ⊥ I_B iff A⊥B
can be extended to mutual independence of an arbitrary number of r.v.’s

8
Q

By def, I_A ⊥ I_B tells us

10.1

A

(I_A = 1) ⊥ (I_B = 1)
and, more generally, (I_A = a) ⊥ (I_B = b) for any values a, b

9
Q

By def of indicator r.v. we know

10.1

A

(I_A = 1) = A and (I_B = 1) = B

10
Q

Expectation of indicator r.v.
E[I_X] =

10.1

A

Pr[I_X = 1] = Pr[X]

11
Q

How many cases does one indicator r.v. have?

10.1

A

Two: 0 and 1
(i.e., x and x̄ — the underlying event occurs or it doesn't)

12
Q

IID Bernoulli trials “performed independently” assume…

10.1

A

that the Bernoulli r.v.’s are mutually independent

13
Q

Constant r.v.’s are ________ of any r.v.

10.1

A

independent

14
Q

Independence of Constant r.v.’s

10.1

A

for any (Ω, Pr) and any r.v. X: Ω→ℝ and any c∈ℝ, the constant r.v. C: Ω→ℝ defined by ∀w∈Ω C(w) = c is independent of X

15
Q

What do we know about Val(C) of constant r.v. C:Ω→ℝ

10.1

A

Val(C) = {c} so we also know (C = c) = Ω

16
Q

μ

10.2

A

mean = expectation = E[X]

17
Q

Deviation of X from its mean

10.2

A

X - μ, also a r.v.

18
Q

Reminder about a r.v. whose values are constant

10.2

A

has that same constant as its expectation
E[μ] = μ

19
Q

Variance of a r.v.

10.2

A

Var[X] = E[(X-μ)²]
where μ = E[X]

20
Q

Standard Deviation of a r.v.

10.2

A

σ[X] = √Var[X]

21
Q

μ

10.2

A

Mean

22
Q

σ

10.2

A

Standard Deviation

23
Q

σ²

10.2

A

Variance

24
Q

Variance (alternative formula)

10.2

A

Let X be a r.v. defined on (Ω, Pr)
Var[X] = E[X²] - μ²
where ∀w∈Ω, X²(w) = (X(w))²
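Both formulas for the variance can be verified exactly for the fair-die r.v. A sketch (variable names are illustrative):

```python
from fractions import Fraction

# Verify Var[X] = E[(X-μ)²] = E[X²] - μ² for X = number shown by a fair die.
vals = range(1, 7)
pr = Fraction(1, 6)                                  # uniform distribution
mu = sum(pr * x for x in vals)                       # E[X] = 7/2
var_def = sum(pr * (x - mu) ** 2 for x in vals)      # E[(X-μ)²]
var_alt = sum(pr * x ** 2 for x in vals) - mu ** 2   # E[X²] - μ²
assert var_def == var_alt == Fraction(35, 12)
```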

25
E[X + 2μ] | 10.2
By LOE (linearity of expectation) and expectation of constants: E[X + 2μ] = E[X] + 2μ
26
if X is an r.v. that returns positive values, then the ----- of X² is the values ----- with the ---- probabilities | 10.2
distribution, squared, same
27
If Val(D)=[1..6], then Val(D²) = | 10.2
Val(D²) = {1, 4, 9, 16, 25, 36}
28
Distribution of # shown by fair die | 10.2
Val(X) = {1, 2, 3, 4, 5, 6}, each value with probability 1/6
29
Bernoulli r.v. Expectation | 10.3
E[X] = p
30
X²(w) = 1 iff | 10.3
X(w) = 1
31
X²(w) = 0 iff | 10.3
X(w) = 0
32
If X is Bernoulli... | 10.3
so is X²
33
If X is Bernoulli, then E[X²] = | 10.3
E[X²] = p (since X² is also Bernoulli with the same parameter)
34
Variance of Bernoulli r.v. with parameter Pr[X=1] = p | 10.3
Var[X] = p(1-p)
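The Bernoulli variance formula can be checked directly from the definition over the two cases X = 0 and X = 1. A sketch (the helper name `bernoulli_var` is illustrative):

```python
from fractions import Fraction

# For Bernoulli X with Pr[X=1] = p, compute E[(X-μ)²] and check it equals p(1-p).
def bernoulli_var(p):
    mu = p                       # E[X] = 0·(1-p) + 1·p = p
    # E[(X-μ)²] over the two cases X=0 (prob 1-p) and X=1 (prob p)
    return (1 - p) * (0 - mu) ** 2 + p * (1 - mu) ** 2

for p in (Fraction(1, 3), Fraction(2, 3), Fraction(1, 2)):
    assert bernoulli_var(p) == p * (1 - p)
```

Note this also confirms card 35: p = 1/3 and q = 2/3 give different expectations but the same variance, 2/9.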
35
Bernoulli Expectation and Var | 10.3
different expectations, same variance; e.g. p = 1/3 and q = 2/3 both give Var = 2/9
36
Var[cX] = | 10.3
c² Var[X]
37
c² Var[X] = | 10.3
Var[cX]
38
if X⊥Y, then Var[X + Y] | 10.3
Var[X+Y] = Var[X] + Var[Y]
39
Var[X+Y] = Var[X] + Var[Y] iff | 10.3
E[XY] = E[X] ⋅ E[Y]
40
X⊥Y ⇒ | 10.3
E[XY] = E[X] ⋅ E[Y]
41
Variance for sum of 2 dice in green-purple space | 10.3
S = G + P and G⊥P, so Var[S] = Var[G] + Var[P]
42
Binomial r.v. (definition) | 10.4
parameters: n∈ℕ and p∈[0,1]; Val(B) = [0..n] and ∀k ∈ [0..n], Pr[B=k] = (n choose k) p^k (1-p)^(n-k)
43
Binomial r.v. with parameters: n∈ℕ and p∈[0,1] | 10.4
Val(B) = [0..n] and ∀k ∈ [0..n], Pr[B=k] = (n choose k) p^k (1-p)^(n-k)
44
Distribution of Binomial r.v. B | 10.4
also called binomial with parameters n and p
45
Example of binomial r.v. | 10.4
the r.v. B that returns the number of successes observed during n IID Bernoulli trials with probability of success p; Pr[B=k] is the probability of the event "k successes observed"
46
What are expectation and variance completely determined by? | 10.4
the distribution
47
--- and --- are completely determined by the distribution | 10.4
expectation and variance
48
Don't forget: indicator r.v.'s are | 10.4
BERNOULLI!
49
reminder: expectation of indicator r.v. E[I_A] = | 10.4
Pr[A] = p
50
Expectation of Binomial r.v. | 10.4
E[B] = np
51
Pairwise Independence and Variance | 10.4
If r.v.'s X1,...,Xn are pairwise independent, then Var[X1+...+Xn] = Var[X1] +...+ Var[Xn]
52
If r.v.'s X1,...,Xn are pairwise independent, then | 10.4
Var[X1+...+Xn] = Var[X1] +...+ Var[Xn]
53
If the events are mutually independent, then their indicator r.v.'s... | 10.4
are also mutually independent
54
Reminder: variance of Bernoulli r.v.'s with parameter p = | 10.4
p(1-p)
55
Variance for Binomial r.v. | 10.4
Var[B] = np(1-p)
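The closed forms E[B] = np and Var[B] = np(1-p) can be checked against the binomial pmf by exact enumeration. A sketch with illustrative parameters n = 10, p = 1/4:

```python
from fractions import Fraction
from math import comb

# Build the binomial pmf Pr[B=k] = C(n,k) p^k (1-p)^(n-k) and compare the
# resulting mean and variance with the closed forms np and np(1-p).
n, p = 10, Fraction(1, 4)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
var = sum(k**2 * q for k, q in pmf.items()) - mean**2   # E[B²] - μ²

assert sum(pmf.values()) == 1        # it is a distribution
assert mean == n * p                 # E[B] = np
assert var == n * p * (1 - p)        # Var[B] = np(1-p)
```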
56
X,Y are two r.v.'s on (Ω, Pr), their product, XY, is the r.v. | Self-Paced
(XY)(w) = X(w) ⋅ Y(w) ∀w∈Ω
57
Product of two r.v.'s | Self-Paced
(XY)(w) = X(w) ⋅ Y(w) ∀w∈Ω
58
Product of two r.v.'s (words) | Self-Paced
For every outcome ω in the sample space, you multiply the value that X maps ω to by the value that Y maps ω to in order to obtain the value that the product XY maps ω to
59
X_H * X_T(HH) = | Self-Paced
(X_H ⋅ X_T)(HH) = (X_H ⋅ X_T)(TT) = 0
60
X_H * X_T(HT) = | Self-Paced
(X_H ⋅ X_T)(HT) = (X_H ⋅ X_T)(TH) = 1
61
X_H * X_T is a... | Self-Paced
Bernoulli r.v. with parameter 1/2
62
Correlated r.v.'s | Self-Paced
X, Y r.v.’s on (Ω, Pr) with E[XY] ≠ E[X] ⋅ E[Y]
63
Uncorrelated r.v.'s | Self-Paced
X, Y r.v.’s on (Ω, Pr) with E[XY] = E[X] ⋅ E[Y]
64
Variance distributes over the sum of --- r.v.'s | Self-Paced
uncorrelated
65
Negative Correlation | Self-Paced
Higher A means lower B
66
Covariance | Self-Paced
Cov(X,Y) = E[XY] - E[X]E[Y]
67
Cov(X,Y) = | Self-Paced
E[XY] - E[X]E[Y]
68
two r.v.'s are --- iff their covariance is --- | Self-Paced
uncorrelated, 0
69
for two r.v.'s: all --- r.v.'s are --- but not all --- r.v.'s are ---- | Self-Paced
independent, uncorrelated; uncorrelated, independent
70
Correlation is a measure of --- --- | Self-Paced
linear dependence
71
If A⊥B Pr[A|B] = Pr[B|A] = | Self-Paced
Pr[A|B] = Pr[A] and Pr[B|A] = Pr[B]
72
Alt Var[X+Y] = | Self-Paced
Var[X+Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
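With Cov(X,Y) = E[XY] - E[X]E[Y], the general identity is Var[X+Y] = Var[X] + Var[Y] + 2·Cov(X,Y). A sketch checking it on the deck's own X_H, X_T pair (two fair coin flips; these r.v.'s are negatively correlated, and their sum is the constant 2):

```python
from fractions import Fraction

# X_H = number of heads, X_T = number of tails on two fair coin flips.
outcomes = ["HH", "HT", "TH", "TT"]          # uniform, Pr = 1/4 each
pr = Fraction(1, 4)
X_H = {o: o.count("H") for o in outcomes}
X_T = {o: o.count("T") for o in outcomes}

def E(f):
    return sum(pr * f[o] for o in outcomes)

def Var(f):
    return E({o: f[o] ** 2 for o in outcomes}) - E(f) ** 2

prod = {o: X_H[o] * X_T[o] for o in outcomes}
cov = E(prod) - E(X_H) * E(X_T)              # Cov(X_H, X_T) = -1/2
S = {o: X_H[o] + X_T[o] for o in outcomes}   # always 2, so Var[S] = 0
assert Var(S) == Var(X_H) + Var(X_T) + 2 * cov == 0
```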
73
If X1,...,Xn are pairwise uncorrelated, then: Var[X1+...+Xn] = | Self-Paced
Var[X1] +...+ Var[Xn]
74
Expectation of a product of two r.v.'s E[XY] = | Self-Paced
Σ_{x∈Val(X)} Σ_{y∈Val(Y)} x ⋅ y ⋅ Pr[(X=x)∩(Y=y)]
75
Intersection and Independence, If X⊥Y
Pr[(X=x)∩(Y=y)] = Pr[X=x] ⋅ Pr[Y=y]
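Cards 74-75 combine nicely: for the independent dice G, P, the double-sum formula for E[GP] factors into E[G]·E[P]. A sketch checking this by exact enumeration (variable names are illustrative):

```python
from fractions import Fraction

# E[GP] = Σ_g Σ_p g·p·Pr[(G=g) ∩ (P=p)], with the joint probability
# factoring to 1/36 by independence; confirm E[GP] = E[G]·E[P].
vals = range(1, 7)
joint = Fraction(1, 36)                  # Pr[(G=g) ∩ (P=p)] for every pair
E_G = sum(Fraction(1, 6) * g for g in vals)
E_P = sum(Fraction(1, 6) * p for p in vals)
E_GP = sum(joint * g * p for g in vals for p in vals)
assert E_GP == E_G * E_P == Fraction(49, 4)
```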
76
What 3 things do you need to do in order to use binomial r.v.?
1. make it clear what the n Bernoulli trials are 2. explain why they are IID 3. specify p