Module 10 Vocab Flashcards

1
Q

R.v.’s X⊥Y if:

10.1

A

defined on the same Ω
∀x ∈ Val(X)
∀y ∈ Val(Y)
(X=x)⊥(Y=y)

2
Q

Example of independent r.v.’s in green-purple space

10.1

A

G returns the number shown on the green die, P the number shown on the purple die
For any g,p ∈ [1..6]:
Pr[(G=g)∩(P=p)] = (1/6)(1/6) = 1/36 = Pr[G=g] ⋅ Pr[P=p]
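
A minimal Python check of this product rule, assuming the usual green-purple space of 36 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Green-purple space: 36 equally likely outcomes (g, p).
omega = list(product(range(1, 7), repeat=2))
pr = Fraction(1, 36)  # probability of each single outcome

for g in range(1, 7):
    for p in range(1, 7):
        joint = sum(pr for (a, b) in omega if (a, b) == (g, p))
        pr_g = sum(pr for (a, b) in omega if a == g)   # Pr[G=g] = 1/6
        pr_p = sum(pr for (a, b) in omega if b == p)   # Pr[P=p] = 1/6
        assert joint == pr_g * pr_p == Fraction(1, 36)
```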

3
Q

Pairwise and Mutual Independence

10.1

A

defined analogously to pairwise and mutual independence of events

4
Q

Independence of Indicators and Events (words)

10.1

A

“for events A,B in same space, their indicator r.v.’s are independent iff the events themselves are independent”

5
Q

Independence of Indicators and Events (symbols)

10.1

A

Let A,B be two events in (Ω,Pr)
I_A ⊥ I_B iff A⊥B

6
Q

How do you prove iff statements?

10.1

A

Prove one way
Then prove the other way

7
Q

Independence of indicators and events (extension)

10.1

A

I_A ⊥ I_B iff A⊥B
can be extended to mutual independence of an arbitrary number of r.v.’s

8
Q

By def, I_A ⊥ I_B tells us

10.1

A

(I_A = 1) ⊥ (I_B = 1)
(I_A = any #) ⊥ (I_B = any #)

9
Q

By def of indicator r.v. we know

10.1

A

(I_A = 1) = A and (I_B = 1) = B

10
Q

Expectation of indicator r.v.
E[I_X] =

10.1

A

Pr[I_X = 1] = Pr[X]

11
Q

How many cases does one indicator r.v. have?

10.1

A

two: 0 and 1
or x and x̄

12
Q

IID Bernoulli trials “performed independently” assume…

10.1

A

that the Bernoulli r.v.’s are mutually independent

13
Q

Constant r.v.’s are ________ of any r.v.

10.1

A

independent

14
Q

Independence of Constant r.v.’s

10.1

A

for any (Ω, Pr) and any r.v. X: Ω→ℝ and any c∈ℝ, the constant r.v. C: Ω→ℝ defined by ∀w∈Ω C(w) = c is independent of X

15
Q

What do we know about Val(C) of constant r.v. C:Ω→ℝ

10.1

A

Val(C) = {c} so we also know (C = c) = Ω

16
Q

μ

10.2

A

mean = expectation = E[X]

17
Q

Deviation of X from its mean

10.2

A

X - μ, also a r.v.

18
Q

Reminder about a r.v. whose values are constant

10.2

A

has that same constant as its expectation
E[μ] = μ

19
Q

Variance of a r.v.

10.2

A

Var[X] = E[(X-μ)²]
where μ = E[X]

20
Q

Standard Deviation of a r.v.

10.2

A

σ[X] = √Var[X]

21
Q

μ

10.2

A

Mean

22
Q

σ

10.2

A

Standard Deviation

23
Q

σ²

10.2

A

Variance

24
Q

Variance (alternative formula)

10.2

A

Let X be a r.v. defined on (Ω, Pr)
Var[X] = E[X²] - μ²
where ∀w∈Ω, X²(w) = (X(w))²
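
Why the two formulas agree (expand the square, then apply linearity of expectation and the fact that μ is a constant):
Var[X] = E[(X−μ)²] = E[X² − 2μX + μ²] = E[X²] − 2μE[X] + μ² = E[X²] − 2μ² + μ² = E[X²] − μ²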

25
Q

E[X + 2μ]

10.2

A

By linearity of expectation (LOE) and the expectation of constants
= E[X] + 2μ

26
Q

if X is an r.v. that returns positive values, then the — of X² is the values — with the — probabilities

10.2

A

distribution
squared
same

27
Q

If Val(D)=[1..6], then Val(D²) =

10.2

A

Val(D²) = {1, 4, 9, 16, 25, 36}

28
Q

Distribution of # shown by fair die

10.2

A

Val(X) = {1, 2, 3, 4, 5, 6}, each value with probability 1/6
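
As a concrete check of the variance formulas, a short Python sketch (using Fraction for exact arithmetic) for the fair-die r.v.:

```python
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)                      # fair die: each value has probability 1/6

mu = sum(p * x for x in values)         # E[X] = 7/2
e_x2 = sum(p * x * x for x in values)   # E[X²] = 91/6
var = e_x2 - mu**2                      # Var[X] = E[X²] − μ² = 35/12

print(mu, e_x2, var)                    # prints: 7/2 91/6 35/12
```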

29
Q

Bernoulli r.v. Expectation

10.3

A

E[X] = p

30
Q

X²(w) = 1 iff

10.3

A

X(w) = 1

31
Q

X²(w) = 0 iff

10.3

A

X(w) = 0

32
Q

If X is Bernoulli…

10.3

A

so is X²

33
Q

If X is Bernoulli, then E[X²] =

10.3

A

E[X²] = p

34
Q

Variance of Bernoulli r.v. with parameter Pr[X=1] = p

10.3

A

Var[X] = p(1-p)
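
This follows from the alternative variance formula, since X² is Bernoulli with the same parameter:
Var[X] = E[X²] − (E[X])² = p − p² = p(1−p)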

35
Q

Bernoulli Expectation and Var

10.3

A

Two Bernoulli r.v.’s with parameters p and 1−p have different expectations but the same variance, p(1−p)

Ex: p = 1/3 and q = 2/3 both give variance 2/9

36
Q

Var[cX] =

10.3

A

c² Var[X]

37
Q

c² Var[X] =

10.3

A

Var[cX]

38
Q

if X⊥Y, then Var[X + Y]

10.3

A

Var[X+Y] = Var[X] + Var[Y]

39
Q

Var[X+Y] = Var[X] + Var[Y] iff

10.3

A

E[XY] = E[X] ⋅ E[Y]

40
Q

X⊥Y ⇒

10.3

A

E[XY] = E[X] ⋅ E[Y]

41
Q

Variance for sum of 2 dice in green-purple space

10.3

A

S = G + P and G⊥P, so
Var[S] = Var[G] + Var[P]

42
Q

Binomial r.v. (definition)

10.4

A

parameters: n∈ℕ and p∈[0,1]
Val(B) = [0..n] and ∀k ∈ [0..n]
Pr[B=k] = (n choose k) p^k (1-p)^(n-k)

43
Q

Binomial r.v. with parameters:
n∈ℕ and p∈[0,1]

10.4

A

Val(B) = [0..n] and
∀k ∈ [0..n] Pr[B=k] = (n choose k) p^k (1-p)^(n-k)
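
A small Python sketch (math.comb gives the binomial coefficient) that tabulates this pmf for illustrative parameters n = 5, p = 0.3 and checks that it sums to 1 and has mean np:

```python
from math import comb

n, p = 5, 0.3                               # illustrative parameters
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

assert abs(sum(pmf.values()) - 1) < 1e-12   # the probabilities sum to 1
mean = sum(k * pr for k, pr in pmf.items())
assert abs(mean - n * p) < 1e-12            # E[B] = np
```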

44
Q

Distribution of Binomial r.v. B

10.4

A

also called binomial with parameters n and p

45
Q

Example of binomial r.v.

10.4

A

the r.v. B that returns the number of successes observed during n IID Bernoulli trials with probability of success p; Pr[B=k] is the probability of the event “k successes observed”

46
Q

What are expectation and variance completely determined by?

10.4

A

the distribution

47
Q

— and — are completely determined by the distribution

10.4

A

expectation and variance

48
Q

Don’t forget: indicator r.v.’s are

10.4

A

BERNOULLI!

49
Q

reminder: expectation of indicator r.v.
E[I_A] =

10.4

A

Pr[A] = p

50
Q

Expectation of Binomial r.v.

10.4

A

E[B] = np
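
One way to see this: write B = I_1 + … + I_n as a sum of the n success indicators, each Bernoulli with E[I_k] = p; by linearity of expectation,
E[B] = E[I_1] + … + E[I_n] = np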

51
Q

Pairwise Independence and Variance

10.4

A

If r.v.’s X1,…,Xn are pairwise independent,
then Var[X1+…+Xn] = Var[X1] +…+ Var[Xn]

52
Q

If r.v.’s X1,…,Xn are pairwise independent, then

10.4

A

Var[X1+…+Xn] = Var[X1] +…+ Var[Xn]

53
Q

If the events are mutually independent, then their indicator r.v.’s…

10.4

A

are also mutually independent

54
Q

Reminder: variance of Bernoulli r.v.’s with parameter p =

10.4

A

p(1-p)

55
Q

Variance for Binomial r.v.

10.4

A

Var[B] = np(1-p)
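
Same decomposition as for the expectation: B = I_1 + … + I_n, where the indicators are mutually (hence pairwise) independent and each has Var[I_k] = p(1−p), so variance distributes over the sum:
Var[B] = Var[I_1] + … + Var[I_n] = np(1−p)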

56
Q

X,Y are two r.v.’s on (Ω, Pr), their product, XY, is the r.v.

Self-Paced

A

(XY)(w) = X(w) ⋅ Y(w)
∀w∈Ω

57
Q

Product of two r.v.’s

Self-Paced

A

(XY)(w) = X(w) ⋅ Y(w)
∀w∈Ω

58
Q

Product of two r.v.’s (words)

Self-Paced

A

For every outcome ω in the sample space, you multiply the value that X maps ω to
by the value that Y maps ω to
in order to obtain the value that the product XY maps ω to

59
Q

X_H * X_T(HH) =

Self-Paced

A

0 (the same value as X_H ⋅ X_T(TT))

60
Q

X_H * X_T(HT) =

Self-Paced

A

1 (the same value as X_H ⋅ X_T(TH))

61
Q

X_H * X_T is a…

Self-Paced

A

Bernoulli r.v. with parameter 1/2

62
Q

Correlated r.v.’s

Self-Paced

A

X,Y r.v. on (Ω,Pr)
E[XY] ≠ E[X] ⋅ E[Y]

63
Q

Uncorrelated r.v.’s

Self-Paced

A

X,Y r.v. on (Ω,Pr)
E[XY] = E[X] ⋅ E[Y]

64
Q

Variance distributes over the sum of — r.v.’s

Self-Paced

A

uncorrelated

65
Q

Negative Correlation

Self-Paced

A

higher values of one r.v. tend to come with lower values of the other; Cov(X,Y) < 0

66
Q

Covariance

Self-Paced

A

Cov(X,Y) = E[XY] - E[X]E[Y]

67
Q

Cov(X,Y) =

Self-Paced

A

E[XY] - E[X]E[Y]
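
Worked example, assuming X_H and X_T count the heads and tails in two fair coin flips (as in the product cards above): E[X_H] = E[X_T] = 1 and E[X_H ⋅ X_T] = 1/2, so
Cov(X_H, X_T) = 1/2 − 1⋅1 = −1/2 < 0 (a negatively correlated pair)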

68
Q

two r.v.’s are — iff their covariance is —

Self-Paced

A

uncorrelated, 0

69
Q

for two r.v.’s:
all — r.v.’s are —
but not all — r.v.’s are —

Self-Paced

A

independent, uncorrelated
uncorrelated, independent

70
Q

Correlation is a measure of — —

Self-Paced

A

linear dependence

71
Q

If A⊥B
Pr[A|B] =
Pr[B|A] =

Self-Paced

A

Pr[A|B] = Pr[A]
Pr[B|A] = Pr[B]

72
Q

Alt Var[X+Y] =

Self-Paced

A

Var[X+Y] = Var[X] + Var[Y] + 2(E[XY] - E[X]E[Y])
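
Derivation sketch (alternative variance formula plus linearity of expectation):
Var[X+Y] = E[(X+Y)²] − (E[X+Y])²
= E[X²] + 2E[XY] + E[Y²] − (E[X])² − 2E[X]E[Y] − (E[Y])²
= Var[X] + Var[Y] + 2(E[XY] − E[X]E[Y])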

73
Q

If X1,…,Xn are pairwise uncorrelated, then:
Var[X1+…+Xn] =

Self-Paced

A

Var[X1] +…+ Var[Xn]

74
Q

Expectation of a product of two r.v.’s
E[XY] =

Self-Paced

A

Σ_{x∈Val(X)} Σ_{y∈Val(Y)} Pr[(X=x)∩(Y=y)] ⋅ x ⋅ y
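
A minimal Python sketch of this double sum for the green-purple dice space (an assumed joint pmf with 36 equally likely pairs), which also confirms E[XY] = E[X]⋅E[Y] when X⊥Y:

```python
from fractions import Fraction
from itertools import product

# Joint pmf of two independent fair dice: each pair (x, y) has probability 1/36.
joint = {(x, y): Fraction(1, 36) for x, y in product(range(1, 7), repeat=2)}

e_xy = sum(pr * x * y for (x, y), pr in joint.items())   # the double sum from the card
e_x = sum(pr * x for (x, _), pr in joint.items())
e_y = sum(pr * y for (_, y), pr in joint.items())

assert e_xy == e_x * e_y == Fraction(49, 4)              # independence: E[XY] = E[X]·E[Y]
```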

75
Q

Intersection and Independence,
If X⊥Y

A

Pr[(X=x)∩(Y=y)] = Pr[X=x] ⋅ Pr[Y=y]

76
Q

What 3 things do you need to do in order to use a binomial r.v.?

A
  1. make it clear what the n Bernoulli trials are
  2. explain why they are IID
  3. specify p
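
For instance: n = 10 flips of a coin with bias p = 0.3; the flips are IID because each flip is performed independently of the others with the same bias; B = the number of heads observed is then binomial with parameters n = 10 and p = 0.3.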