Ch1 Probability Fundamentals Flashcards

1
Q

A discrete probability distribution is characterized by:

A

pmf = p(x)

2
Q

The pmf p(x) satisfies:

A

Kolmogorov axioms:

  1. p(x) = Pr(X = x)
  2. p(x) ≥ 0 for all x
  3. Σ p(x) = 1, summing over all x
3
Q

cdf =

A

cumulative distribution function

F(x) = Pr(X ≤ x)

Discrete case: F(x) = Σ p(t), summing over all t ≤ x

4
Q

E[X]=

A

Σ x·p(x) over all x (in the continuous case, replace the sum with an integral: ∫ x f(x) dx)

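As a quick numeric sketch of the formula above (the fair die is purely an illustration):

```python
# Illustrative sketch: E[X] = sum of x * p(x) for a fair six-sided die.
pmf = {x: 1 / 6 for x in range(1, 7)}    # p(x) = 1/6 for x = 1..6

ev = sum(x * p for x, p in pmf.items())  # (1 + 2 + ... + 6) / 6 = 3.5
```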
5
Q

E[X]= words

A

mean of X; the long-run average; the first moment

6
Q

Var(X)=

A

= E[(X − E[X])^2]

= E[X^2] − (E[X])^2

7
Q

Var(X)= (computational form)

A

E[X^2] − (E[X])^2

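The two variance forms can be checked numerically on any small pmf (the pmf below is made up for illustration):

```python
# Sketch: both variance formulas give the same answer for a toy pmf.
pmf = {0: 0.2, 1: 0.5, 3: 0.3}   # illustrative pmf; probabilities sum to 1

ex  = sum(x * p for x, p in pmf.items())      # E[X]
ex2 = sum(x * x * p for x, p in pmf.items())  # E[X^2]

var_def  = sum((x - ex) ** 2 * p for x, p in pmf.items())  # E[(X - E[X])^2]
var_comp = ex2 - ex ** 2                                   # E[X^2] - (E[X])^2

assert abs(var_def - var_comp) < 1e-9
```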
14
Q

rth moment (about the origin)

A

E[X^r]

15
Q

rth central moment

A

E[(X − µ)^r]

16
Q

Var(X)=

A

σ^2 = E[(X − µ)^2]

19
Q

marginal pmf

A

pX(x) = Σ p(x, y), summing over all y (collapse the joint pmf over the other variable)
20
Q

R.V.’s are independent iff

A

p(x, y) = pX(x) · pY(y) for all x, y
22
Q

Cov(X,Y)=

A

E[(X − µX)(Y − µY)] = E[XY] − E[X]E[Y]
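The definitional and computational forms of covariance can be checked against each other; the joint pmf below is invented for illustration:

```python
# Sketch: Cov(X, Y) computed from the definition and the shortcut form.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}  # illustrative

ex  = sum(x * p for (x, y), p in joint.items())      # E[X]
ey  = sum(y * p for (x, y), p in joint.items())      # E[Y]
exy = sum(x * y * p for (x, y), p in joint.items())  # E[XY]

cov_def  = sum((x - ex) * (y - ey) * p for (x, y), p in joint.items())
cov_comp = exy - ex * ey

assert abs(cov_def - cov_comp) < 1e-9
```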
26
Q

Corr(X,Y)=

A

ρ = Cov(X, Y) / (σX σY), always in [−1, 1]
27
Q

Bernoulli distribution

A

X takes values 0 or 1 with Pr(X = 1) = p; E[X] = p, Var(X) = p(1 − p)
28
Q

Binomial distribution

A

number of successes in n independent Bernoulli(p) trials; p(x) = C(n, x) p^x (1 − p)^(n − x); E[X] = np, Var(X) = np(1 − p)
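A sketch checking the binomial pmf against its moments (n and p are arbitrary illustrative values):

```python
from math import comb

# Sketch: Binomial(n, p) pmf sums to 1, with mean n*p and variance n*p*(1-p).
n, p = 5, 0.3                       # illustrative parameters
pmf = {x: comb(n, x) * p ** x * (1 - p) ** (n - x) for x in range(n + 1)}

total = sum(pmf.values())                                 # 1
mean  = sum(x * q for x, q in pmf.items())                # n*p = 1.5
var   = sum((x - mean) ** 2 * q for x, q in pmf.items())  # n*p*(1-p) = 1.05
```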
29
Q

Poisson distribution

A

p(x) = e^(−λ) λ^x / x! for x = 0, 1, 2, …; E[X] = Var(X) = λ
30
Q

Normal distribution

A

f(x) = (1 / (σ√(2π))) exp(−(x − µ)^2 / (2σ^2)); E[X] = µ, Var(X) = σ^2
33
Q

CLT

A

Central Limit Theorem: for i.i.d. X1, …, Xn with mean µ and finite variance σ^2, √n (X̄ − µ) / σ → N(0, 1) in distribution as n → ∞
34
Q

normal approximation to binomial

A

for large n, Bin(n, p) ≈ N(np, np(1 − p)) (a common rule of thumb: np and n(1 − p) both at least 5)
35
Q

continuity correction for continuous approx to discrete distribution

eg. normal approx to binomial

A

shift the cutoff by 0.5 toward the middle of the discrete bar: Pr(X ≤ x) ≈ Φ((x + 0.5 − µ) / σ) and Pr(X ≥ x) ≈ 1 − Φ((x − 0.5 − µ) / σ)
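A sketch of the continuity-corrected normal approximation against the exact binomial probability (parameter values are illustrative; only the standard library is used):

```python
from math import comb, erf, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

# Illustrative: Pr(X <= 22) for X ~ Bin(40, 0.5).
n, p, x = 40, 0.5, 22
exact = sum(comb(n, k) * p ** k * (1 - p) ** (n - k) for k in range(x + 1))

mu, sd = n * p, sqrt(n * p * (1 - p))
approx = norm_cdf((x + 0.5 - mu) / sd)   # continuity correction: x + 0.5

assert abs(approx - exact) < 0.01
```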
36
Q

chi-square

A

distribution of the sum of squares of k independent standard normals; E[X] = k, Var(X) = 2k
37
Q

t-distribution

A

T = Z / √(V/k), where Z ~ N(0, 1) and V ~ χ^2(k) are independent; symmetric with heavier tails than the normal, approaching N(0, 1) as k → ∞
39
Q

Ô is an unbiased estimator of O iff

A

E[Ô] = O

40
Q

Ô is a weakly consistent estimator of O iff

A

for any small positive constant ε,

Pr( |Ô − O| < ε ) → 1 as n → ∞

also called convergence in probability:

Ô →p O

41
Q

If Bias(Ô) → 0 and Var(Ô) → 0 as n → ∞

A

then Ô is a consistent estimator of O

42
Q

MSE(Ô) =

A

E[(Ô − O)^2] = Var(Ô) + Bias^2(Ô)
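The decomposition MSE = Var + Bias^2 holds as an algebraic identity; a sketch with made-up estimator outputs for a known target O:

```python
# Sketch: MSE equals variance plus squared bias for a fixed set of estimates.
O = 2.0
estimates = [1.8, 2.1, 2.4, 1.9, 2.3]   # hypothetical estimator outputs

n    = len(estimates)
mse  = sum((e - O) ** 2 for e in estimates) / n
mean = sum(estimates) / n
var  = sum((e - mean) ** 2 for e in estimates) / n  # population form
bias = mean - O

assert abs(mse - (var + bias ** 2)) < 1e-9
```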

43
Q

Relative efficiency

A

R.E. = RE(S1, S2) = MSE(S2) / MSE(S1)

44
Q

RE(X, Y) < 1

A

X is less efficient than Y

45
Q

Markov’s inequality

A

Pr( X ≥ a ) ≤ E[X] / a, for nonnegative X and any a > 0
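A sketch of the bound on a fair die (nonnegative, so Markov applies; a = 5 is arbitrary):

```python
# Sketch: Markov's inequality, Pr(X >= a) <= E[X]/a, on a fair die.
pmf = {x: 1 / 6 for x in range(1, 7)}
ev  = sum(x * p for x, p in pmf.items())         # E[X] = 3.5

a = 5
prob = sum(p for x, p in pmf.items() if x >= a)  # Pr(X >= 5) = 1/3
assert prob <= ev / a                            # 1/3 <= 0.7
```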

46
Q

Chebyshev’s inequality (words)

A

the probability inequality most often used in statistics. It states that no more than 1/k^2 of a distribution's values can lie more than k standard deviations away from the mean

47
Q

Pr( |X − µ| ≥ kσ ) ≤ ?

A

Pr( |X − µ| ≥ kσ ) ≤ 1/k^2

In words: the probability that X deviates from its mean µ by at least k standard deviations is at most 1/k^2.
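A sketch of Chebyshev's bound on a fair die (k = 1.5 chosen arbitrarily):

```python
from math import sqrt

# Sketch: Chebyshev's inequality, Pr(|X - mu| >= k*sigma) <= 1/k^2, on a fair die.
pmf   = {x: 1 / 6 for x in range(1, 7)}
mu    = sum(x * p for x, p in pmf.items())                    # 3.5
sigma = sqrt(sum((x - mu) ** 2 * p for x, p in pmf.items()))  # ~1.708

k = 1.5
prob = sum(p for x, p in pmf.items() if abs(x - mu) >= k * sigma)
assert prob <= 1 / k ** 2
```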

48
Q

Slutsky’s theorem

A

If Ô1 →p O1

and

Ô2 →p O2

then the sum and product also converge: Ô1 + Ô2 →p O1 + O2 and Ô1 · Ô2 →p O1 · O2

49
Q

E[S2]=

Var[S2]=

A

remember (n − 1)S^2 / σ^2 ~ chi-squared (df = n − 1) for a normal sample

E[S^2] = σ^2

Var(S^2) = 2σ^4 / (n − 1)
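E[S^2] = σ^2 can be checked exactly, without simulation, by enumerating every ordered sample of size n = 2 from a small population (a fair die here, purely for illustration):

```python
from itertools import product

# Sketch: the sample variance S^2 (divisor n-1) is unbiased: E[S^2] = sigma^2.
vals   = range(1, 7)
mu     = sum(vals) / 6                         # population mean, 3.5
sigma2 = sum((v - mu) ** 2 for v in vals) / 6  # population variance, 35/12

n = 2
s2_list = []
for sample in product(vals, repeat=n):         # all 36 ordered samples
    xbar = sum(sample) / n
    s2_list.append(sum((x - xbar) ** 2 for x in sample) / (n - 1))

e_s2 = sum(s2_list) / len(s2_list)             # average S^2 over all samples
assert abs(e_s2 - sigma2) < 1e-9
```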