Foundations of probability and statistics Flashcards

1
Q

Properties of Event spaces

A
  1. Contains the empty set
  2. Closed under complementation
  3. Closed under countable unions
2
Q

De Morgan’s law

A

(A∩B)^c =A^c ∪ B^c

3
Q

Probability measure

A

A probability measure is a function which assigns a numerical value to each event:
P : F -> [0,1]
A -> P(A)
(The probability of the entire sample space is 1: P(Ω) = 1)
(If A1, A2, … ∈ F are disjoint (mutually exclusive), then P(U∞i=1 Ai) = ∑P(Ai))

4
Q

Conditional probability

A

PB : F -> [0,1]
A -> P(A|B)
such that P(A|B) = P(A ∩ B)/P(B), provided P(B) > 0

5
Q

Continuity of probability measures

A

Let A1 ⊂ A2 ⊂ … be an increasing (expanding) sequence of events
and A = U(An)
then P(A) = lim n->∞ P(An)

6
Q

Borel event space

A

The Borel event space over R is the collection generated from the open sets by countable unions, countable intersections, and complements
Every open interval can be expressed as a countable union of closed intervals: (a,b) = U[a+1/n, b-1/n]

7
Q

Random variable

A

A random variable is any function that assigns a numerical value to each outcome in the sample space

8
Q

Inverse image

A

The set of outcomes whose value under X lies in B:
X^-1(B) = {ω : X(ω) ∈ B}

9
Q

Support

A

The set of values where the random variable has non-zero probability (or density)
For a transformation of a RV, Y = g(X):
supp(fY) = {g(x) : x ∈ supp(fX)}

10
Q

PDF and CDF of Injective transformations

A

If Y = g(X) is injective then
PDF: fY(y) = fX[g^-1(y)] |d/dy g^-1(y)|
CDF: FY(y) = FX[g^-1(y)] if g is increasing and 1 - FX[g^-1(y)] if g is decreasing

11
Q

Injective transformations (special case)

A

If Y = g(X) = a + bX
PDF: fY(y) = (1/|b|) fX((y-a)/b)
CDF: FY(y) = FX((y-a)/b) if b > 0 and 1 - FX((y-a)/b) if b < 0
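The CDF rule above can be checked by simulation. A minimal sketch, assuming X is standard normal (so FX is computed via the error function) and arbitrary illustrative values a = 2, b = -3:

```python
import math
import random

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

a, b = 2.0, -3.0          # b < 0, so the decreasing branch of the rule applies
y = 1.0

# Exact CDF of Y = a + bX using the rule: 1 - FX((y - a)/b) when b < 0.
exact = 1.0 - phi((y - a) / b)

# Empirical CDF of Y estimated from simulated draws of X.
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n) if a + b * random.gauss(0.0, 1.0) <= y)
empirical = hits / n

assert abs(exact - empirical) < 0.01
```

The same check with b > 0 would use the increasing branch FY(y) = FX((y-a)/b).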

12
Q

pseudo-random number

A
  1. Obtain a uniformly distributed pseudo-random number u ∈ [0,1], so that u = F(x) for x = F^-1(u)
  2. The number x = F^-1(u) is then a pseudo-random number from distribution F
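The two steps above (inverse-transform sampling) can be sketched for an assumed Exponential(lam) target, whose CDF F(x) = 1 - exp(-lam*x) inverts to F^-1(u) = -ln(1 - u)/lam:

```python
import math
import random

def exp_inverse_cdf(u, lam):
    """F^-1(u) for the Exponential(lam) distribution."""
    return -math.log(1.0 - u) / lam

random.seed(1)
lam = 2.0

# Step 1: uniform u in [0,1); step 2: map through F^-1.
samples = [exp_inverse_cdf(random.random(), lam) for _ in range(100_000)]

# The sample mean should approach the exponential mean 1/lam.
mean = sum(samples) / len(samples)
assert abs(mean - 1.0 / lam) < 0.01
```

Any distribution with a computable inverse CDF can be sampled this way from a single uniform stream.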
13
Q

Expectation of probability distributions

A

For a discrete distribution:
E[X] = ∑ xi p(xi)
For continuous distributions:
E[X] = ∫ x f(x) dx

14
Q

Variance and Expectation of transformed variables

A

Var[X] = E[X^2] - E[X]^2
E[g(X)] = ∑ g(xi) f(xi) when X is discrete
E[g(X)] = ∫ g(x) f(x) dx when X is continuous
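Both formulas can be exercised on a small discrete example. A sketch using a fair six-sided die (a made-up illustration) with g(x) = x²:

```python
from fractions import Fraction

# PMF of a fair six-sided die.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E[X] and E[g(X)] with g(x) = x^2, via sum g(x_i) f(x_i).
EX  = sum(x * p for x, p in pmf.items())       # 7/2
EX2 = sum(x**2 * p for x, p in pmf.items())    # 91/6

# Var[X] = E[X^2] - E[X]^2.
var = EX2 - EX**2                               # 35/12

assert EX == Fraction(7, 2)
assert var == Fraction(35, 12)
```

Using `Fraction` keeps the arithmetic exact, so the identities hold with no floating-point slack.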

15
Q

signed variables and expectation

A

Signed variables arise where the expected value may cancel out, because the area under the function has positive and negative parts
X = X⁺ + X⁻ (positive and negative parts, X⁺ ≥ 0 and X⁻ ≤ 0)
E[X] = E[X⁺] + E[X⁻], defined provided E[X⁺] and E[X⁻] are not both infinite

16
Q

Law of total expectation

A

E[Y] = E[E[Y|X]]
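The identity can be checked directly on a small joint PMF (the table below is a made-up example):

```python
# Joint PMF: (x, y) -> P(X = x, Y = y).
joint = {
    (0, 1): 0.2, (0, 2): 0.3,
    (1, 1): 0.1, (1, 4): 0.4,
}

# Direct expectation of Y.
EY = sum(p * y for (x, y), p in joint.items())

# E[E[Y|X]]: average the conditional means E[Y|X=x] over the marginal of X.
xs = {x for (x, _) in joint}
EEY = 0.0
for x0 in xs:
    px = sum(p for (x, _), p in joint.items() if x == x0)          # P(X = x0)
    ey_given_x = sum(p * y for (x, y), p in joint.items() if x == x0) / px
    EEY += px * ey_given_x

assert abs(EY - EEY) < 1e-12
```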

17
Q

Product moment of X and Y

A

E[XY] = ∫ ∫ xy f(x,y) dxdy

18
Q

Covariance of X and Y

A

Cov[X,Y] = E[XY] - E[X]E[Y]

19
Q

Correlation coefficient of X and Y

A

ρ[X,Y] = Cov[X,Y]/√(Var[X]Var[Y])
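Cov[X,Y] = E[XY] - E[X]E[Y] and the correlation coefficient can be computed together from sample moments. A sketch on arbitrary illustrative data with a near-linear relationship:

```python
import math

# Illustrative data: ys roughly follow y = 2x.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.0, 4.1, 5.9, 8.2, 9.8]

n = len(xs)
EX  = sum(xs) / n
EY  = sum(ys) / n
EXY = sum(x * y for x, y in zip(xs, ys)) / n

# Cov[X,Y] = E[XY] - E[X]E[Y]; Var via the same moment identity.
cov   = EXY - EX * EY
var_x = sum(x * x for x in xs) / n - EX ** 2
var_y = sum(y * y for y in ys) / n - EY ** 2

# rho = Cov[X,Y] / sqrt(Var[X] Var[Y]).
rho = cov / math.sqrt(var_x * var_y)

assert 0.99 < rho < 1.0   # near-perfect positive linear relationship
```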

20
Q

X and Y independent?

A

X and Y are independent if
F(X,Y) = F(X)F(Y)
or
f(X,Y) = f(X)f(Y)

21
Q

Conditional CDF

A

F(y|x) = F(x,y)/F(x)

22
Q

Conditional PDF

A

f(y|x) = f(x,y)/f(x)

23
Q

Law of total probability

A

f(y) = ∫ f(y|x) f(x) dx

24
Q

Law of Total Variance

A

Var(Y|X) = E(Y^2|X) - E(Y|X)^2
such that
Var(Y ) = E[Var(Y |X)] + Var[E(Y |X)]

25
Q

Conditional expectation

A

E[Y|X] = ∫ y f(y|x) dy

26
Q

Probability generating functions (PGF)

A

GX(t) = E[t^X] = ∑ pk t^k
where pk = P(X = k) and t is a real or complex number within the region of convergence

27
Q

Moment generating functions (MGF)

A

MX(t) = E[exp(tX)]

28
Q

Central limit theorem

A

Let X1, X2, … be independent random variables, each with mean µ and variance 𝜎^2, and let
Sn = ∑ Xi
so E[Sn] = nµ and Var[Sn] = n𝜎^2
Then the distribution of
(Sn - E[Sn])/√Var(Sn), or equivalently (1/√n) ∑ (Xi - µ)/𝜎,
approaches the standard normal as n -> ∞.

29
Q

CLT distributions

A

Sn = ∑ Xi ~ N(nµ, n𝜎^2): as n grows, both mean and variance increase, so the distribution loses its shape
Zn = (1/√n) ∑ Xi ~ N(√n µ, 𝜎^2): as n grows, the variance stays fixed, so it keeps its shape (though its mean √n µ grows)
X̄n = (1/n) ∑ Xi ~ N(µ, 𝜎^2/n): as n grows, the variance shrinks and the distribution concentrates at µ (law of large numbers)
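The standardised form of the CLT can be checked by simulation. A sketch using Uniform(0,1) summands (an assumed example, with µ = 1/2 and 𝜎² = 1/12):

```python
import math
import random

random.seed(2)
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and sd of Uniform(0,1)
n, reps = 50, 20_000

# Standardise Sn = sum of n uniforms: (Sn - n*mu) / (sqrt(n)*sigma).
zs = []
for _ in range(reps):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (math.sqrt(n) * sigma))

# If the CLT holds, the standardised sums have mean ~0 and variance ~1.
z_mean = sum(zs) / reps
z_var  = sum(z * z for z in zs) / reps - z_mean ** 2

assert abs(z_mean) < 0.05
assert abs(z_var - 1.0) < 0.05
```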

30
Q

Joint likelihood functions

A

L(θ ; x) = ∏ f(xi; θ), where f is the PMF or PDF of X

31
Q

log-likelihood functions

A

l(θ ; x) = log L(θ ; x) = ∑ log f(xi; θ)
Set l’(θ) = 0 to determine the MLE
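A sketch of the MLE recipe for an assumed Exponential(lam) sample (the data are made up): l(lam) = n·log(lam) - lam·∑xi, and l’(lam) = 0 gives lam̂ = n/∑xi.

```python
import math

data = [0.3, 1.2, 0.7, 2.1, 0.5]   # illustrative sample

def log_likelihood(lam, xs):
    """l(lam) = n*log(lam) - lam*sum(x) for Exponential(lam)."""
    return len(xs) * math.log(lam) - lam * sum(xs)

# Closed form from setting l'(lam) = n/lam - sum(x) = 0.
lam_hat = len(data) / sum(data)

# Sanity check: the closed form beats nearby values of lam.
for lam in (lam_hat * 0.9, lam_hat * 1.1):
    assert log_likelihood(lam_hat, data) > log_likelihood(lam, data)
```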

32
Q

Bayes’ theorem

A

P(Aj|B) = P(B|Aj)P(Aj)/∑P(B|Ak)P(Ak) where P(B) = ∑P(B|Ak)P(Ak)
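The formula can be applied directly over a partition {A1, A2, A3}; the priors and likelihoods below are made-up numbers for illustration:

```python
prior      = [0.5, 0.3, 0.2]   # P(A_k), summing to 1
likelihood = [0.1, 0.4, 0.8]   # P(B | A_k)

# Law of total probability: P(B) = sum_k P(B|A_k) P(A_k).
p_b = sum(l * p for l, p in zip(likelihood, prior))

# Bayes' theorem: P(A_j | B) = P(B|A_j) P(A_j) / P(B).
posterior = [l * p / p_b for l, p in zip(likelihood, prior)]

# The posterior is a proper distribution over the partition.
assert abs(sum(posterior) - 1.0) < 1e-12
```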

33
Q

Posterior distribution

A

prior (initial estimate) = π0(θ)
posterior dist = π1(θ|x) = f(x|θ)π0(θ) / ∫ f(x|θ)π0(θ) dθ (or ∑ in the discrete case)

34
Q

MAP estimate

A

The MAP estimate is the mode of the posterior distribution: θ̂ = argmax π1(θ|X), where argmax returns the value of θ that maximises the posterior, rather than the maximum value itself.
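A sketch of the MAP estimate under an assumed Beta(a, b) prior on θ with k successes in n Bernoulli trials (the numbers are illustrative). The posterior is Beta(a + k, b + n - k), whose mode has the closed form (a + k - 1)/(a + b + n - 2); a grid search over the unnormalised posterior f(x|θ)π0(θ) should recover it:

```python
a, b = 2.0, 2.0   # Beta prior parameters
n, k = 10, 7      # trials and successes

# Closed-form mode of the Beta(a + k, b + n - k) posterior.
closed_form = (a + k - 1.0) / (a + b + n - 2.0)

def unnorm_posterior(theta):
    """f(x|theta) * pi0(theta), up to a constant: theta^(a+k-1) (1-theta)^(b+n-k-1)."""
    return (theta ** (a + k - 1.0)) * ((1.0 - theta) ** (b + n - k - 1.0))

# argmax over a fine grid in (0, 1) -- the value of theta, not the maximum itself.
grid = [i / 10_000 for i in range(1, 10_000)]
theta_map = max(grid, key=unnorm_posterior)

assert abs(theta_map - closed_form) < 1e-3
```

The normalising constant cancels in the argmax, which is why the unnormalised posterior suffices.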