Probability B // Flashcards

1
Q

Discrete distribution

A

A r.v. X has a discrete distribution if there exists a finite or countably infinite set of values {x1, x2, … } and corresponding probabilities P1, P2, … satisfying Σ(i) Pi = 1, and
P(X = xi) = Pi for every i
P(X ∈ ℝ \ {x1, x2, … }) = 0

2
Q

Probability mass function of a discrete distribution

A

Given the set X = {x1, x2, …} where P(X = xi) = Pi > 0 for all i (the support of the distribution), the pmf is the function fx: X → (0,1], fx(xi) = Pi for every i

3
Q

Continuous distribution

A

A r.v. X has a cts distribution if there exists a function fx: ℝ → [0, ∞) with ∫(-∞, ∞) fx(x) dx = 1 such that P(a ≤ X ≤ b) = ∫(a, b) fx(x) dx for all a ≤ b (fx is the probability density function)

4
Q

Cumulative distribution function of X (discrete)

A

A function Fx: ℝ → [0,1] where Fx(u) = P(X ≤ u) = Σ(i=1, xi ≤ u) fx(xi)

fx(x) = Fx(x) - Fx(x-) for x in support,
where Fx(x-) = lim (h→0) Fx(x-h)
5
Q

Cumulative distribution function of X (cts)

A

A function Fx: ℝ → [0,1] where Fx(u) = P(X ≤ u) = ∫(-∞, u) fx(x)dx

fx(x) = (d/dx)Fx(x) where Fx diffble at x

6
Q

Rule for finding prob density of Y given Y = g(X), X cts (+ many restrictions on g)

A

If X has prob density fn fx, then

fy(y) = fx(g⁻¹(y)) * |(d/dy)g⁻¹(y)|
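A quick numeric sanity check of this rule (an assumed example, not from the cards: X ~ N(0,1) and g(x) = eˣ, so g⁻¹(y) = log y and |(d/dy)g⁻¹(y)| = 1/y); the transformed density should integrate to 1:

```python
import numpy as np

# Assumed example: X ~ N(0,1), Y = g(X) = e^X, so g^{-1}(y) = log(y)
def fx(x):
    # standard normal density
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

def fy(y):
    # f_Y(y) = f_X(g^{-1}(y)) * |(d/dy) g^{-1}(y)|, here |1/y|
    return fx(np.log(y)) * np.abs(1.0 / y)

# f_Y should be a valid density: Riemann sum over (0, 50] should be ~ 1
ys = np.linspace(1e-6, 50.0, 100_001)
total = np.sum(fy(ys)) * (ys[1] - ys[0])
```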

7
Q

Joint distribution of X and Y

A

If X and Y are a pair of r.vs defined on a sample space (measured in the same expt), the joint dist is the probability measure on ℝ² that assigns to a subset B ⊂ ℝ² the probability P((X,Y) ∈ B)

8
Q

Discrete joint distribution and mass function

A

A joint distribution is discrete if there exist finite or countably infinite sets {x1, x2, …} and {y1, y2, …} such that P(X ∈ {x1, x2, …} and Y ∈ {y1, y2, …}) = 1
The joint mass function is fxy(xi,yj) = P(X = xi and Y = yj)

9
Q

Cts joint distribution and density function

A

A joint distribution is continuous if there exists a joint density function fxy: ℝ² → [0,∞) so that
P(a ≤ X ≤ b and c ≤ Y ≤ d)
= ∫(a,b)∫(c,d) fxy(x,y) dydx
= ∫(c,d)∫(a,b) fxy(x,y) dxdy
(ie the volume under fxy(x,y) over [a,b] × [c,d])

10
Q

Independence

A

X and Y are independent if P(X ∈ B and Y ∈ B′) = P(X ∈ B)P(Y ∈ B′) for arbitrary subsets B, B′ ⊆ ℝ,
or X and Y are independent if and only if
Fxy(x,y) = Fx(x)*Fy(y)

11
Q

Convolution of discrete X and Y (dist of X + Y)

A

If X and Y are independent and integer-valued,
P(X+Y=m) = Σ(k∈ℤ) P(X=k)P(Y=m-k), ie
f_x+y(m) = Σ(k∈ℤ) fx(k)fy(m-k)
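For pmfs stored as arrays indexed by {0, 1, 2, …}, this sum is exactly a discrete convolution; a minimal sketch (NumPy, with assumed example pmfs):

```python
import numpy as np

# Assumed example pmfs: index k holds P(X = k)
fx = np.array([0.2, 0.5, 0.3])   # X on {0, 1, 2}
fy = np.array([0.6, 0.4])        # Y on {0, 1}

# f_{X+Y}(m) = sum_k f_X(k) f_Y(m - k) is np.convolve for independent X, Y
f_sum = np.convolve(fx, fy)      # pmf of X + Y on {0, 1, 2, 3}
```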

12
Q

Convolution of continuous X and Y (dist of X + Y)

A

If X and Y are independent

f_x+y(z) = ∫(-∞,∞) fx(x)fy(z-x)dx = ∫(-∞,∞) fx(z-y)fy(y)dy for z ∈ ℝ

13
Q

Convolution of binomial r.vs

A

If X and Y are two independent rvs with binomial distributions, having parameters n,p and m,p respectively, then X + Y has a binomial dist with parameters m+n and p
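This can be checked numerically by convolving the two pmfs and comparing against the Binomial(n+m, p) pmf directly (a sketch with assumed parameters n = 3, m = 4, p = 0.3):

```python
import numpy as np
from math import comb

def binom_pmf(n, p, k):
    # nCk p^k (1-p)^(n-k)
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, m, p = 3, 4, 0.3   # assumed example parameters
fx = np.array([binom_pmf(n, p, k) for k in range(n + 1)])
fy = np.array([binom_pmf(m, p, k) for k in range(m + 1)])

f_conv = np.convolve(fx, fy)   # pmf of X + Y via convolution
f_direct = np.array([binom_pmf(n + m, p, k) for k in range(n + m + 1)])
```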

14
Q

Expectation for discrete distribution

A
If X has a discrete dist, support {x1,x2,...} and mass fn fx(xi)=P(X=xi)
then E(X) = Σ(i)xi*fx(xi)
provided Σ|xi|fx(xi) < ∞
15
Q

Expectation for cts distribution

A

If X has a cts dist with density fx, then
E(X) = ∫(-∞,∞) x*fx(x)dx
provided ∫(-∞,∞) |x|fx(x)dx < ∞

16
Q

Expectations of functions of rvs

A
Let Y = g(X)
If X has a discrete distribution with support {x1,x2,...}
E[g(X)] = Σ(i)g(xi)fx(xi)
If X has a cts dist, 
E[g(X)] = ∫(-∞,∞) g(x)fx(x)dx
17
Q

Expectation for functions of a discrete joint dist

A

Given X,Y are a pair of rvs with respective supports {x1,x2,…} and {y1,y2,…}
E[g(x,y)] = Σ(i,j) g(xi,yj) fxy(xi,yj)

18
Q

Expectation for functions of a cts joint dist

A

E[g(X,Y)] = ∫(-∞,∞) ∫(-∞,∞) g(x,y) fxy(x,y)dxdy

19
Q

Properties of expectation (1-3)

A
Positivity
a) if P(Z ≥ 0) = 1, then E(Z) ≥ 0
b) if P(Z1 ≥ Z2) = 1, then E(Z1) ≥ E(Z2)
Expectation of constants
If P(Z=c) = 1 where c ∈ ℝ is nonrandom, then E(Z) = c
Linearity
If X1,...,Xn are rvs, a1,...,an ∈ ℝ constants,
E[Σ(i=1, n) aiXi] = Σ(i=1, n) aiE[Xi]
20
Q

Fubini property of expectation

A

If X1,…,Xn are mutually independent then

E[∏(i=1, n)Xi] = ∏(i=1, n) E[Xi]

21
Q

Cauchy–Schwarz inequality for expectation

A

If X and Y are two rvs then

|E[XY]| ≤ √(E[X²]*E[Y²]), with equality only if P(X = cY) = 1 for some c ∈ ℝ

22
Q

Markov’s inequality

A

If X is a nonnegative random variable and a > 0,

P(X ≥ a) ≤ E(X)/a
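A quick Monte Carlo sanity check of the bound (assumed example: X exponential with mean 1, a = 3):

```python
import numpy as np

# Assumed example: X >= 0 with the exponential(1) distribution
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)

a = 3.0
lhs = np.mean(x >= a)    # empirical P(X >= a)
rhs = x.mean() / a       # the Markov bound E(X)/a
```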

23
Q

Variance of the distribution of X

A

E[(X - μ)²] = E[X²] - (E[X])²

24
Q

(Nearly) linear property of variance

A

Var(Σ(i=1, n) ai*Xi) = Σ(i=1, n) ai²Var(Xi)

25
Q

Covariance of two rvs X and Y

A
cov(X,Y) = E[(X - μx)(Y - μy)] = E[XY] - E[X]E[Y]
(μx = E[X], μy = E[Y])
26
Q

The variance of a convolution of X and Y

A

var(X+Y) = var(X) + var(Y) + 2cov(X,Y)
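The identity var(X+Y) = var(X) + var(Y) + 2cov(X,Y) also holds exactly for sample moments when the same normalisation is used throughout; a minimal NumPy check on assumed correlated data:

```python
import numpy as np

# Assumed example: a correlated pair (x, y); the identity is exact
# for sample moments with matching normalisation (ddof = 0 throughout)
rng = np.random.default_rng(1)
x = rng.normal(size=50_000)
y = 0.5 * x + rng.normal(size=50_000)   # correlated with x

lhs = np.var(x + y)
rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y, bias=True)[0, 1]
```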

27
Q

Correlation of U and V

A

cov(U,V)/√(Var(U)*Var(V)) ∈ [-1,1]

28
Q

Expectation and variance of sums of independent variables

A

If X1, X2,…, Xn are independent, identically distributed rvs with mean μ and variance σ², and Sn = Σ(i=1, n) Xi, then
E[Sn] = nμ
Var(Sn) = nσ²

29
Q

Law of large numbers

A

Let Sn = Σ(i=1, n) Xi with X1, X2,…, Xn independent rvs having common mean μ and variance σ², then for ε > 0
P(|Sn/n - μ| ≥ ε) → 0 as n → ∞
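A small simulation illustrating the statement (assumed example: iid Uniform(0,1), so μ = 0.5):

```python
import numpy as np

# Assumed example: X_i ~ Uniform(0, 1), so mu = 0.5
rng = np.random.default_rng(2)

def sample_mean(n):
    # S_n / n for n iid uniforms
    return rng.uniform(0.0, 1.0, size=n).mean()

# deviation of S_n/n from mu; typically of order sigma/sqrt(n) ~ 3e-4 here
deviation = abs(sample_mean(1_000_000) - 0.5)
```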

30
Q

Central limit theorem

A

Suppose Sn = Σ(i=1, n) Xi where Xi are independent, identically distributed rvs with common mean μ and variance σ². Then for any a ∈ ℝ,
P((Sn - nμ)/(σ√n) ≤ a) → Φ(a) = ∫(-∞, a) e^(-x²/2)/√(2π) dx as n → ∞
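A simulation sketch of this convergence (assumed example: iid Uniform(0,1), n = 500, comparing the empirical cdf of the standardised sum at a = 1 with the standard normal cdf Φ(1)):

```python
import numpy as np
from math import erf, sqrt

# Assumed example: X_i ~ Uniform(0, 1): mu = 0.5, sigma^2 = 1/12
rng = np.random.default_rng(3)
n, reps = 500, 10_000
mu, sigma = 0.5, sqrt(1 / 12)

s = rng.uniform(0.0, 1.0, size=(reps, n)).sum(axis=1)
z = (s - n * mu) / (sigma * sqrt(n))      # standardised S_n

a = 1.0
empirical = np.mean(z <= a)               # empirical P(standardised sum <= 1)
phi_a = 0.5 * (1 + erf(a / sqrt(2)))      # standard normal cdf at a = 1
```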

31
Q

Random process

A

A collection of rvs describing the evolution of a system in time

32
Q

The memoryless property of a random variable

A

If T1 is an exponentially distributed rv (eg an inter-arrival time), then for any t ≥ 0 and 0 ≤ a ≤ b,
P(T1 ∈ [t+a, t+b] | T1 > t) = P(T1 ∈ [a, b])

33
Q

Poisson process of intensity α>0

A

A collection of rvs Nt indexed by real t ≥ 0 satisfying:
1) N0 = 0
2) Nt - Ns has a Poisson dist with parameter α(t-s) for any 0 ≤ s ≤ t
3) for any 0 ≤ t1 ≤ t2 ≤ … ≤ tn, the rvs Nt2-Nt1, Nt3 - Nt2, Nt4 - Nt3,… are independent
α is the mean number of occurrences per unit time

34
Q

Distribution of arrival times of a Poisson process

A

If T1, T2, T3, … are the arrival times of a Poisson process with intensity α, then T1, T2 - T1, T3 - T2, … are independent rvs each having the exponential dist with parameter α
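This gives the standard way to simulate a Poisson process: cumulatively sum exponential(α) gaps. A sketch with assumed intensity α = 2, checking that the mean count per unit time comes out near α:

```python
import numpy as np

# Assumed example: intensity alpha = 2, time horizon T = 10_000
rng = np.random.default_rng(4)
alpha, T = 2.0, 10_000.0

# exponential(alpha) inter-arrival gaps; generate plenty to cover [0, T]
gaps = rng.exponential(scale=1 / alpha, size=int(3 * alpha * T))
arrivals = np.cumsum(gaps)               # T1, T2, T3, ...
n_T = np.searchsorted(arrivals, T)       # N_T = number of arrivals by time T
rate = n_T / T                           # should be close to alpha
```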

35
Q

Brownian motion

A

Fix μ and σ²
Suppose (Bt, t ≥ 0) is a family of rvs with the properties
1) B0 = 0
2) Bt - Bs has the Gaussian dist with mean μ(t-s) and variance σ²(t-s)
3) if t1 ≤ t2 ≤ … ≤ tn the rvs Bt2 - Bt1, Bt3 - Bt2,… are independent

36
Q

Moment generating function φx of an rv X

A
φx(t) = E[exp(tX)] for all t for which the expectation exists
If X is discrete,
φx(t) = Σ(i=1, ∞)e^(t*xi)*fx(xi)
If X is continuous,
φx(t) = ∫(-∞,∞) e^(tx)*fx(x) dx
37
Q

Uniqueness theorem for mgfs

A

If X and Y are two random variables with mgfs φx and φy both existing in some neighbourhood (-c, c) of zero, and
φx(t) = φy(t) for all t ∈ (-c, c)
then X and Y have the same distribution

38
Q

Determining the moments of a distribution from its mgf

A

E[Xⁿ] = (dⁿ/dtⁿ)φx(t) evaluated at t = 0
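Numerically, φx′(0) can be approximated by a central difference. An assumed example: X ~ Poisson(λ = 3), whose mgf is exp(λ(eᵗ − 1)); the first derivative at 0 should recover the mean λ:

```python
from math import exp

# Assumed example: X ~ Poisson(lam); its mgf is exp(lam * (e^t - 1))
lam = 3.0

def phi(t):
    return exp(lam * (exp(t) - 1.0))

# central difference approximation to phi'(0) = E[X] = lam
h = 1e-5
mean_from_mgf = (phi(h) - phi(-h)) / (2 * h)
```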

39
Q

Gaussian pdf

A

e^(-(x-μ)²/(2σ²))/(√(2π)σ)

for σ > 0, μ ∈ ℝ

40
Q

Gamma pdf

A

αⁿxⁿ⁻¹*e^(-αx)/(n-1)! if x ≥ 0
0 otherwise
for α > 0, n ∈ ℕ

Note: the exponential distribution is the case n = 1

41
Q

Poisson pmf

A

e^(-λ)λ^k/k!, k ∈ {0,1,2,…}

for λ > 0

42
Q

Binomial pmf

A

nCk p^k (1-p)^(n-k), k ∈ {0,1,2,…,n}

for p ∈ [0,1], n ∈ ℕ

43
Q

Geometric pmf

A

p(1-p)^k, k ∈ {0,1,2,3,…}

for p ∈ (0,1]

44
Q

Uniform pdf

A

(b - a)⁻¹, if a ≤ x ≤ b
0, otherwise
for a < b