Probability And Statistics Equations Flashcards
Axioms of Probability
- P(A) ≥ 0 for all A ∈ F
- P(Ω) = 1
- If A1, A2, … ∈ F are mutually exclusive, then P(A1 ∪ A2 ∪ …) = P(A1) + P(A2) + …
Complement rule
P(Aᶜ) = 1 − P(A)
Probability of the Union of Two Events Rule
P(A∪B) = P(A) + P(B) − P(A∩B)
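A quick sanity check of the union rule, using a two-dice sample space (the events A and B below are illustrative, not from the cards):

```python
from fractions import Fraction

# Sample space: all 36 equally likely ordered rolls of two fair dice
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    # Uniform measure: probability = favourable outcomes / total outcomes
    return Fraction(len(event), len(omega))

A = {w for w in omega if w[0] == 6}          # first die shows 6
B = {w for w in omega if w[0] + w[1] == 7}   # faces sum to 7

# Inclusion-exclusion: both sides equal 11/36
assert prob(A | B) == prob(A) + prob(B) - prob(A & B)
```

The subtraction of P(A∩B) removes the double count of outcomes that lie in both events (here the single roll (6, 1)).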
Bounds on Probabilities Rule
P(A∪B) ≤ P(A) + P(B)
Logical Consequence Rule
If B logically entails A (i.e. B ⊆ A), then P(A) ≥ P(B)
Conditional Probability
P(A|B) = P(A∩B)/P(B)
Axioms revised based on conditional probability
- 0 ≤ P(A|B) ≤ 1
- P(B|B) = 1
- If A1, A2, … are mutually exclusive given B, then P(A1 ∪ A2 ∪ … | B) = P(A1|B) + P(A2|B) + …
Law of Total Probability
For a partition B1, B2, … of the sample space: P(A) = P(A∩B1) + P(A∩B2) + … = P(A|B1)P(B1) + P(A|B2)P(B2) + …
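The law of total probability can be checked numerically. A hypothetical two-urn setup (the urns, counts, and names are made up for illustration):

```python
from fractions import Fraction

# Pick urn B1 or B2 with equal probability, then draw a ball;
# A is the event "the ball is red".
p_B1, p_B2 = Fraction(1, 2), Fraction(1, 2)
p_A_given_B1 = Fraction(3, 4)   # urn 1: 3 red, 1 blue
p_A_given_B2 = Fraction(1, 4)   # urn 2: 1 red, 3 blue

# Total probability over the partition {B1, B2}
p_A = p_A_given_B1 * p_B1 + p_A_given_B2 * p_B2
print(p_A)  # 1/2
```

Each term P(A|Bi)P(Bi) is the probability of reaching A "through" branch Bi; summing over the partition covers every way A can happen.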
Two events are independent if
P(A∩B) = P(A)P(B), or equivalently P(A|B) = P(A)
Bayes’ Rule
P(A|B) = P(B|A)P(A)/P(B)
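A classic numeric illustration of Bayes' rule, with made-up diagnostic-test numbers (prevalence, sensitivity, and false-positive rate below are hypothetical):

```python
p_A = 0.01               # P(A): prevalence of the condition
p_B_given_A = 0.99       # P(B|A): positive test given the condition
p_B_given_notA = 0.05    # P(B|Aᶜ): false-positive rate

# The denominator P(B) comes from the law of total probability
p_B = p_B_given_A * p_A + p_B_given_notA * (1 - p_A)

# Bayes' rule
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # 0.167, i.e. only about 1 in 6
```

Despite the accurate test, the low prior P(A) keeps the posterior small: most positives come from the large healthy group.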
PMF definition
- f(x) = P(X=x)
- f(x) >= 0
- SUMf(x)=1
PMF of the number of successes in N independent Bernoulli (coin toss) trials with parameter p (the binomial distribution)
f(x) = K p^x (1−p)^(N−x)
K = (N choose x) is the binomial coefficient; it depends on N and x but not on p
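A short sketch of this PMF using the standard-library binomial coefficient (the values N = 10, p = 0.3 are arbitrary examples):

```python
from math import comb

def binom_pmf(x, N, p):
    # comb(N, x) is the K above: the number of orderings of x successes
    return comb(N, x) * p**x * (1 - p)**(N - x)

N, p = 10, 0.3
pmf = [binom_pmf(x, N, p) for x in range(N + 1)]
assert abs(sum(pmf) - 1.0) < 1e-12   # a valid PMF sums to 1
```

The sum over x = 0 … N is the binomial expansion of (p + (1−p))^N = 1, which is exactly the PMF condition Σf(x) = 1.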
PDF definition
f(x) = lim(h→0) P(x ≤ X ≤ x+h)/h
1. f(x) ≥ 0
2. ∫ f(x) dx = 1
Uniform distribution U(a,b) PDF
If a ≤ x ≤ b, f(x) = 1/(b−a)
Otherwise f(x) = 0
Normal Distribution PDF
f(x) = 1/(σ√(2π)) exp(−(x−μ)²/(2σ²))
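As a sanity check that this density integrates to 1, here is a small numeric sketch (trapezoidal rule over a wide interval; the interval and step count are arbitrary choices):

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# Trapezoidal integration over [-10, 10]; the tails beyond
# contribute a negligible amount, so the result should be ~1.
a, b, n = -10.0, 10.0, 20000
h = (b - a) / n
total = h * (sum(normal_pdf(a + i * h) for i in range(n + 1))
             - 0.5 * (normal_pdf(a) + normal_pdf(b)))
assert abs(total - 1.0) < 1e-6
```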
CDF Definition
F(x) = P(X ≤ x) for all x in the support of X
1. 0 ≤ F(x) ≤ 1
2. F(x) is a non-decreasing function in x
3. P(X > x) = 1 − P(X ≤ x)
4. P(a < X ≤ b) = F(b) − F(a)
Uniform Distribution CDF
If x < a, F(x) = 0
If a ≤ x ≤ b, F(x) = (x−a)/(b−a)
If b < x, F(x) = 1
Connection of CDF and PDF
f(x)= dF(x)/dx
Standardization N(0,1) Standard Normal Distribution
z = (x − μ)/σ
P(X ≤ x) = Φ(z), looked up in the standard normal (SN) table
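Instead of a paper table, Φ(z) can be computed from the standard library's error function. A sketch with made-up parameters (μ = 100, σ = 15 are illustrative):

```python
from math import erf, sqrt

def std_normal_cdf(z):
    # Φ(z), the standard normal CDF, expressed via the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# Example: X ~ N(100, 15²), find P(X ≤ 115)
mu, sigma = 100.0, 15.0
z = (115 - mu) / sigma              # z = 1.0
print(round(std_normal_cdf(z), 4))  # 0.8413, the z = 1.00 table entry
```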
Joint CDF
F(x,y) = P(X ≤ x, Y ≤ y)
Joint PDF/PMF
Discrete: f(x,y) = P(X=x, Y=y)
Continuous: f(x,y) = ∂²F(x,y)/∂x∂y
Marginal PDF (of X)
Sum or integrate the joint over all values of y:
Discrete f(x) = Σ_y f(x,y); Continuous f(x) = ∫ f(x,y) dy
If X and Y are independent (relationship between joint PDF and product of marginals)
f(x,y) = f(x)g(y) - joint PDF = product of marginals
F(x,y) = F(x)G(y) - joint CDF = product of marginals
Conditional PDF, CDF
f(y|x) = f(x,y)/f(x): a "slice" of the joint at X = x (the impact of variation in one variable on the probability of the other)
Expected value E[X]
Discrete: E[X] = Σᵢ xᵢ f(xᵢ)
Continuous: E[X] = ∫ x f(x) dx
Probability-weighted sum of the possible values of X
Uniform Expected value - U(a,b)
E[X]=(a+b)/2
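A quick Monte Carlo check of the uniform mean (the endpoints a = 2, b = 8 and the sample size are arbitrary choices):

```python
import random

random.seed(0)                       # fixed seed for reproducibility
a, b = 2.0, 8.0
samples = [random.uniform(a, b) for _ in range(200_000)]
mean = sum(samples) / len(samples)
assert abs(mean - (a + b) / 2) < 0.05   # theoretical E[X] = 5
```

With 200,000 draws the sample mean lands well within 0.05 of the midpoint (a+b)/2, as the law of large numbers predicts.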
Normal - Expected Value N(μ, σ²)
E[X] = μ
Expected value of a function of a variable
Discrete: E[g(X)] = Σᵢ g(xᵢ)f(xᵢ)
Continuous: E[g(X)] = ∫ g(x)f(x)dx
Affine functions (E[a+bX])
E[a+bX]=a+bE[X]
Addition of expectations (g and h may be functions of different or the same variables)
E[g(X) + h(Y)] = E[g(X)] + E[h(Y)]
Multiplication of expectations IF INDEPENDENT
E[g(X)h(Y)] = E[g(X)]E[h(Y)]
Jensen’s inequality
If f is concave, E[f(X)] ≤ f(E[X])
If f is convex, E[f(X)] ≥ f(E[X])
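Jensen's inequality can be seen on any sample with the convex function f(x) = x² (the sample below is arbitrary uniform draws):

```python
import random

# f(x) = x² is convex, so E[f(X)] ≥ f(E[X]) must hold;
# the gap between the two sides is exactly the variance of the sample.
random.seed(1)
xs = [random.uniform(0.0, 1.0) for _ in range(100_000)]
e_x = sum(xs) / len(xs)
e_fx = sum(x * x for x in xs) / len(xs)
assert e_fx >= e_x ** 2
```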
Variance
Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²
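The shortcut formula Var(X) = E[X²] − (E[X])² can be worked exactly for a fair six-sided die:

```python
from fractions import Fraction

# Fair die: E[X] = 7/2 and E[X²] = 91/6, so Var(X) = 91/6 - 49/4
faces = range(1, 7)
e_x = Fraction(sum(faces), 6)                  # 7/2
e_x2 = Fraction(sum(x * x for x in faces), 6)  # 91/6
var = e_x2 - e_x ** 2
print(var)  # 35/12
```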
Variance with μ (the expected value of the variable)
X discrete: Var(X) = Σᵢ (xᵢ − μ)² f(xᵢ)
X continuous: Var(X) = ∫ (x − μ)² f(x) dx
Standard Deviation
SD(X) = √Var(X)
Variance of Uniform
Var(X) = (b-a)^2/12
Variance of Normal
Var(X) = sigma^2
Variance of Binary Variable
Var(X)=p(1-p)
Conditional Expectation
Y discrete: E[Y|X=x] = Σᵢ yᵢ f(yᵢ|x)
Y continuous: E[Y|X=x] = ∫ y f(y|x) dy
E[h(X)Y|X]
Conditioning on X makes h(X) behave like a constant, so it factors out (as in a linear transformation):
E[h(X)Y|X] = h(X)E[Y|X]
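The pull-out property can be verified on a tiny discrete example (the joint PMF and the function h below are made up for illustration):

```python
# Hypothetical joint PMF of (X, Y) on {0,1} × {0,1}
pmf = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def h(x):
    # an arbitrary function of X alone
    return 2 * x + 1

def cond_exp(g, x):
    # E[g(X, Y) | X = x] for the discrete PMF above
    px = sum(p for (xi, _), p in pmf.items() if xi == x)
    return sum(g(xi, yi) * p for (xi, yi), p in pmf.items() if xi == x) / px

for x in (0, 1):
    lhs = cond_exp(lambda xi, yi: h(xi) * yi, x)   # E[h(X)Y | X = x]
    rhs = h(x) * cond_exp(lambda xi, yi: yi, x)    # h(x) E[Y | X = x]
    assert abs(lhs - rhs) < 1e-12
```

Once X = x is fixed, h(X) is the known number h(x), so it moves outside the conditional expectation exactly like a constant coefficient.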